Sample records for extremely complex problem

  1. OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. BOETTCHER; A. PERCUS

    2000-08-01

    We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by "self-organized criticality," a concept introduced to describe emergent complexity in many physical systems. In contrast to genetic algorithms, which operate on an entire "gene pool" of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called "avalanches," ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
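
    The replace-the-worst dynamics described above can be sketched on a toy MAX-CUT instance. This is an illustrative reconstruction, not the authors' code: the graph, the value tau = 1.4, and the step budget are arbitrary choices here, with tau playing the role of the method's single adjustable parameter.

```python
import random

def tau_eo_maxcut(edges, n, tau=1.4, steps=2000, seed=0):
    """Sketch of tau-extremal optimization (tau-EO) for MAX-CUT.

    Each node's fitness is the fraction of its incident edges that are
    currently cut; at every step the rank-k worst node is chosen with
    probability proportional to k**(-tau) and flipped, i.e. the
    'extremely undesirable element' is replaced, which can trigger
    avalanche-like cascades of further flips."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n)]
    nbrs = [[] for _ in range(n)]
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)

    def cut(s):
        return sum(1 for u, v in edges if s[u] != s[v])

    best = cut(side)
    weights = [(k + 1) ** -tau for k in range(n)]  # rank 1 = worst node
    for _ in range(steps):
        fit = [sum(side[i] != side[j] for j in nbrs[i]) / max(len(nbrs[i]), 1)
               for i in range(n)]
        order = sorted(range(n), key=lambda i: fit[i])  # worst first
        side[rng.choices(order, weights=weights)[0]] ^= 1
        best = max(best, cut(side))
    return best

# 6-cycle: the optimal cut alternates sides and cuts all 6 edges
ring = [(i, (i + 1) % 6) for i in range(6)]
```

    For the 6-cycle, tau_eo_maxcut(ring, 6) typically reaches the optimal cut of 6 well within the step budget; unlike simulated annealing there is no temperature schedule, only the rank exponent tau.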

  2. Combining local search with co-evolution in a remarkably simple way

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boettcher, S.; Percus, A.

    2000-05-01

    The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene-pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.

  3. Extreme fluctuations in stochastic network coordination with time delays

    NASA Astrophysics Data System (ADS)

    Hunt, D.; Molnár, F.; Szymanski, B. K.; Korniss, G.

    2015-12-01

    We study the effects of uniform time delays on the extreme fluctuations in stochastic synchronization and coordination problems with linear couplings in complex networks. We obtain the average size of the fluctuations at the nodes from the behavior of the underlying modes of the network. We then obtain the scaling behavior of the extreme fluctuations with system size, as well as the distribution of the extremes on complex networks, and compare them to those on regular one-dimensional lattices. For large complex networks, when the delay is not too close to the critical one, fluctuations at the nodes effectively decouple, and the limit distributions converge to the Fisher-Tippett-Gumbel density. In contrast, fluctuations in low-dimensional spatial graphs are strongly correlated, and the limit distribution of the extremes is the Airy density. Finally, we also explore the effects of nonlinear couplings on the stability and on the extremes of the synchronization landscapes.
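
    The convergence of decoupled node extremes to the Fisher-Tippett-Gumbel density can be illustrated numerically. The exponential "node fluctuations" and the sample sizes below are illustrative assumptions, not taken from the paper; they stand in for whatever effective node-level distribution the network produces.

```python
import math
import random

def centred_maxima(n_nodes, n_trials, seed=0):
    """Maxima of n_nodes i.i.d. exponential 'node fluctuations', centred
    by log(n_nodes).  For large n_nodes these follow the
    Fisher-Tippett-Gumbel distribution, whose mean is the
    Euler-Mascheroni constant (about 0.5772)."""
    rng = random.Random(seed)
    return [max(rng.expovariate(1.0) for _ in range(n_nodes)) - math.log(n_nodes)
            for _ in range(n_trials)]

samples = centred_maxima(n_nodes=1000, n_trials=2000)
mean = sum(samples) / len(samples)  # approaches 0.5772... in this limit
```

    Strongly correlated fluctuations, as on low-dimensional lattices, break the independence assumption behind this limit, which is why the paper finds the Airy density there instead.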

  4. Artificial Intelligence (AI), Operations Research (OR), and Decision Support Systems (DSS): A conceptual framework

    NASA Technical Reports Server (NTRS)

    Parnell, Gregory S.; Rowell, William F.; Valusek, John R.

    1987-01-01

    In recent years there has been increasing interest in applying the computer-based problem solving techniques of Artificial Intelligence (AI), Operations Research (OR), and Decision Support Systems (DSS) to analyze extremely complex problems. A conceptual framework is developed for successfully integrating these three techniques. First, the fields of AI, OR, and DSS are defined and the relationships among the three fields are explored. Next, a comprehensive adaptive design methodology for AI and OR modeling within the context of a DSS is described. These observations are made: (1) the solution of extremely complex knowledge problems with ill-defined, changing requirements can benefit greatly from the use of the adaptive design process, (2) the field of DSS provides the focus on the decision making process essential for tailoring solutions to these complex problems, (3) the characteristics of AI, OR, and DSS tools appear to be converging rapidly, and (4) there is a growing need for an interdisciplinary AI/OR/DSS education.

  5. From lepton protoplasm to the genesis of hadrons

    NASA Astrophysics Data System (ADS)

    Eliseev, S. M.; Kosmachev, O. S.

    2016-01-01

    The theory of matter under extreme conditions opens a new stage in particle physics. It is necessary here to combine Dirac's elementary particle physics with Prigogine's dynamics of nonequilibrium systems. In the article we discuss the problem of the hierarchy of complexity. What can be considered as the lowest level of the organization of extreme matter, on the basis of which the self-organization of complex forms occurs?

  6. Game theory and extremal optimization for community detection in complex dynamic networks.

    PubMed

    Lung, Rodica Ioana; Chira, Camelia; Andreica, Anca

    2014-01-01

    The detection of evolving communities in dynamic complex networks is a challenging problem that recently received attention from the research community. Dynamics clearly add another complexity dimension to the difficult task of community detection. Methods should be able to detect changes in the network structure and produce a set of community structures corresponding to different timestamps and reflecting the evolution in time of network data. We propose a novel approach based on game theory elements and extremal optimization to address the detection of dynamic communities. Thus, the problem is formulated as a mathematical game in which nodes take the role of players that seek to choose a community that maximizes their profit, viewed as a fitness function. Numerical results obtained for both synthetic and real-world networks illustrate the competitive performance of this game-theoretic approach.
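
    The nodes-as-players formulation can be sketched as best-response dynamics on a static snapshot. The modularity-style payoff below is my own illustrative choice, not the paper's fitness function, and the extremal-optimization step is omitted; the sketch only shows how best responses converge to a Nash equilibrium of community labels.

```python
def game_communities(adj, rounds=30):
    """Community detection as a game (sketch): every node is a player,
    its strategy is a community label, and its payoff for label c is a
    modularity-style gain: (#neighbours in c) - deg(v)*deg_total(c)/(2m).
    Nodes best-respond in turn until no one wants to switch, i.e. until
    the labelling is a Nash equilibrium."""
    deg = {v: len(adj[v]) for v in adj}
    two_m = sum(deg.values())
    label = {v: v for v in adj}  # start with singleton communities
    for _ in range(rounds):
        changed = False
        for v in adj:
            counts = {}
            for u in adj[v]:
                counts[label[u]] = counts.get(label[u], 0) + 1

            def payoff(c):
                tot = sum(deg[u] for u in adj if label[u] == c and u != v)
                return counts.get(c, 0) - deg[v] * tot / two_m

            best = max(set(counts) | {label[v]}, key=payoff)
            if payoff(best) > payoff(label[v]) + 1e-12:
                label[v] = best
                changed = True
        if not changed:
            break
    return label

# two 4-cliques joined by a single bridge edge (3, 4)
adj = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
       4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6]}
labels = game_communities(adj)
```

    On the barbell graph the equilibrium assigns one label per clique; the degree penalty in the payoff is what stops the bridge edge from merging the two communities.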

  7. Bunched black (and grouped grey) swans: Dissipative and non-dissipative models of correlated extreme fluctuations in complex geosystems

    NASA Astrophysics Data System (ADS)

    Watkins, N. W.

    2013-01-01

    I review the hierarchy of approaches to complex systems, focusing particularly on stochastic equations. I discuss how the main models advocated by the late Benoit Mandelbrot fit into this classification, and how they continue to contribute to cross-disciplinary approaches to the increasingly important problems of correlated extreme events and unresolved scales. The ideas have broad importance, with applications ranging across science areas as diverse as the heavy tailed distributions of intense rainfall in hydrology, after which Mandelbrot named the "Noah effect"; the problem of correlated runs of dry summers in climate, after which the "Joseph effect" was named; and the intermittent, bursty, volatility seen in finance and fluid turbulence.

  8. Predictability of Extreme Climate Events via a Complex Network Approach

    NASA Astrophysics Data System (ADS)

    Muhkin, D.; Kurths, J.

    2017-12-01

    We analyse climate dynamics from a complex network approach. This leads to an inverse problem: Is there a backbone-like structure underlying the climate system? For this we propose a method to reconstruct and analyze a complex network from data generated by a spatio-temporal dynamical system. This approach enables us to uncover relations to global circulation patterns in oceans and atmosphere. This concept is then applied to Monsoon data; in particular, we develop a general framework to predict extreme events by combining a non-linear synchronization technique with complex networks. Applying this method, we uncover a new mechanism of extreme floods in the eastern Central Andes which could be used for operational forecasts. Moreover, we analyze the Indian Summer Monsoon (ISM) and identify two regions of high importance. By estimating an underlying critical point, this leads to an improved prediction of the onset of the ISM; this scheme was successful in 2016 and 2017.

  9. A Large-Scale Multi-Hop Localization Algorithm Based on Regularized Extreme Learning for Wireless Networks.

    PubMed

    Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan

    2017-12-20

    A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms that are only applicable to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling, and location estimation. In the data-acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating hop counts to the physical distances between nodes is constructed using regularized extreme learning. In the location-estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments with low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
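
    The modeling stage relies on a regularized extreme learning machine: a hidden layer with random, fixed weights, and output weights fitted by ridge regression. The sketch below is a generic ELM, not the paper's algorithm; the one-dimensional target y = 2x is a made-up stand-in for the hop-count-to-distance mapping, and the hidden-layer size and regularization strength are arbitrary.

```python
import math
import random

def solve(A, b):
    """Dense Gaussian elimination with partial pivoting (for sketches)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def elm_train(xs, ys, n_hidden=20, lam=1e-4, seed=0):
    """Regularized extreme learning machine (sketch): the hidden-layer
    weights are random and never trained; only the linear output weights
    are fitted, via ridge regression (H^T H + lam*I) beta = H^T y."""
    rng = random.Random(seed)
    W = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(n_hidden)]
    H = [[math.tanh(w * x + b) for (w, b) in W] for x in xs]
    A = [[sum(row[i] * row[j] for row in H) + (lam if i == j else 0.0)
          for j in range(n_hidden)] for i in range(n_hidden)]
    rhs = [sum(row[i] * y for row, y in zip(H, ys)) for i in range(n_hidden)]
    beta = solve(A, rhs)
    return lambda x: sum(c * math.tanh(w * x + b) for c, (w, b) in zip(beta, W))

# toy stand-in for the hop-count -> distance model: y = 2x on [0, 1]
xs = [i / 10 for i in range(11)]
predict = elm_train(xs, [2 * x for x in xs])
```

    Because only the output layer is trained, fitting reduces to one linear solve, which is what keeps the computational cost low compared with iteratively trained networks.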

  10. Simulated parallel annealing within a neighborhood for optimization of biomechanical systems.

    PubMed

    Higginson, J S; Neptune, R R; Anderson, F C

    2005-09-01

    Optimization problems for biomechanical systems have become extremely complex. Simulated annealing (SA) algorithms have performed well in a variety of test problems and biomechanical applications; however, despite advances in computer speed, convergence to optimal solutions for systems of even moderate complexity has remained prohibitive. The objective of this study was to develop a portable parallel version of a SA algorithm for solving optimization problems in biomechanics. The algorithm for simulated parallel annealing within a neighborhood (SPAN) was designed to minimize interprocessor communication time and closely retain the heuristics of the serial SA algorithm. The computational speed of the SPAN algorithm scaled linearly with the number of processors on different computer platforms for a simple quadratic test problem and for a more complex forward dynamic simulation of human pedaling.
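
    The serial SA heuristic that SPAN parallelizes can be sketched on the kind of simple quadratic test problem mentioned in the abstract. This is a generic Metropolis-style SA, not the SPAN code; the step size, cooling rate, and iteration budget are arbitrary choices, and the neighborhood decomposition across processors is not reproduced.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Serial SA sketch.  SPAN distributes the neighbourhood moves across
    processors, but the accept/reject heuristic below is unchanged."""
    rng = random.Random(seed)
    x, fx, t = list(x0), f(x0), t0
    best, fbest = list(x0), fx
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        # Metropolis criterion: always accept improvements, sometimes
        # accept uphill moves so the search can escape local minima
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# a simple quadratic test problem: minimum 0 at (1, 2, 3)
quad = lambda v: (v[0] - 1) ** 2 + (v[1] - 2) ** 2 + (v[2] - 3) ** 2
best, fbest = simulated_annealing(quad, [0.0, 0.0, 0.0])
```

    Because each candidate evaluation is independent, evaluating a neighborhood of candidates in parallel and synchronizing only on acceptance is what lets the SPAN wall-clock time scale roughly linearly with processor count.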

  11. Aeropropulsion 1987. Session 2: Aeropropulsion Structures Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Aeropropulsion systems present unique problems to the structural engineer. The extremes in operating temperatures, rotational effects, and behaviors of advanced material systems combine into complexities that require advances in many scientific disciplines involved in structural analysis and design procedures. This session provides an overview of the complexities of aeropropulsion structures and the theoretical, computational, and experimental research conducted to achieve the needed advances.

  12. Grid-converged solution and analysis of the unsteady viscous flow in a two-dimensional shock tube

    NASA Astrophysics Data System (ADS)

    Zhou, Guangzhao; Xu, Kun; Liu, Feng

    2018-01-01

    The flow in a shock tube is extremely complex with dynamic multi-scale structures of sharp fronts, flow separation, and vortices due to the interaction of the shock wave, the contact surface, and the boundary layer over the side wall of the tube. Prediction and understanding of the complex fluid dynamics are of theoretical and practical importance. It is also an extremely challenging problem for numerical simulation, especially at relatively high Reynolds numbers. Daru and Tenaud ["Evaluation of TVD high resolution schemes for unsteady viscous shocked flows," Comput. Fluids 30, 89-113 (2001)] proposed a two-dimensional model problem as a numerical test case for high-resolution schemes to simulate the flow field in a square closed shock tube. Though many researchers attempted this problem using a variety of computational methods, there is not yet an agreed-upon grid-converged solution of the problem at the Reynolds number of 1000. This paper presents a rigorous grid-convergence study and the resulting grid-converged solutions for this problem by using a newly developed, efficient, and high-order gas-kinetic scheme. Critical data extracted from the converged solutions are documented as benchmark data. The complex fluid dynamics of the flow at Re = 1000 are discussed and analyzed in detail. Major phenomena revealed by the numerical computations include the downward concentration of the fluid through the curved shock, the formation of the vortices, the mechanism of the shock wave bifurcation, the structure of the jet along the bottom wall, and the Kelvin-Helmholtz instability near the contact surface. Presentation and analysis of those flow processes provide important physical insight into the complex flow physics occurring in a shock tube.
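
    The grid-convergence methodology behind such a study can be illustrated with a standard Richardson-extrapolation utility: from solutions on three systematically refined grids one estimates the observed order of accuracy and the grid-converged value. This generic utility is not the authors' gas-kinetic scheme, and the numbers in the example are synthetic (generated from f(h) = 1 + 3h^2).

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Richardson-style grid-convergence estimate from three grids
    refined by a constant factor r: returns the observed order of
    accuracy p and the extrapolated (grid-converged) value."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    return p, f_exact

# synthetic data from f(h) = 1 + 3h^2 at h = 0.4, 0.2, 0.1
p, f_exact = observed_order(1.48, 1.12, 1.03)
```

    A benchmark claim of grid convergence rests on exactly this kind of check: the observed order should match the scheme's formal order, and the extracted quantities should stop changing under further refinement.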

  13. Prebiotic coordination chemistry: The potential role of transition-metal complexes in the chemical evolution

    NASA Technical Reports Server (NTRS)

    Beck, M.

    1979-01-01

    In approaching the extremely involved and complex problem of the origin of life, consideration of the coordination chemistry appeared not only as a possibility but as a necessity. The first model experiments appear to be promising because of prebiotic-type synthesis by means of transition-metal complexes. It is especially significant that in some instances various types of vitally important substances (nucleic bases, amino acids) are formed simultaneously. There is ground to hope that systematic studies in this field will clarify the role of transition-metal complexes in the organizatorial phase of chemical evolution. It is obvious that researchers working in the fields of the chemistry of cyano and carbonyl complexes, and of the catalytic effect of transition-metal complexes are best suited to study these aspects of the attractive and interesting problem of the origin of life.

  14. Measuring the Impact of Business Rules on Inventory Balancing

    DTIC Science & Technology

    2013-09-01

    The Navy ERP system enables inventory to be redistributed across sites to help maintain optimum inventory levels. Holding too much inventory is...not unique to the Navy. In fact, the complexity of this problem is only magnified for competitive firms that are hesitant to share sensitive data with...lateral transshipment problems makes finding an analytical solution extremely difficult. The strength of simulation models lies within their ability

  15. USE OF REVA'S WEB-BASED ENVIRONMENTAL DECISION TOOLKIT (EDT) TO ASSESS VULNERABILITY TO MERCURY ACROSS THE UNITED STATES

    EPA Science Inventory

    The problem of assessing risk from mercury across the nation is extremely complex involving integration of 1) our understanding of the methylation process in ecosystems, 2) the identification and spatial distribution of sensitive populations, and 3) the spatial pattern of mercury...

  16. ASSESSING THE RISK ASSOCIATED WITH MERCURY: USING REVA'S WEBTOOL TO COMPARE DATA, ASSUMPTIONS, AND MODELS

    EPA Science Inventory

    The problem of assessing risk from mercury across the nation is extremely complex involving integration of 1) our understanding of the methylation process in ecosystems, 2) the identification and spatial distribution of sensitive populations, and 3) the spatial pattern of mercury...

  17. Difficult Decisions Made Easier

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA missions are extremely complex and prone to sudden, catastrophic failure if equipment falters or if an unforeseen event occurs. For these reasons, NASA trains to expect the unexpected. It tests its equipment and systems in extreme conditions, and it develops risk-analysis tests to foresee any possible problems. The Space Agency recently worked with an industry partner to develop reliability analysis software capable of modeling complex, highly dynamic systems, taking into account variations in input parameters and the evolution of the system over the course of a mission. The goal of this research was multifold. It included performance and risk analyses of complex, multiphase missions, like the insertion of the Mars Reconnaissance Orbiter; reliability analyses of systems with redundant and/or repairable components; optimization analyses of system configurations with respect to cost and reliability; and sensitivity analyses to identify optimal areas for uncertainty reduction or performance enhancement.
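
    The reliability analysis of redundant components mentioned above can be sketched as a small Monte Carlo simulation. The single primary/backup pair and the failure probabilities below are invented for illustration and do not reflect the actual NASA tool.

```python
import random

def mission_success_prob(p_fail_primary, p_fail_backup,
                         n_trials=100_000, seed=1):
    """Monte Carlo reliability of one redundant pair: the mission fails
    only when the primary unit *and* its backup both fail.
    Analytically the success probability is 1 - p1 * p2."""
    rng = random.Random(seed)
    ok = sum(1 for _ in range(n_trials)
             if rng.random() >= p_fail_primary or rng.random() >= p_fail_backup)
    return ok / n_trials

est = mission_success_prob(0.1, 0.2)  # analytic answer: 1 - 0.02 = 0.98
```

    The value of the simulation approach over the closed form appears once components are repairable, mission phases change the failure rates, or inputs are uncertain, which is exactly the dynamic-system regime the software targets.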

  18. Evaluating teams in extreme environments: from issues to answers.

    PubMed

    Bishop, Sheryl L

    2004-07-01

    Effectively evaluating teams in extreme environments necessarily involves a wide range of physiological, psychological, and psychosocial factors. The high reliance on technology, the growing frequency of multinational and multicultural teams, and the demand for longer duration missions all further compound the complexity of the problem. The primary goal is to ensure human health and well-being, with the expectation that such priorities will naturally lead to improved chances for performance and mission success. This paper provides an overview of some of the most salient immediate challenges for selecting, training, and supporting teams in extreme environments, gives exemplars of research findings concerning these challenges, and discusses the need for future research.

  19. A NIST Kinetic Data Base for PAH Reaction and Soot Particle Inception During Combusion

    DTIC Science & Technology

    2007-12-01

    in Computational Fluid Dynamics (CFD) codes that have led to the capability of describing complex reactive flow problems and thus simulating... parameters . However in the absence of data estimates must be made. Since the chemistry of combustion is extremely complex and for proper description...118:381-389 9. Babushok, V. and Tsang, W., J. Prop. and Pwr . 20 (2004) 403-414. 10. . Fournet, R., Warth, V., Glaude, P.A., Battin-Leclerc, F

  20. Modeling of thermomechanical changes of extreme-ultraviolet mask and their dependence on absorber variation

    NASA Astrophysics Data System (ADS)

    Ban, Chung-Hyun; Park, Eun-Sang; Park, Jae-Hun; Oh, Hye-Keun

    2018-06-01

    Thermal and structural deformation of extreme-ultraviolet lithography (EUVL) masks during the exposure process may become important issues as these masks are subject to rigorous image placement and flatness requirements. The reflective masks used for EUVL absorb energy during exposure, and the temperature of the masks rises as a result. This can cause thermomechanical deformation that can reduce the pattern quality. The use of very thick low-thermal-expansion substrate materials (LTEMs) may reduce energy absorption, but they do not completely eliminate mask deformation. Therefore, it is necessary to predict and optimize the effects of energy transferred from the extreme-ultraviolet (EUV) light source and the resultant patterns of structured EUV masks with complex multilayers. Our study shows that heat accumulates in the masks as exposure progresses. It has been found that a higher absorber ratio (pattern density) applied to the patterning of EUV masks exacerbates the problem, especially in masks with more complex patterns.

  1. Information and knowledge management in support of sustainable forestry: a review

    Treesearch

    H. Michael Rauscher; Daniel L. Schmoldt; Harald Vacik

    2007-01-01

    For individuals, organizations and nations, success and even survival depend upon making good decisions. Doing so can be extremely difficult when problems are not well structured and situations are complex, as they are for natural resource management. Recent advances in computer technology coupled with the increase in accessibility brought about by the...

  2. Ready, Aim, Perform! Targeted Micro-Training for Performance Intervention

    ERIC Educational Resources Information Center

    Carpenter, Julia; Forde, Dahlia S.; Stevens, Denise R.; Flango, Vincent; Babcock, Lisa K.

    2016-01-01

    The Department of Veterans Affairs has an immediate problem at hand. Tens of thousands of employees are working in a high-stress work environment where fast-paced daily production requirements are critical. Employees are faced with a tremendous backlog of veterans' claims. Unfortunately, not only are the claims extremely complex, but there is…

  3. Mental Images and the Modification of Learning Defects.

    ERIC Educational Resources Information Center

    Patten, Bernard M.

    Because human memory and thought involve extremely complex processes, it is possible to employ unusual modalities and specific visual strategies for remembering and problem-solving to assist patients with memory defects. This three-part paper discusses some of the research in the field of human memory and describes practical applications of these…

  4. Dannie Heineman Prize for Mathematical Physics: Applying mathematical techniques to solve important problems in quantum theory

    NASA Astrophysics Data System (ADS)

    Bender, Carl

    2017-01-01

    The theory of complex variables is extremely useful because it helps to explain the mathematical behavior of functions of a real variable. Complex variable theory also provides insight into the nature of physical theories. For example, it provides a simple and beautiful picture of quantization and it explains the underlying reason for the divergence of perturbation theory. By using complex-variable methods one can generalize conventional Hermitian quantum theories into the complex domain. The result is a new class of parity-time-symmetric (PT-symmetric) theories whose remarkable physical properties have been studied and verified in many recent laboratory experiments.
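
    The hallmark of unbroken PT symmetry, a real spectrum from a non-Hermitian operator, can be seen already in a 2x2 matrix model. This is a standard textbook-style example, not taken from the lecture itself.

```python
import cmath

def pt_eigenvalues(a, b, s):
    """Eigenvalues of H = [[a+ib, s], [s, a-ib]], which is not Hermitian
    but is PT-symmetric.  From trace 2a and determinant a^2 + b^2 - s^2:
        lambda = a +/- sqrt(s^2 - b^2),
    so the spectrum is entirely real whenever s^2 >= b^2 (unbroken PT
    symmetry) and forms a complex-conjugate pair when s^2 < b^2
    (broken PT symmetry)."""
    d = cmath.sqrt(s * s - b * b)
    return a + d, a - d

unbroken = pt_eigenvalues(1.0, 0.3, 0.5)  # real pair: 1 +/- 0.4
broken = pt_eigenvalues(1.0, 0.5, 0.3)    # complex pair: 1 +/- 0.4i
```

    Tuning b through the threshold |s| is a toy version of the PT-symmetry-breaking transition observed in the laboratory experiments the abstract mentions.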

  5. Plastic Surgery Challenges in War Wounded I: Flap-Based Extremity Reconstruction

    PubMed Central

    Sabino, Jennifer M.; Slater, Julia; Valerio, Ian L.

    2016-01-01

    Scope and Significance: Reconstruction of traumatic injuries requiring tissue transfer begins with aggressive resuscitation and stabilization. Systematic advances in acute casualty care at the point of injury have improved survival and allowed for increasingly complex treatment before definitive reconstruction at tertiary medical facilities outside the combat zone. As a result, the complexity of the limb salvage algorithm has increased over 14 years of combat activities in Iraq and Afghanistan. Problem: Severe poly-extremity trauma in combat casualties has led to a large number of extremity salvage cases. Advanced reconstructive techniques coupled with regenerative medicine applications have played a critical role in the restoration, recovery, and rehabilitation of functional limb salvage. Translational Relevance: The past 14 years of war trauma have increased our understanding of tissue transfer for extremity reconstruction in the treatment of combat casualties. Injury patterns, flap choice, and reconstruction timing are critical variables to consider for optimal outcomes. Clinical Relevance: Subacute reconstruction with specifically chosen flap tissue and donor site location based on individual injuries result in successful tissue transfer, even in critically injured patients. These considerations can be combined with regenerative therapies to optimize massive wound coverage and limb salvage form and function in previously active patients. Summary: Traditional soft tissue reconstruction is integral in the treatment of war extremity trauma. Pedicle and free flaps are a critically important part of the reconstructive ladder for salvaging extreme extremity injuries that are seen as a result of the current practice of war. PMID:27679751

  6. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biros, George

    Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model as well as the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models. We propose to develop the mathematical tools to address these challenges in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we will create OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin's own 10 petaflops Stampede system, ANL's Mira system, and ORNL's Titan system. While our focus is on fundamental mathematical/computational methods and algorithms, we will assess our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
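
    A toy random-walk Metropolis sampler shows the generic MCMC building block that the structure-exploiting samplers above are designed to replace at extreme scale. The one-parameter Gaussian inverse problem here is purely illustrative; it has a closed-form posterior, N(1.6, 0.2), which makes the sketch easy to check.

```python
import math
import random

def metropolis(logpost, x0=0.0, step=0.5, n=20000, seed=0):
    """Plain random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, posterior ratio).  Structure-exploiting methods
    such as randomized maximum likelihood replace this generic proposal,
    which scales poorly in high dimensions."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(n):
        cand = x + rng.gauss(0.0, step)
        lc = logpost(cand)
        if rng.random() < math.exp(min(0.0, lc - lp)):
            x, lp = cand, lc
        chain.append(x)
    return chain

# toy Bayesian inversion: prior m ~ N(0, 1), datum y = 2 observed with
# noise sigma = 0.5 through the identity map => posterior is N(1.6, 0.2)
logpost = lambda m: -0.5 * m * m - 0.5 * (2.0 - m) ** 2 / 0.25
chain = metropolis(logpost)
post_mean = sum(chain[5000:]) / len(chain[5000:])
```

    For a PDE-constrained problem every call to logpost is a forward solve, which is exactly why the project replaces this generic random walk with samplers that exploit the parameter-to-observable map.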

  7. A fast isogeometric BEM for the three dimensional Laplace- and Helmholtz problems

    NASA Astrophysics Data System (ADS)

    Dölz, Jürgen; Harbrecht, Helmut; Kurz, Stefan; Schöps, Sebastian; Wolf, Felix

    2018-03-01

    We present an indirect higher order boundary element method utilising NURBS mappings for exact geometry representation and an interpolation-based fast multipole method for compression and reduction of computational complexity, to counteract the problems arising due to the dense matrices produced by boundary element methods. By solving Laplace and Helmholtz problems via a single layer approach we show, through a series of numerical examples suitable for easy comparison with other numerical schemes, that one can indeed achieve extremely high rates of convergence of the pointwise potential through the utilisation of higher order B-spline-based ansatz functions.

  8. Rahman Prize Lecture: Lattice Boltzmann simulation of complex states of flowing matter

    NASA Astrophysics Data System (ADS)

    Succi, Sauro

    Over the last three decades, the Lattice Boltzmann (LB) method has gained a prominent role in the numerical simulation of complex flows across an impressively broad range of scales, from fully-developed turbulence in real-life geometries, to multiphase flows in micro-fluidic devices, all the way down to biopolymer translocation in nanopores and lately, even quark-gluon plasmas. After a brief introduction to the main ideas behind the LB method and its historical developments, we shall present a few selected applications to complex flow problems at various scales of motion. Finally, we shall discuss prospects for extreme-scale LB simulations of outstanding problems in the physics of fluids and its interfaces with material sciences and biology, such as the modelling of fluid turbulence, the optimal design of nanoporous gold catalysts and protein folding/aggregation in crowded environments.

  9. Towards a unified study of extreme events using universality concepts and transdisciplinary analysis methods

    NASA Astrophysics Data System (ADS)

    Balasis, George; Donner, Reik V.; Donges, Jonathan F.; Radebach, Alexander; Eftaxias, Konstantinos; Kurths, Jürgen

    2013-04-01

    The dynamics of many complex systems is characterized by the same universal principles. In particular, systems which are otherwise quite different in nature show striking similarities in their behavior near tipping points (bifurcations, phase transitions, sudden regime shifts) and associated extreme events. Such critical phenomena are frequently found in diverse fields such as climate, seismology, or financial markets. Notably, the observed similarities include a high degree of organization, persistent behavior, and accelerated energy release, which are common to (among others) phenomena related to geomagnetic variability of the terrestrial magnetosphere (intense magnetic storms), seismic activity (electromagnetic emissions prior to earthquakes), solar-terrestrial physics (solar flares), neurophysiology (epileptic seizures), and socioeconomic systems (stock market crashes). It is an open question whether the spatial and temporal complexity associated with extreme events arises from the system's structural organization (geometry) or from the chaotic behavior inherent to the nonlinear equations governing the dynamics of these phenomena. On the one hand, the presence of scaling laws associated with earthquakes and geomagnetic disturbances suggests understanding these events as generalized phase transitions similar to nucleation and critical phenomena in thermal and magnetic systems. On the other hand, because of the structural organization of the systems (e.g., as complex networks) the associated spatial geometry and/or topology of interactions plays a fundamental role in the emergence of extreme events. Here, a few aspects of the interplay between geometry and dynamics (critical phase transitions) that could result in the emergence of extreme events, which is an open problem, will be discussed.
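
    One widely used fingerprint of an approaching tipping point, rising lag-1 autocorrelation ("critical slowing down"), can be demonstrated with a linear AR(1) toy model. The model and parameter values are illustrative, not taken from the abstract.

```python
import random

def ar1_series(phi, n=5000, seed=0):
    """Linearised fluctuations around an equilibrium:
    x_{t+1} = phi * x_t + noise.  The recovery rate is 1 - phi, so
    phi -> 1 mimics the approach to a tipping point and makes the
    fluctuations persistent."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def lag1_autocorr(xs):
    m = sum(xs) / len(xs)
    num = sum((xs[t] - m) * (xs[t + 1] - m) for t in range(len(xs) - 1))
    return num / sum((x - m) ** 2 for x in xs)

far = lag1_autocorr(ar1_series(0.2))   # far from the tipping point
near = lag1_autocorr(ar1_series(0.9))  # close to it: memory builds up
```

    Such early-warning indicators capture only the dynamical side of the interplay discussed above; the structural side (network geometry and topology) requires the complex-network machinery in its own right.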

  10. Adaptive algorithm of selecting optimal variant of errors detection system for digital means of automation facility of oil and gas complex

    NASA Astrophysics Data System (ADS)

    Poluyan, A. Y.; Fugarov, D. D.; Purchina, O. A.; Nesterchuk, V. V.; Smirnova, O. V.; Petrenkova, S. B.

    2018-05-01

    To date, problems associated with the detection of errors in the digital equipment (DE) systems that automate explosive facilities of the oil and gas complex are extremely pressing. The problem is especially acute for facilities where a malfunction of the DE would inevitably lead to man-made disasters and substantial material damage; at such facilities, diagnostics of correct DE operation is one of the main elements of the industrial safety management system. This work addresses the problem of selecting the optimal variant of the error detection system according to a validation criterion. Known methods for solving such problems have exponential time complexity. Therefore, to reduce the solution time, an adaptive bionic algorithm is constructed for the validation criterion. Bionic algorithms (BA) have proven effective in solving optimization problems; their advantages include adaptability, learning ability, parallelism, and the ability to build hybrid systems by combining them [1].

  11. The Evolution of Biological Complexity in Digital Organisms

    NASA Astrophysics Data System (ADS)

    Ofria, Charles

    2013-03-01

    When Darwin first proposed his theory of evolution by natural selection, he realized that it had a problem explaining the origins of traits of ``extreme perfection and complication'' such as the vertebrate eye. Critics of Darwin's theory have latched onto this perceived flaw as a proof that Darwinian evolution is impossible. In anticipation of this issue, Darwin described the perfect data needed to understand this process, but lamented that such data are ``scarcely ever possible'' to obtain. In this talk, I will discuss research where we use populations of digital organisms (self-replicating and evolving computer programs) to elucidate the genetic and evolutionary processes by which new, highly complex traits arise, drawing inspiration directly from Darwin's wistful thinking and hypotheses. During the process of evolution in these fully transparent computational environments, we can measure the incorporation of new information into the genome, a process akin to a natural Maxwell's demon, and identify the original source of any such information. We show that, as Darwin predicted, much of the information used to encode a complex trait was already in the genome as part of simpler evolved traits, and that many routes must be possible for a new complex trait to have a high probability of successfully evolving. In even more extreme examples of the evolution of complexity, we are now using these same principles to examine the evolutionary dynamics that drive major transitions in evolution; that is, transitions to higher levels of organization, which are among the most complex evolutionary events to occur in nature. Finally, I will explore some of the implications of this research for other aspects of evolutionary biology, as well as ways that these evolutionary principles can be applied toward solving computational and engineering problems.

  12. Improved mine blast algorithm for optimal cost design of water distribution systems

    NASA Astrophysics Data System (ADS)

    Sadollah, Ali; Guen Yoo, Do; Kim, Joong Hoon

    2015-12-01

    The design of water distribution systems is a large class of combinatorial, nonlinear optimization problems with complex constraints such as conservation of mass and energy equations. Since feasible solutions are often extremely complex, traditional optimization techniques are insufficient. Recently, metaheuristic algorithms have been applied to this class of problems because they are highly efficient. In this article, a recently developed optimizer called the mine blast algorithm (MBA) is considered. The MBA is improved and coupled with the hydraulic simulator EPANET to find the optimal cost design for water distribution systems. The performance of the improved mine blast algorithm (IMBA) is demonstrated using the well-known Hanoi, New York tunnels and Balerma benchmark networks. Optimization results obtained using IMBA are compared to those using MBA and other optimizers in terms of their minimum construction costs and convergence rates. For the complex Balerma network, IMBA offers the cheapest network design compared to other optimization algorithms.

  13. Everyday Expertise in Self-Management of Diabetes in the Dominican Republic: Implications for Learning and Performance Support Systems Design

    ERIC Educational Resources Information Center

    Reyes Paulino, Lisette G.

    2012-01-01

    An epidemic such as diabetes is an extremely complex public health, economic and social problem that is difficult to solve through medical expertise alone. Evidence-based models for improving healthcare delivery systems advocate educating patients to become more active participants in their own care. This shift demands preparing chronically ill…

  14. Getting Alice through the door: social science research and natural resource management

    Treesearch

    Alan W. Ewert

    1995-01-01

    A number of trends are altering the role of science in natural resource management. These trends include the growing political power of science, the recognition that most natural resource problems are extremely complex and not prone to uni-dimensional solutions, and the increasing need to integrate an understanding of the human component into the planning and decision-...

  15. Contamination of the freshwater ecosystem by pesticides

    USGS Publications Warehouse

    Cope, Oliver B.

    1966-01-01

    A large part of our disquieting present-day pesticide problem is intimately tied to the freshwater ecosystem. Economic poisons are used in so many types of terrain to control so many kinds of organisms that almost all lakes and streams are likely to be contaminated. In addition to accidental contamination, many pesticides are deliberately applied directly to fresh waters for suppression of aquatic animals or plants. The problem is intensified because of the extreme susceptibility of freshwater organisms. The complexity and variety of freshwater environments make it difficult to comprehend the total effect of pesticides.

  16. Theoretical study of the effects of refraction on the noise produced by turbulence in jets

    NASA Technical Reports Server (NTRS)

    Graham, E. W.; Graham, B. B.

    1974-01-01

    The production of noise by turbulence in jets is an extremely complex problem. One aspect of that problem, the transmission of acoustic disturbances from the interior of the jet through the mean velocity profile and into the far field, is studied. The jet (two-dimensional or circular cylindrical) is assumed infinitely long, with a mean velocity profile independent of streamwise location. The noise generator is a sequence of transient sources drifting with the surrounding fluid and confined to a short length of the jet.

  17. Towards large scale multi-target tracking

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong; Reuter, Stephan; Lam, Quang; Dietmayer, Klaus

    2014-06-01

    Multi-target tracking is intrinsically an NP-hard problem, and the complexity of multi-target tracking solutions usually does not scale gracefully with problem size. Multi-target tracking for on-line applications involving a large number of targets is extremely challenging. This article demonstrates the capability of the random finite set approach to provide large-scale multi-target tracking algorithms. In particular, it is shown that an approximate filter known as the labeled multi-Bernoulli filter can simultaneously track one thousand five hundred targets in clutter on a standard laptop computer.

  18. New Dandelion Algorithm Optimizes Extreme Learning Machine for Biomedical Classification Problems

    PubMed Central

    Li, Xiguang; Zhao, Liang; Gong, Changqing; Liu, Xiaojing

    2017-01-01

    Inspired by the sowing behavior of dandelions, a novel swarm intelligence algorithm, the dandelion algorithm (DA), is proposed for global optimization of complex functions in this paper. In DA, the dandelion population is divided into two subpopulations, and the different subpopulations undergo different sowing behaviors. Moreover, another sowing method is designed to escape local optima. In order to demonstrate the validity of DA, we compare the proposed algorithm with existing algorithms, including the bat algorithm, particle swarm optimization, and the enhanced fireworks algorithm. Simulations show that the proposed algorithm performs substantially better than the other algorithms. The proposed algorithm is then applied to optimize an extreme learning machine (ELM) for biomedical classification problems, with considerable effect. Finally, we use different fusion methods to form different fusion classifiers, and the fusion classifiers achieve higher accuracy and better stability to some extent. PMID:29085425
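    The abstract describes DA's mechanism only in outline. The following is a hedged sketch of that general scheme — two subpopulations with different sowing radii plus an occasional random restart to escape local optima — not the authors' exact algorithm; the objective function, subpopulation split, sowing radii and restart rule are all illustrative assumptions.

```python
import random

def sphere(x):
    """Toy objective to minimize (global minimum 0 at the origin)."""
    return sum(v * v for v in x)

def dandelion_sketch(f, dim=5, pop_size=30, iters=200, bounds=(-5.0, 5.0), seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(iters):
        pop.sort(key=f)
        core, assistant = pop[: pop_size // 2], pop[pop_size // 2:]
        new = []
        # Core subpopulation: sow seeds close to the better solutions.
        for s in core:
            new.append([min(hi, max(lo, v + rng.gauss(0, 0.1))) for v in s])
        # Assistant subpopulation: sow seeds more widely (exploration).
        for s in assistant:
            new.append([min(hi, max(lo, v + rng.gauss(0, 1.0))) for v in s])
        # Escape sowing: occasionally replace the worst seed at random.
        if rng.random() < 0.2:
            new[-1] = [rng.uniform(lo, hi) for _ in range(dim)]
        pop = new
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand
    return best

best = dandelion_sketch(sphere)
```

    The two sowing radii play the exploitation/exploration roles the abstract attributes to the two subpopulations, while the random replacement stands in for the extra sowing method used to jump out of local optima.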

  19. A complex multi-notch astronomical filter to suppress the bright infrared sky.

    PubMed

    Bland-Hawthorn, J; Ellis, S C; Leon-Saval, S G; Haynes, R; Roth, M M; Löhmannsröben, H-G; Horton, A J; Cuby, J-G; Birks, T A; Lawrence, J S; Gillingham, P; Ryder, S D; Trinh, C

    2011-12-06

    A long-standing and profound problem in astronomy is the difficulty in obtaining deep near-infrared observations due to the extreme brightness and variability of the night sky at these wavelengths. A solution to this problem is crucial if we are to obtain the deepest possible observations of the early Universe, as redshifted starlight from distant galaxies appears at these wavelengths. The atmospheric emission between 1,000 and 1,800 nm arises almost entirely from a forest of extremely bright, very narrow hydroxyl emission lines that varies on timescales of minutes. The astronomical community has long envisaged the prospect of selectively removing these lines, while retaining high throughput between them. Here we demonstrate such a filter for the first time, presenting results from the first on-sky tests. Its use on current 8 m telescopes and future 30 m telescopes will open up many new research avenues in the years to come.

  20. Crew collaboration in space: a naturalistic decision-making perspective

    NASA Technical Reports Server (NTRS)

    Orasanu, Judith

    2005-01-01

    Successful long-duration space missions will depend on the ability of crewmembers to respond promptly and effectively to unanticipated problems that arise under highly stressful conditions. Naturalistic decision making (NDM) exploits the knowledge and experience of decision makers in meaningful work domains, especially complex sociotechnical systems, including aviation and space. Decision making in these ambiguous, dynamic, high-risk environments is a complex task that involves defining the nature of the problem and crafting a response to achieve one's goals. Goal conflicts, time pressures, and uncertain outcomes may further complicate the process. This paper reviews theory and research pertaining to the NDM model and traces some of the implications for space crews and other groups that perform meaningful work in extreme environments. It concludes with specific recommendations for preparing exploration crews to use NDM effectively.

  1. Mexican Hat Wavelet Kernel ELM for Multiclass Classification.

    PubMed

    Wang, Jie; Song, Yi-Fan; Ma, Tian-Lei

    2017-01-01

    Kernel extreme learning machine (KELM) is a novel feedforward neural network widely used in classification problems. It partially addresses ELM's existing problems of invalid hidden nodes and large computational complexity. However, the traditional KELM classifier usually has low test accuracy on multiclass classification problems. In order to solve this problem, a new classifier, the Mexican Hat wavelet KELM classifier, is proposed in this paper. The proposed classifier successfully improves the training accuracy and reduces the training time in multiclass classification problems. Moreover, the validity of the Mexican Hat wavelet as a kernel function of ELM is rigorously proved. Experimental results on different data sets show that the performance of the proposed classifier is significantly superior to that of the compared classifiers.
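    As context, a kernel ELM trains by solving a single regularized linear system for the output weights, beta = (K + I/C)^-1 T, where K is the kernel matrix and T the one-hot targets. The sketch below uses one common construction of a Mexican Hat wavelet kernel, a product over dimensions of the mother wavelet psi(u) = (1 - u^2) exp(-u^2/2); the kernel form and the parameters a and C are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import numpy as np

def mexican_hat_kernel(X, Y, a=1.0):
    """K[i, j] = prod_d psi((X[i,d] - Y[j,d]) / a), psi(u) = (1 - u^2) exp(-u^2/2)."""
    U = (X[:, None, :] - Y[None, :, :]) / a
    return np.prod((1.0 - U ** 2) * np.exp(-U ** 2 / 2.0), axis=-1)

def kelm_train(X, y, n_classes, C=100.0, a=1.0):
    """Solve (K + I/C) beta = T for the output weights, with one-hot targets T."""
    T = np.eye(n_classes)[y]
    K = mexican_hat_kernel(X, X, a)
    return np.linalg.solve(K + np.eye(len(X)) / C, T)

def kelm_predict(X_train, beta, X_new, a=1.0):
    """Class with the largest output of k(x, X_train) @ beta."""
    return np.argmax(mexican_hat_kernel(X_new, X_train, a) @ beta, axis=1)

# Toy multiclass problem: three well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(c, 0.1, size=(20, 2)) for c in ((0, 0), (2, 0), (0, 2))])
y = np.repeat([0, 1, 2], 20)
beta = kelm_train(X, y, n_classes=3)
pred = kelm_predict(X, beta, X)
```

    Since the per-dimension wavelet has a nonnegative Fourier transform, the product kernel is positive semidefinite, so the regularized solve is well posed.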

  2. Min-Max Spaces and Complexity Reduction in Min-Max Expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaubert, Stephane, E-mail: Stephane.Gaubert@inria.fr; McEneaney, William M., E-mail: wmceneaney@ucsd.edu

    2012-06-15

    Idempotent methods have been found to be extremely helpful in the numerical solution of certain classes of nonlinear control problems. In those methods, one uses the fact that the value function lies in the space of semiconvex functions (in the case of maximizing controllers), and approximates this value using a truncated max-plus basis expansion. In some classes, the value function is actually convex, and then one specifically approximates with suprema (i.e., max-plus sums) of affine functions. Note that the space of convex functions is a max-plus linear space, or moduloid. In extending those concepts to game problems, one finds a different function space, and different algebra, to be appropriate. Here we consider functions which may be represented using infima (i.e., min-max sums) of max-plus affine functions. It is natural to refer to the class of functions so represented as the min-max linear space (or moduloid) of max-plus hypo-convex functions. We examine this space, the associated notion of duality and min-max basis expansions. In using these methods for solution of control problems, and now games, a critical step is complexity-reduction. In particular, one needs to find reduced-complexity expansions which approximate the function as well as possible. We obtain a solution to this complexity-reduction problem in the case of min-max expansions.
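    For contrast, the two expansion types can be written out schematically. In the convex (max-plus) case the value function is approximated by a supremum of affine functions; the min-max case replaces this with an infimum over max-of-affine elements. The parametrization below is a generic schematic of such expansions, not the paper's exact notation.

```latex
% Max-plus expansion (convex value function):
V(x) \;\approx\; \max_{1 \le i \le N} \bigl\{ a_i^{\top} x + b_i \bigr\}

% Min-max expansion (max-plus hypo-convex value function):
V(x) \;\approx\; \min_{1 \le i \le N} \, \max_{1 \le j \le M}
      \bigl\{ a_{ij}^{\top} x + b_{ij} \bigr\}
```

    In this picture, complexity reduction amounts to pruning index pairs (i, j) whose terms never attain the min-max, so that a smaller expansion approximates V as well as possible.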

  3. Approximation of Nash equilibria and the network community structure detection problem

    PubMed Central

    2017-01-01

    Game-theory-based methods designed to solve the problem of community structure detection in complex networks have emerged in recent years as an alternative to classical and optimization-based approaches. Mixed Nash Extremal Optimization uses a generative relation for the characterization of Nash equilibria to identify the community structure of a network by converting the problem into a non-cooperative game. This paper proposes a method to enhance this algorithm by reducing the number of payoff function evaluations. Numerical experiments performed on synthetic and real-world networks show that this approach is efficient, with results as good as or better than other state-of-the-art methods. PMID:28467496

  4. Reaching out to take on TB in Somalia.

    PubMed

    Moore, David A J; Granat, Simo M

    2014-01-01

    Among the many challenges facing populations disrupted by complex emergencies, personal security and food security rank much higher than access to healthcare. However, over time health needs assume increasing importance. Many complex crises occur in settings where the background incidence of TB is already high; social and economic conditions in crises are then highly conducive to amplification of the existing TB problem. Innovative approaches to delivery of diagnostic and treatment services, transition planning and integration with other healthcare providers and services are vital. In the extremely challenging environment of Somalia, multiple partners are making headway through collaboration and innovation.

  5. Rare events modeling with support vector machine: Application to forecasting large-amplitude geomagnetic substorms and extreme events in financial markets.

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, V. V.; Ganguli, S. B.

    2001-12-01

    Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate ones. Support vector machine (SVM) is a machine learning system that can provide optimal generalization from very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make SVM suitable for modeling rare events in some applications. We have applied an SVM-based system to the problems of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
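    As a hedged illustration of biasing an SVM toward a rare class (the authors' actual formulation and data are not reproduced here), the sketch below runs stochastic subgradient descent on a class-weighted hinge loss with L2 regularization; the toy data, class weights and learning rate are invented for illustration.

```python
import random

def weighted_linear_svm(data, labels, class_weight, lam=0.01, eta=0.01, epochs=500, seed=0):
    """SGD on the weighted soft-margin objective
        lam/2 * ||w||^2 + (1/n) * sum_i c(y_i) * max(0, 1 - y_i * (w.x_i + b)),
    where c(y) up-weights hinge violations for the rare class (labels are +1/-1)."""
    rng = random.Random(seed)
    dim = len(data[0])
    w, b = [0.0] * dim, 0.0
    idx = list(range(len(data)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            x, y = data[i], labels[i]
            margin = y * (sum(wj * xj for wj, xj in zip(w, x)) + b)
            w = [wj * (1.0 - eta * lam) for wj in w]   # L2 shrinkage
            if margin < 1:                             # hinge subgradient step
                c = class_weight[y]
                w = [wj + eta * c * y * xj for wj, xj in zip(w, x)]
                b += eta * c * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Imbalanced toy data: 5 "rare events" vs 50 background points.
rng = random.Random(42)
rare = [[rng.gauss(3, 0.3), rng.gauss(3, 0.3)] for _ in range(5)]
common = [[rng.gauss(0, 0.7), rng.gauss(0, 0.7)] for _ in range(50)]
X, y = rare + common, [1] * 5 + [-1] * 50
w, b = weighted_linear_svm(X, y, class_weight={1: 10.0, -1: 1.0})
```

    The 10:1 weight on the rare class stands in for the generalization behavior the abstract attributes to SVMs on limited, imbalanced data: the decision boundary is pushed away from the few rare-event samples rather than sacrificing them to the majority class.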

  6. Dynamic pathway modeling of signal transduction networks: a domain-oriented approach.

    PubMed

    Conzelmann, Holger; Gilles, Ernst-Dieter

    2008-01-01

    Mathematical models of biological processes are becoming more and more important in biology. The aim is a holistic understanding of how processes such as cellular communication, cell division, regulation, homeostasis, or adaptation work, how they are regulated, and how they react to perturbations. The great complexity of most of these processes necessitates the generation of mathematical models in order to address these questions. In this chapter we provide an introduction to basic principles of dynamic modeling and highlight both problems and opportunities of dynamic modeling in biology. The main focus will be on the modeling of signal transduction pathways, which requires the application of a special modeling approach. A common pattern, especially in eukaryotic signaling systems, is the formation of multiprotein signaling complexes. Even for a small number of interacting proteins, the number of distinguishable molecular species can be extremely high. This combinatorial complexity is due to the great number of distinct binding domains of many receptors and scaffold proteins involved in signal transduction. However, these problems can be overcome using a new domain-oriented modeling approach, which makes it possible to handle complex and branched signaling pathways.

  7. Overwintering of herbaceous plants in a changing climate. Still more questions than answers.

    PubMed

    Rapacz, Marcin; Ergon, Ashild; Höglind, Mats; Jørgensen, Marit; Jurczyk, Barbara; Ostrem, Liv; Rognli, Odd Arne; Tronsmo, Anne Marte

    2014-08-01

    The increase in the surface temperature of the Earth indicates a lower risk of exposure of temperate grasslands and crops to extremely low temperatures. However, the risk of poor winter survival, especially at higher latitudes, may not be smaller, due to complex interactions among different environmental factors. For example, the frequency, intensity and duration of extreme winter warming events leading to snowmelt during winter have increased, raising the risks of anoxia, ice encasement and freezing of plants not covered with snow. Future climate projections suggest that cold acclimation will occur later in autumn, under shorter photoperiods and lower light intensity, which may affect the energy partitioning between elongation growth, accumulation of organic reserves and cold acclimation. Rising CO2 levels may also disturb the cold acclimation process. Predicting problems with winter pathogens is also very complex, because climate change may greatly influence pathogen populations and because plant resistance to these pathogens is increased by cold acclimation. All these factors, often with contradictory effects on winter survival, make plant overwintering viability under future climates an open question. Close cooperation between climatologists, ecologists, plant physiologists, geneticists and plant breeders is strongly required to predict and prevent possible problems. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  8. Comparing the basins of attraction for several methods in the circular Sitnikov problem with spheroid primaries

    NASA Astrophysics Data System (ADS)

    Zotos, Euaggelos E.

    2018-06-01

    The circular Sitnikov problem, where the two primary bodies are prolate or oblate spheroids, is numerically investigated. In particular, the basins of convergence on the complex plane are revealed by using a large collection of numerical methods of various orders. We consider four cases, regarding the value of the oblateness coefficient, which determines the nature of the roots (attractors) of the system. For all cases we use the iterative schemes to perform a thorough and systematic classification of the nodes on the complex plane. The distributions of the required iterations and of the probability, together with their correlations with the corresponding basins of convergence, are also discussed. Our numerical computations indicate that most of the iterative schemes produce relatively similar convergence structures on the complex plane. However, for some numerical methods the corresponding basins of attraction are extremely complicated, with highly fractal basin boundaries. Moreover, it is proved that the efficiency varies strongly among the numerical methods.
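    The kind of complex-plane classification described here can be illustrated with the classical Newton-Raphson scheme applied to a simple polynomial; p(z) = z^3 - 1 below is a stand-in for illustration only (the paper's Sitnikov root functions are not reproduced), but it exhibits the same fractal basin boundaries.

```python
import cmath

def newton_basin(z0, max_iter=60, tol=1e-10):
    """Iterate Newton's method for p(z) = z**3 - 1 from z0.

    Returns (root_index, iterations), where root_index identifies which
    of the three cube roots of unity the iteration converged to,
    or (None, max_iter) on non-convergence."""
    roots = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]
    z = z0
    for n in range(max_iter):
        dp = 3 * z ** 2
        if dp == 0:                      # derivative vanishes: scheme breaks down
            return None, max_iter
        z = z - (z ** 3 - 1) / dp        # Newton step
        for k, r in enumerate(roots):
            if abs(z - r) < tol:
                return k, n + 1
    return None, max_iter

# Classify a grid of starting points (nodes); the boundaries separating the
# three basins form the fractal structure discussed in the abstract.
grid = [complex(x / 10, y / 10) for x in range(-20, 21) for y in range(-20, 21)]
labels = [newton_basin(z)[0] for z in grid if z != 0]
```

    Recording the second element of each tuple as well gives the distribution of required iterations over the grid, mirroring the statistics reported in the paper.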

  9. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.

  10. Physician decision-making in the management of work related upper extremity injuries.

    PubMed

    Szekeres, Mike; Macdermid, Joy C; Katchky, Adam; Grewal, Ruby

    2018-05-22

    Physicians working in a tertiary care injured worker clinic are faced with clinical decision-making that must balance the needs of patients and society in managing complex clinical problems that are complicated by the work-workplace context. The purpose of this study is to describe and characterize the decision-making process of upper extremity specialized surgeons when managing injured workers within a specialized worker's compensation clinic. Surgeons were interviewed in a semi-structured manner. Following each interview, the surgeon was also observed in a clinic visit during a new patient assessment, allowing observation of the interactional patterns between surgeon and patient, and comparison of the process described in the interview to what actually occurred during clinic visits. The primary central theme emerging from the surgeon interviews and the clinical observation was the focus on the importance of comprehensive assessment to make the first critical decision: an accurate diagnosis. Two subthemes were also found. The first of these involved the decision whether to proceed to management strategies or to continue with further investigation if the correct diagnosis is uncertain. Once the central theme of diagnosis was achieved, a second subtheme was highlighted; selecting appropriate management options, given the complexities of managing the injured worker, the workplace, and the compensation board. This study illustrates that upper extremity surgeons rely on their training and experience with upper extremity conditions to follow a sequential but iterative decision-making process to provide a more definitive diagnosis and treatment plan for workers with injuries that are often complex. The surgeons are challenged by the context which takes them out of their familiar zone of typical clinical practice to deal with the interactions between the injury, worker, work, workplace and insurer.

  11. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, along with the impact of the operator's effectiveness on the system as a whole. The necessity of creating complex simulation models reflecting the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. The main subsystems of the algorithmic model of the human as a control system are implemented in software, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of the unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the MATLAB/Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed in simplified form as an automatic control system consisting of three main interconnected subsystems: sensory organs (perception sensors); the central nervous system; and executive organs (muscles of the arms, legs, and back). A theoretical data model for predicting the operator's information load in ergatic systems is proposed, allowing assessment and prediction of the effectiveness of a real working operator. A simulation model of the operator's activity during takeoff, based on Petri nets, has been synthesized.

  12. A Fiducial Approach to Extremes and Multiple Comparisons

    ERIC Educational Resources Information Center

    Wandler, Damian V.

    2010-01-01

    Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem deals with the generalized Pareto distribution. The generalized Pareto…

  13. Complex extreme learning machine applications in terahertz pulsed signals feature sets.

    PubMed

    Yin, X-X; Hadjiloucas, S; Zhang, Y

    2014-11-01

    This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Control and instanton trajectories for random transitions in turbulent flows

    NASA Astrophysics Data System (ADS)

    Bouchet, Freddy; Laurie, Jason; Zaboronski, Oleg

    2011-12-01

    Many turbulent systems exhibit random switches between qualitatively different attractors. The transition between these bistable states is often an extremely rare event that cannot be computed through direct numerical simulation (DNS) due to its computational cost. We present results for the calculation of instanton trajectories (a control problem) between non-equilibrium stationary states (attractors) in the 2D stochastic Navier-Stokes equations. By representing the transition probability between two states using a path integral formulation, we can compute the most probable trajectory (instanton) joining two non-equilibrium stationary states. Technically, this is equivalent to the minimization of an action, which can be related to a fluid mechanics control problem.

  15. Network community-detection enhancement by proper weighting

    NASA Astrophysics Data System (ADS)

    Khadivi, Alireza; Ajdari Rad, Ali; Hasler, Martin

    2011-04-01

    In this paper, we show how proper assignment of weights to the edges of a complex network can enhance the detection of communities and how it can circumvent the resolution limit and the extreme degeneracy problems associated with modularity. Our general weighting scheme takes advantage of graph-theoretic measures, and it introduces two heuristics for tuning its parameters. We use this weighting as a preprocessing step for Newman's greedy modularity optimization algorithm to improve its performance. The results of experiments with our approach on computer-generated and real-world networks confirm that the proposed approach not only mitigates the problems of modularity but also improves the modularity optimization.
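    For reference, the modularity being optimized here is Newman's Q = (1/2m) * sum_ij (A_ij - k_i * k_j / 2m) * delta(c_i, c_j). A minimal sketch of its computation on a plain adjacency matrix follows (without the authors' weighting scheme, which is not reproduced here).

```python
def modularity(adj, communities):
    """Newman modularity Q for an undirected (possibly weighted) graph.

    adj: symmetric adjacency matrix as a list of lists of edge weights
    communities: community label per node
    """
    n = len(adj)
    degree = [sum(row) for row in adj]
    two_m = sum(degree)  # 2m: total edge weight counted from both endpoints
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - degree[i] * degree[j] / two_m
    return q / two_m

# Two triangles joined by a single bridge edge (nodes 0-2 and 3-5).
adj = [[0] * 6 for _ in range(6)]
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
for u, v in edges:
    adj[u][v] = adj[v][u] = 1
good = modularity(adj, [0, 0, 0, 1, 1, 1])   # natural two-triangle split
bad = modularity(adj, [0, 1, 0, 1, 0, 1])    # arbitrary split
```

    The weighting scheme of the paper would replace the 0/1 entries of `adj` with graph-theoretically derived weights before this computation; the resolution limit arises because the null-model term k_i * k_j / 2m shrinks as the network grows, which is exactly what the reweighting counteracts.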

  16. Taser and Conducted Energy Weapons.

    PubMed

    LeClair, Thomas G; Meriano, Tony

    2015-01-01

    It is clear that conducted energy weapons (CEWs) are an increasingly prevalent law enforcement tool, adopted to address a complex and challenging problem. The potential for serious injury from a single deployment of a CEW is extremely low. The debate regarding the link between these electrical weapons and sudden in-custody death is likely to continue because their use often occurs in complex and volatile situations. Any consideration of injuries has to be put into that context. One must also consider what injuries to a subject would result if an alternative force method were used. Furthermore, the potential benefits of CEWs, including reductions in injuries to the public and to law enforcement officers, need to be considered.

  17. Rival framings: A framework for discovering how problem formulation uncertainties shape risk management trade-offs in water resources systems

    NASA Astrophysics Data System (ADS)

    Quinn, J. D.; Reed, P. M.; Giuliani, M.; Castelletti, A.

    2017-08-01

    Managing water resources systems requires coordinated operation of system infrastructure to mitigate the impacts of hydrologic extremes while balancing conflicting multisectoral demands. Traditionally, recommended management strategies are derived by optimizing system operations under a single problem framing that is assumed to accurately represent the system objectives, tacitly ignoring the myriad of effects that could arise from simplifications and mathematical assumptions made when formulating the problem. This study illustrates the benefits of a rival framings framework in which analysts instead interrogate multiple competing hypotheses of how complex water management problems should be formulated. Analyzing rival framings helps discover unintended consequences resulting from inherent biases of alternative problem formulations. We illustrate this on the monsoonal Red River basin in Vietnam by optimizing operations of the system's four largest reservoirs under several different multiobjective problem framings. In each rival framing, we specify different quantitative representations of the system's objectives related to hydropower production, agricultural water supply, and flood protection of the capital city of Hanoi. We find that some formulations result in counterintuitive behavior. In particular, policies designed to minimize expected flood damages inadvertently increase the risk of catastrophic flood events in favor of hydropower production, while min-max objectives commonly used in robust optimization provide poor representations of system tradeoffs due to their instability. This study highlights the importance of carefully formulating and evaluating alternative mathematical abstractions of stakeholder objectives describing the multisectoral water demands and risks associated with hydrologic extremes.

  18. Target matching based on multi-view tracking

    NASA Astrophysics Data System (ADS)

    Liu, Yahui; Zhou, Changsheng

    2011-01-01

    A feature matching method based on Maximally Stable Extremal Regions (MSER) and the Scale Invariant Feature Transform (SIFT) is proposed to solve the problem of matching the same target across multiple cameras. The target foreground is extracted by applying frame differencing twice, and a bounding box regarded as the target region is calculated. Extremal regions are obtained with MSER. After being fitted to elliptical regions, they are normalized to unit circles and represented with SIFT descriptors. Initial matches are accepted when the ratio of the closest to the second-closest descriptor distance is below a threshold, and outliers are eliminated with RANSAC. Experimental results indicate that the method reduces computational complexity effectively and is robust to affine transformation, rotation, scale, and illumination changes.
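    The distance-ratio matching stage described above can be sketched in a few lines. This is a minimal illustration with a brute-force nearest-neighbour search; the function name and the ratio value 0.8 are assumptions, and the MSER/SIFT extraction itself is omitted.

```python
import numpy as np

def ratio_test_matches(desc_a, desc_b, ratio=0.8):
    """For each descriptor in desc_a, find its two nearest neighbours in
    desc_b and keep the match only if the closest distance is less than
    `ratio` times the second-closest distance (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        j1, j2 = np.argsort(dists)[:2]
        if dists[j1] < ratio * dists[j2]:
            matches.append((i, j1))
    return matches

# Toy descriptors: both rows of desc_a have one unambiguous neighbour.
desc_a = np.array([[0.0, 0.0], [5.0, 5.0]])
desc_b = np.array([[0.0, 0.1], [3.0, 3.0], [5.0, 5.1]])
matches = ratio_test_matches(desc_a, desc_b)
```

    In a full pipeline, the surviving matches would then be passed to RANSAC to reject geometrically inconsistent outliers.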

  19. Weather extremes could affect agriculture

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2012-05-01

    As Earth's climate warms, agricultural producers will need to adapt. Changes, especially increases in extreme events, are already having an impact on food production, according to speakers at a 1 May session on agriculture and food security at the AGU Science Policy Conference. Christopher Field, director of the Department of Global Ecology at the Carnegie Institution for Science of Washington, D. C., pointed out the complex factors that come into play in understanding food security, including spatially varying controls and stresses, incomplete models, and the potential for threshold responses. Factors that are likely to cause problems include increasing population; increasing preference for meat, which needs more land and energy inputs to produce; climate change; and increasing use of agricultural lands for biomass energy.

  20. A Climate Information Platform for Copernicus (CLIPC): managing the data flood

    NASA Astrophysics Data System (ADS)

    Juckes, Martin; Swart, Rob; Bärring, Lars; Groot, Annemarie; Thysse, Peter; Som de Cerff, Wim; Costa, Luis; Lückenkötter, Johannes; Callaghan, Sarah; Bennett, Victoria

    2016-04-01

    The FP7 project "Climate Information Platform for Copernicus" (CLIPC) is developing a demonstration portal for the Copernicus Climate Change Service (C3S). The project confronts many problems associated with the huge diversity of underlying data, complex multi-layered uncertainties and extremely complex and evolving user requirements. The infrastructure is founded on a comprehensive approach to managing data and documentation, using global domain independent standards where possible. An extensive thesaurus of terms provides both a robust and flexible foundation for data discovery services and accessible definitions to support users. It is, of course, essential to provide information to users through an interface which reflects their expectations rather than the intricacies of abstract data models. CLIPC has reviewed user engagement activities from other collaborative European projects, conducted user polls, interviews and meetings and is now entering an evaluation phase in which users discuss new features and options in the portal design. The CLIPC portal will provide access to raw climate science data and climate impact indicators derived from that data. The portal needs the flexibility to support access to extremely large datasets as well as providing means to manipulate data and explore complex products interactively.

  1. Artificial Intelligence Methods: Choice of algorithms, their complexity, and appropriateness within the context of hydrology and water resources. (Invited)

    NASA Astrophysics Data System (ADS)

    Bastidas, L. A.; Pande, S.

    2009-12-01

    Pattern analysis deals with the automatic detection of patterns in data, and there is a variety of algorithms available for the purpose. These algorithms are commonly called Artificial Intelligence (AI) or data-driven algorithms; they have lately been applied to a variety of problems in hydrology and are becoming extremely popular. When confronting such a range of algorithms, the question arises of which one is the “best”. Some algorithms may be preferred because of their lower computational complexity; others take into account prior knowledge of the form and the amount of the data; others are chosen based on a version of the Occam’s razor principle that a simpler classifier performs better. Popper has argued, however, that Occam’s razor is without operational value because there is no clear measure or criterion for simplicity. Examples of measures that can be used for this purpose are the so-called algorithmic complexity, also known as Kolmogorov complexity or Kolmogorov (algorithmic) entropy; the Bayesian information criterion; and the Vapnik-Chervonenkis dimension. On the other hand, the No Free Lunch Theorem states that there is no best general algorithm, and that specific algorithms are superior only for specific problems. It should also be noted that the appropriate algorithm and the appropriate complexity are constrained by the finiteness of the available data and the uncertainties associated with it. Thus, there is a compromise among the complexity of the algorithm, the properties of the data, and the robustness of the predictions. We discuss the above topics and briefly review the historical development of applications, with particular emphasis on statistical learning theory (SLT), also known as machine learning (ML), of which support vector machines and relevance vector machines are the most commonly known algorithms.
    We present some applications of such algorithms to distributed hydrologic modeling, and introduce an example of how a complexity measure can be applied to choose an appropriate model in hydrologic modeling intended for studies of water resources management and its direct relation to extreme conditions or natural hazards.
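    One of the complexity measures mentioned above, the Bayesian information criterion, is easy to illustrate. Under a Gaussian-noise assumption it can be written as BIC = n·ln(RSS/n) + k·ln(n), where RSS is the residual sum of squares and k the number of fitted parameters. The sketch below (an illustration with synthetic data, not material from the talk) shows the criterion penalizing a needlessly complex polynomial fit to linear data:

```python
import numpy as np

def bic(y, y_hat, k):
    """Bayesian information criterion for a least-squares fit with
    Gaussian noise: n*ln(RSS/n) + k*ln(n). Lower is better."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)   # truly linear data

scores = {}
for degree in (1, 5):
    coeffs = np.polyfit(x, y, degree)
    # A degree-d polynomial has d+1 free parameters.
    scores[degree] = bic(y, np.polyval(coeffs, x), degree + 1)

# The degree-5 fit has smaller residuals, but the ln(n) penalty on its
# extra parameters should make the linear model score lower (better).
```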

  2. SCOUT: simultaneous time segmentation and community detection in dynamic networks

    PubMed Central

    Hulovatyy, Yuriy; Milenković, Tijana

    2016-01-01

    Many evolving complex real-world systems can be modeled via dynamic networks. An important problem in dynamic network research is community detection, which finds groups of topologically related nodes. Typically, this problem is approached by assuming either that each time point has a distinct community organization or that all time points share a single community organization. The reality likely lies between these two extremes. To find the compromise, we consider community detection in the context of the problem of segment detection, which identifies contiguous time periods with consistent network structure. Consequently, we formulate a combined problem of segment community detection (SCD), which simultaneously partitions the network into contiguous time segments with consistent community organization and finds this community organization for each segment. To solve SCD, we introduce SCOUT, an optimization framework that explicitly considers both segmentation quality and partition quality. SCOUT addresses limitations of existing methods that can be adapted to solve SCD, which consider only one of segmentation quality or partition quality. In a thorough evaluation, SCOUT outperforms the existing methods in terms of both accuracy and computational complexity. We apply SCOUT to biological network data to study human aging. PMID:27881879

  3. New Developments of Computational Fluid Dynamics and Their Applications to Practical Engineering Problems

    NASA Astrophysics Data System (ADS)

    Chen, Hudong

    2001-06-01

    There have been considerable advances in Lattice Boltzmann (LB) based methods in the last decade. By now, the fundamental concept of using the approach as an alternative tool for computational fluid dynamics (CFD) has been substantially appreciated and validated in mainstream scientific research and in industrial engineering communities. Lattice Boltzmann based methods possess several major advantages: a) less numerical dissipation due to the linear Lagrange type advection operator in the Boltzmann equation; b) local dynamic interactions suitable for highly parallel processing; c) physical handling of boundary conditions for complicated geometries and accurate control of fluxes; d) microscopically consistent modeling of thermodynamics and of interface properties in complex multiphase flows. It provides a great opportunity to apply the method to practical engineering problems encountered in a wide range of industries from automotive, aerospace to chemical, biomedical, petroleum, nuclear, and others. One of the key challenges is to extend the applicability of this alternative approach to regimes of highly turbulent flows commonly encountered in practical engineering situations involving high Reynolds numbers. Over the past ten years, significant efforts have been made on this front at Exa Corporation in developing a lattice Boltzmann based commercial CFD software, PowerFLOW. It has become a useful computational tool for the simulation of turbulent aerodynamics in practical engineering problems involving extremely complex geometries and flow situations, such as in new automotive vehicle designs worldwide. In this talk, we present an overall LB based algorithm concept along with certain key extensions in order to accurately handle turbulent flows involving extremely complex geometries. To demonstrate the accuracy of turbulent flow simulations, we provide a set of validation results for some well-known academic benchmarks. 
These include straight channels, backward-facing steps, flows over a curved hill and typical NACA airfoils at various angles of attack including prediction of stall angle. We further provide numerous engineering cases, ranging from external aerodynamics around various car bodies to internal flows involved in various industrial devices. We conclude with a discussion of certain future extensions for complex fluids.

  4. Can custom-made biomechanic shoe orthoses prevent problems in the back and lower extremities? A randomized, controlled intervention trial of 146 military conscripts.

    PubMed

    Larsen, Kristian; Weidich, Flemming; Leboeuf-Yde, Charlotte

    2002-06-01

    Shock-absorbing and biomechanic shoe orthoses are frequently used in the prevention and treatment of back and lower extremity problems. One review concludes that the former is clinically effective in relation to prevention, whereas the latter has been tested in only 1 randomized clinical trial, concluding that stress fractures could be prevented. To investigate if biomechanic shoe orthoses can prevent problems in the back and lower extremities and if reducing the number of days off-duty because of back or lower extremity problems is possible. Prospective, randomized, controlled intervention trial. One female and 145 male military conscripts (aged 18 to 24 years), representing 25% of all new conscripts in a Danish regiment. Health data were collected by questionnaires at initiation of the study and 3 months later. Custom-made biomechanic shoe orthoses to be worn in military boots were provided to all in the study group during the 3-month intervention period. No intervention was provided for the control group. Differences between the 2 groups were tested with the chi-square test, and statistical significance was accepted at P <.05. Risk ratio (RR), risk difference (ARR), numbers needed to prevent (NNP), and cost per successfully prevented case were calculated. Outcome variables included self-reported back and/or lower extremity problems; specific problems in the back or knees or shin splints, Achilles tendonitis, sprained ankle, or other problems in the lower extremity; number of subjects with at least 1 day off-duty because of back or lower extremity problems and total number of days off-duty within the first 3 months of military service because of back or lower extremity problems. 
    In an actual-use analysis, results were significantly better in the intervention group for the total number of subjects with back or lower extremity problems (RR 0.7, ARR 19%, NNP 5, cost 98 US dollars); the number of subjects with shin splints (RR 0.2, ARR 19%, NNP 5, cost 101 US dollars); and the number of off-duty days because of back or lower extremity problems (RR 0.6, ARR < 1%, NNP 200, cost 3750 US dollars). In an intention-to-treat analysis, a significant difference was found only for the number of subjects with shin splints (RR 0.3, ARR 18%, NNP 6, cost 105 US dollars), whereas a worst-case analysis revealed no significant differences between the study groups. This study shows that it may be possible to prevent certain musculoskeletal problems in the back or lower extremities among military conscripts by using custom-made biomechanic shoe orthoses. However, because care-seeking for lower extremity problems is rare, using this method of prevention in military conscripts would be too costly. We also noted that the choice of statistical approach determined the outcome.
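    The effect measures reported above follow from simple definitions: relative risk RR = risk_treated / risk_control, absolute risk reduction ARR = risk_control − risk_treated, and numbers needed to prevent NNP = 1/ARR. A minimal sketch, using hypothetical counts chosen to mimic the reported magnitudes (these are not the study's raw data):

```python
def effect_measures(events_t, n_t, events_c, n_c):
    """Relative risk, absolute risk reduction, and numbers needed to
    prevent, computed from event counts in treated and control groups."""
    risk_t = events_t / n_t
    risk_c = events_c / n_c
    rr = risk_t / risk_c          # relative risk
    arr = risk_c - risk_t         # absolute risk reduction
    nnp = 1.0 / arr               # numbers needed to prevent one case
    return rr, arr, nnp

# Hypothetical counts giving roughly RR 0.7, ARR 19%, NNP 5:
rr, arr, nnp = effect_measures(events_t=30, n_t=73, events_c=44, n_c=73)
```

    Dividing the cost of the intervention per subject by ARR (equivalently, multiplying it by NNP) gives the reported cost per successfully prevented case.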

  5. Dry seasons identified in oak tree-ring chronology in the Czech Lands over the last millennium

    NASA Astrophysics Data System (ADS)

    Dobrovolny, Petr; Brazdil, Rudolf; Büntgen, Ulf; Rybnicek, Michal; Kolar, Tomas; Reznickova, Ladislava; Valasek, Hubert; Kotyza, Oldrich

    2015-04-01

    There is growing evidence of an amplification of hydrological regimes as a consequence of rising temperatures, increased evaporation, and changes in circulation patterns. These processes may be responsible for a higher probability of hydroclimatic extremes at the regional scale. Extreme events such as floods or droughts are, by definition, rare, and long-term proxy archives can be analysed for a better understanding of possible changes in the frequency and intensity of their occurrence. Recently, several tree-ring width chronologies have been compiled from hardwood species growing at lowland sites, and their analysis has shown that they are moisture-sensitive and suitable for hydroclimate reconstructions. Here, we introduce a new oak (Quercus sp.) ring width (RW) dataset for the Czech Republic spanning the last 1250 years. We explain the process of oak chronology standardization, which was based on several only slightly different de-trending techniques, and the subsequent chronology development steps. We hypothesize that the most severe RW increment reductions (negative extremes) reflect extremely dry spring-summer conditions. Negative extremes were assigned to years in which transformed oak RWs fell below minus 1.5 standard deviations. To verify our hypothesis, we compare typical climatic conditions in negative extreme years with the climatology of the 1961-1990 reference period. The comparison was done for various instrumental measurements (1805-2012), existing proxy reconstructions (1500-1804), and documentary evidence from historical archives (before 1500). We found that years with negative extremes are characterized by distinctly above-average spring (MAM) and summer (JJA) air temperatures and below-average precipitation amounts. 
    The typical sea-level pressure distribution in those years shows a positive pressure anomaly over the British Isles and the North Sea, a pattern that synoptically corresponds to a blocking anticyclone bringing warm air from the southwest to Central Europe, together with low precipitation totals and a higher probability of drought. Our results provide a consistent physical explanation of extremely dry seasons in Central Europe. However, direct comparisons of individual RW extreme seasons with existing documentary evidence show the complexity of the problem, as some extremes identified in the oak RW chronology were not confirmed in documentary archives and vice versa. We discuss possible causes of such differences, related to the fact that various proxies may fail to record the real intensity or duration of extreme events, e.g., due to a non-linear response of proxy data to climate drivers or to a shift in seasonality.

  6. The development of personality extremity from childhood to adolescence: relations to internalizing and externalizing problems.

    PubMed

    Van den Akker, Alithe L; Prinzie, Peter; Deković, Maja; De Haan, Amaranta D; Asscher, Jessica J; Widiger, Thomas

    2013-12-01

    This study investigated the development of personality extremity (deviation from the average midpoint of all 5 personality dimensions together) across childhood and adolescence, as well as relations between personality extremity and adjustment problems. For 598 children (mean age at Time 1 = 7.5 years), mothers and fathers reported the Big Five personality dimensions 4 times across 8 years. Children's vector length in a 5-dimensional configuration of the Big Five dimensions represented personality extremity. Mothers, fathers, and teachers reported children's internalizing and externalizing problems at the 1st and final measurements. In a cohort-sequential design, we modeled personality extremity in children and adolescents from ages 6 to 17 years. Growth mixture modeling revealed a similar solution for both mother and father reports: a large group with relatively short vectors that were stable over time (mother reports: 80.3%; father reports: 84.7%) and 2 smaller groups with relatively long vectors (i.e., extreme personality configurations). One group started out relatively extreme and decreased over time (mother reports: 13.2%; father reports: 10.4%), whereas the other group started out only slightly higher than the short-vector group but increased across time (mother reports: 6.5%; father reports: 4.9%). Children who belonged to the increasingly extreme class experienced more internalizing and externalizing problems in late adolescence, controlling for previous levels of adjustment problems and the Big Five personality dimensions. Personality extremity may be important to consider when identifying children at risk for adjustment problems. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  7. Handling Emergency Management in [an] Object Oriented Modeling Environment

    NASA Technical Reports Server (NTRS)

    Tokgoz, Berna Eren; Cakir, Volkan; Gheorghe, Adrian V.

    2010-01-01

    It has been understood that protecting a nation from extreme disasters is a challenging task. The impacts of extreme disasters on a nation's critical infrastructures, economy, and society could be devastating. A protection plan by itself is not sufficient when a disaster strikes. Hence, there is a need for a holistic approach to establishing more resilient infrastructures that can withstand extreme disasters. A resilient infrastructure can be defined as a system or facility that is able to withstand damage but, if affected, can be readily and cost-effectively restored. The key to establishing resilient infrastructures is to combine existing protection plans with comprehensive preparedness actions to respond, recover, and restore as quickly as possible, and to minimize extreme disaster impacts. Although national organizations will respond to a disaster, extreme disasters must be handled mostly by local emergency management departments. Since emergency management departments have to deal with complex systems, they must have a manageable plan and efficient organizational structures to coordinate all these systems. A strong organizational structure is key to responding quickly before and during disasters, and to recovering quickly after them. In this study, the entire emergency management function is viewed as an enterprise and modelled through an enterprise management approach. Managing an enterprise or a large complex system is a very challenging task, and it is critical for an enterprise to respond to challenges in a timely manner with quick decision making. This study addresses the problem of handling emergency management at the regional level in an object-oriented modelling environment developed with the TopEase software. The Emergency Operation Plan of the City of Hampton, Virginia, has been incorporated into TopEase for analysis. The methodology used in this study has been supported by a case study on critical infrastructure resiliency in Hampton Roads.

  8. Statistical analysis and ANN modeling for predicting hydrological extremes under climate change scenarios: the example of a small Mediterranean agro-watershed.

    PubMed

    Kourgialas, Nektarios N; Dokou, Zoi; Karatzas, George P

    2015-05-01

    The purpose of this study was to create a modeling management tool for the simulation of extreme flow events under current and future climatic conditions. This tool is a combination of different components and can be applied in complex hydrogeological river basins, where frequent flood and drought phenomena occur. The first component is the statistical analysis of the available hydro-meteorological data. Specifically, principal components analysis was performed in order to quantify the importance of the hydro-meteorological parameters that affect the generation of extreme events. The second component is a prediction-forecasting artificial neural network (ANN) model that simulates, accurately and efficiently, river flow on an hourly basis. This model is based on a methodology that attempts to resolve a very difficult problem related to the accurate estimation of extreme flows. For this purpose, the available measurements (5 years of hourly data) were divided in two subsets: one for the dry and one for the wet periods of the hydrological year. This way, two ANNs were created, trained, tested and validated for a complex Mediterranean river basin in Crete, Greece. As part of the second management component a statistical downscaling tool was used for the creation of meteorological data according to the higher and lower emission climate change scenarios A2 and B1. These data are used as input in the ANN for the forecasting of river flow for the next two decades. The final component is the application of a meteorological index on the measured and forecasted precipitation and flow data, in order to assess the severity and duration of extreme events. Copyright © 2015 Elsevier Ltd. All rights reserved.
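    The dry/wet split described above, training one model per hydrological regime, can be sketched in miniature. The example below is an illustration only, with synthetic data and plain least squares standing in for the ANNs: it partitions records by regime, fits a separate linear rainfall-runoff model to each, and recovers the distinct regime coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly records: precipitation driving flow, with a different
# rainfall-runoff coefficient in the wet (3.0) and dry (0.5) regimes.
precip = rng.uniform(0.0, 10.0, 400)
regime = np.array(["wet"] * 200 + ["dry"] * 200)
flow = np.where(regime == "wet", 3.0, 0.5) * precip + rng.normal(0.0, 0.2, 400)

models = {}
for r in ("wet", "dry"):
    mask = regime == r
    X = precip[mask].reshape(-1, 1)
    # One least-squares model per regime, mirroring the two-ANN design.
    coef, *_ = np.linalg.lstsq(X, flow[mask], rcond=None)
    models[r] = float(coef[0])
```

    Fitting a single model to the pooled data would instead average the two regimes, which is the estimation problem the dry/wet split is designed to avoid.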

  9. Supercritical water oxidation for the destruction of toxic organic wastewaters: a review.

    PubMed

    Veriansyah, Bambang; Kim, Jae-Duck

    2007-01-01

    The destruction of toxic organic wastewaters from munitions demilitarization and complex industrial chemicals clearly becomes an overwhelming problem if left to conventional treatment processes. Two options, incineration and supercritical water oxidation (SCWO), exist for the complete destruction of toxic organic wastewaters. Incineration has associated problems such as very high cost and public resentment; SCWO, on the other hand, has proved to be a very promising method for treating many different wastewaters, with extremely efficient organic waste destruction (99.99%) and none of the emissions associated with incineration. In this review, the concepts of SCWO, results and present perspectives of its application, and the industrial status of SCWO are critically examined and discussed.

  10. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical to reducing future losses from major earthquakes.

  11. JPL Counterfeit Parts Avoidance

    NASA Technical Reports Server (NTRS)

    Risse, Lori

    2012-01-01

    SPACE ARCHITECTURE / ENGINEERING: Space brings an extreme test bed for technologies and concepts as well as for procedures and processes. Design and construction (engineering) always go together, especially with complex systems. Requirements (objectives) are crucial; more important than the answers are the questions, requirements, tools, techniques, and processes. Different environments force architects and engineers to think outside the box; for instance, there might not be gravity forces. Complex architectural problems have common roots in space and on Earth. Let us bring space down to Earth so we can keep sending mankind to the stars from a better world. Have fun being architects and engineers! This time is amazing and historical: we are changing the way we inhabit the solar system!

  12. Piping Connector

    NASA Technical Reports Server (NTRS)

    1993-01-01

    A complex of high pressure piping at Stennis Space Center carries rocket propellants and other fluids/gases through the Center's Component Test Facility. Conventional clamped connectors tend to leak when propellant lines are chilled to extremely low temperatures. Reflange, Inc. customized an existing piping connector to include a secondary seal more tolerant of severe thermal gradients for Stennis. The T-Con connector solved the problem, and the company is now marketing a commercial version that permits testing, monitoring or collecting any emissions that may escape the primary seal during severe thermal transition.

  13. Controlling extreme events on complex networks

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-08-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often have catastrophic consequences. Developing effective strategies to control extreme events is of fundamental importance and practical interest. Using transportation dynamics on complex networks as a prototypical setting, we find that making the network "mobile" can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, in which there exists an optimal degree of mobility for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding for areas such as cybersecurity are discussed.

  14. A Robust Epoxy Resins @ Stearic Acid-Mg(OH)2 Micronanosheet Superhydrophobic Omnipotent Protective Coating for Real-Life Applications.

    PubMed

    Si, Yifan; Guo, Zhiguang; Liu, Weimin

    2016-06-29

    Superhydrophobic coatings have extremely high application value and practicability. However, problems such as weak mechanical strength, the need for expensive toxic reagents, and complex preparation processes are hard to avoid, and they have long impeded the real-life application of superhydrophobic coatings. Here, we demonstrate an omnipotent epoxy resins @ stearic acid-Mg(OH)2 superhydrophobic coating prepared via a simple antideposition route and a one-step superhydrophobization process. The whole preparation process is facile, and no expensive toxic reagents are needed. This omnipotent coating can be applied to any solid substrate with great waterproof ability, excellent mechanical stability, and chemical durability, and it can be stored in a realistic environment for more than 1 month. More significantly, this superhydrophobic coating also has four protective abilities, antifouling, anticorrosion, anti-icing, and flame retardancy, to cope with a variety of possible extreme natural environments. Therefore, this omnipotent epoxy resins @ stearic acid-Mg(OH)2 superhydrophobic coating not only satisfies real-life needs but also has great application potential in many respects.

  15. Extreme values and the level-crossing problem: An application to the Feller process

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume

    2014-04-01

    We review the question of the extreme values attained by a random process. We relate it to level crossings to one boundary (first-passage problems) as well as to two boundaries (escape problems). The extremes studied are the maximum, the minimum, the maximum absolute value, and the range or span. We specialize in diffusion processes and present detailed results for the Wiener and Feller processes.
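    For the Wiener process the distribution of the maximum is classical: by the reflection principle, P(max_{0≤s≤T} W_s ≤ a) = 2Φ(a/√T) − 1 for a ≥ 0, where Φ is the standard normal CDF. The Monte-Carlo sketch below (an illustration, not material from the paper) checks this numerically:

```python
import math
import numpy as np

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

rng = np.random.default_rng(42)
T, a = 1.0, 1.0
n_paths, n_steps = 5000, 1000
dt = T / n_steps

# Simulate discretized Wiener paths and record each path's running maximum.
increments = rng.normal(0.0, math.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)
maxima = np.maximum(paths.max(axis=1), 0.0)   # W_0 = 0 is part of each path

p_empirical = float(np.mean(maxima <= a))
p_theory = 2.0 * phi(a / math.sqrt(T)) - 1.0  # reflection-principle value
```

    The discretized maximum slightly underestimates the continuous one, so the empirical probability sits a little above the theoretical value; the bias shrinks as the time step is refined.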

  16. Development of an object-oriented finite element program: application to metal-forming and impact simulations

    NASA Astrophysics Data System (ADS)

    Pantale, O.; Caperaa, S.; Rakotomalala, R.

    2004-07-01

    During the last 50 years, the development of better numerical methods and more powerful computers has been a major enterprise for the scientific community. At the same time, the finite element method has become a widely used tool for researchers and engineers. Recent advances in computational software have made it possible to solve more physical and complex problems such as coupled problems, nonlinearities, and high-strain and high-strain-rate problems. In this field, an accurate analysis of the large-deformation inelastic problems occurring in metal-forming or impact simulations is extremely important as a consequence of the high amount of plastic flow. In this presentation, the object-oriented implementation, using the C++ language, of an explicit finite element code called DynELA is presented. Object-oriented programming (OOP) leads to better-structured codes for the finite element method and facilitates their development, maintainability, and expandability. The most significant advantage of OOP is in the modeling of complex physical systems, such as deformation processing, where the overall complex problem is partitioned into individual sub-problems based on physical, mathematical, or geometric reasoning. We first focus on the advantages of OOP for the development of scientific programs. Specific aspects of OOP, such as the inheritance mechanism, operator overloading, and the use of template classes, are detailed. We then present the approach used for the development of our finite element code through the presentation of the kinematics, conservation and constitutive laws, and their respective implementation in C++. Finally, the efficiency and accuracy of our finite element program are investigated using a number of benchmark tests relative to metal forming and impact simulations.

  17. XSUMMER - Transcendental functions and symbolic summation in FORM

    NASA Astrophysics Data System (ADS)

    Moch, S.; Uwer, P.

    2006-05-01

    Harmonic sums and their generalizations are extremely useful in the evaluation of higher-order perturbative corrections in quantum field theory. Of particular interest have been the so-called nested sums, where the harmonic sums and their generalizations appear as building blocks, originating, for example, from the expansion of generalized hypergeometric functions around integer values of the parameters. In this paper we discuss the implementation of several algorithms to solve these sums by algebraic means, using the computer algebra system FORM.

    Program summary
    Title of program: XSUMMER
    Catalogue identifier: ADXQ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXQ_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    License: GNU Public License and FORM License
    Computers: all
    Operating system: all
    Program language: FORM
    Memory required to execute: depends on the complexity of the problem; at least 64 MB RAM recommended
    No. of lines in distributed program, including test data, etc.: 9854
    No. of bytes in distributed program, including test data, etc.: 126 551
    Distribution format: tar.gz
    Other programs called: none
    External files needed: none
    Nature of the physical problem: systematic expansion of higher transcendental functions in a small parameter. The expansions arise in the calculation of loop integrals in perturbative quantum field theory.
    Method of solution: algebraic manipulation of nested sums.
    Restrictions on complexity of the problem: usually limited only by the available disk space.
    Typical running time: dependent on the complexity of the problem.
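
    The harmonic sums and nested sums discussed above follow a simple recursive definition, sketched here in exact rational arithmetic. This is an independent illustration of the objects being summed (using one standard convention for nested sums), not XSUMMER's FORM implementation.

```python
from fractions import Fraction

def harmonic(m, n):
    """Harmonic sum S(m; n) = sum_{k=1}^{n} 1/k^m, computed exactly."""
    return sum((Fraction(1, k ** m) for k in range(1, n + 1)), Fraction(0))

def nested(m1, m2, n):
    """Nested sum S(m1, m2; n) = sum_{k=1}^{n} S(m2; k) / k^m1."""
    return sum((harmonic(m2, k) / k ** m1 for k in range(1, n + 1)), Fraction(0))

print(harmonic(1, 3))   # 1 + 1/2 + 1/3 = 11/6
print(nested(1, 1, 2))  # S(1;1)/1 + S(1;2)/2 = 1 + 3/4 = 7/4
```

    Working with exact rationals mirrors what a symbolic system does algebraically, avoiding the floating-point error that would accumulate in deeply nested sums.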

  18. Robotically facilitated virtual rehabilitation of arm transport integrated with finger movement in persons with hemiparesis.

    PubMed

    Merians, Alma S; Fluet, Gerard G; Qiu, Qinyin; Saleh, Soha; Lafond, Ian; Davidow, Amy; Adamovich, Sergei V

    2011-05-16

    Recovery of upper extremity function is particularly recalcitrant to successful rehabilitation. Robotic-assisted arm training devices integrated with virtual targets or complex virtual reality gaming simulations are being developed to deal with this problem. Neural control mechanisms indicate that reaching and hand-object manipulation are interdependent, suggesting that training on tasks requiring coordinated effort of both the upper arm and hand may be a more effective method for improving recovery of real world function. However, most robotic therapies have focused on training the proximal, rather than distal effectors of the upper extremity. This paper describes the effects of robotically-assisted, integrated upper extremity training. Twelve subjects post-stroke were trained for eight days on four upper extremity gaming simulations using adaptive robots during 2-3 hour sessions. The subjects demonstrated improved proximal stability, smoothness and efficiency of the movement path. This was in concert with improvement in the distal kinematic measures of finger individuation and improved speed. Importantly, these changes were accompanied by a robust 16-second decrease in overall time in the Wolf Motor Function Test and a 24-second decrease in the Jebsen Test of Hand Function. Complex gaming simulations interfaced with adaptive robots requiring integrated control of shoulder, elbow, forearm, wrist and finger movements appear to have a substantial effect on improving hemiparetic hand function. We believe that the magnitude of the changes and the stability of the patient's function prior to training, along with maintenance of several aspects of the gains demonstrated at retention make a compelling argument for this approach to training.

  19. Robotically facilitated virtual rehabilitation of arm transport integrated with finger movement in persons with hemiparesis

    PubMed Central

    2011-01-01

    Background Recovery of upper extremity function is particularly recalcitrant to successful rehabilitation. Robotic-assisted arm training devices integrated with virtual targets or complex virtual reality gaming simulations are being developed to deal with this problem. Neural control mechanisms indicate that reaching and hand-object manipulation are interdependent, suggesting that training on tasks requiring coordinated effort of both the upper arm and hand may be a more effective method for improving recovery of real world function. However, most robotic therapies have focused on training the proximal, rather than distal effectors of the upper extremity. This paper describes the effects of robotically-assisted, integrated upper extremity training. Methods Twelve subjects post-stroke were trained for eight days on four upper extremity gaming simulations using adaptive robots during 2-3 hour sessions. Results The subjects demonstrated improved proximal stability, smoothness and efficiency of the movement path. This was in concert with improvement in the distal kinematic measures of finger individuation and improved speed. Importantly, these changes were accompanied by a robust 16-second decrease in overall time in the Wolf Motor Function Test and a 24-second decrease in the Jebsen Test of Hand Function. Conclusions Complex gaming simulations interfaced with adaptive robots requiring integrated control of shoulder, elbow, forearm, wrist and finger movements appear to have a substantial effect on improving hemiparetic hand function. We believe that the magnitude of the changes and the stability of the patient's function prior to training, along with maintenance of several aspects of the gains demonstrated at retention make a compelling argument for this approach to training. PMID:21575185

  20. The problem of extreme events in paired-watershed studies

    Treesearch

    James W. Hornbeck

    1973-01-01

    In paired-watershed studies, the occurrence of an extreme event during the after-treatment period presents a problem: the effects of treatment must be determined by using greatly extrapolated regression statistics. Several steps are presented to help ensure careful handling of extreme events during the analysis and reporting of research results.

  1. Final Report of the Project "From the finite element method to the virtual element method"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manzini, Gianmarco; Gyrya, Vitaliy

    The Finite Element Method (FEM) is a powerful numerical tool that is used in a large number of engineering applications. The FEM is constructed on triangular/tetrahedral and quadrilateral/hexahedral meshes. Extending the FEM to general polygonal/polyhedral meshes in a straightforward way turns out to be extremely difficult and leads to very complex and computationally expensive schemes. The reason for this failure is that the construction of the basis functions on elements with a very general shape is a non-trivial and complex task. In this project we developed a new family of numerical methods, dubbed the Virtual Element Method (VEM), for the numerical approximation of partial differential equations (PDEs) of elliptic type, suitable for polygonal and polyhedral unstructured meshes. We successfully formulated, implemented and tested these methods and studied both theoretically and numerically their stability, robustness and accuracy for diffusion problems, convection-reaction-diffusion problems, the Stokes equations and the biharmonic equations.

  2. Applications of artificial intelligence to mission planning

    NASA Technical Reports Server (NTRS)

    Ford, Donnie R.; Rogers, John S.; Floyd, Stephen A.

    1990-01-01

    The scheduling problem facing NASA-Marshall mission planning is extremely difficult for several reasons. The most critical factor is the computational complexity involved in developing a schedule. The size of the search space is large along some dimensions and infinite along others. Because of this and other difficulties, many conventional operations research techniques are infeasible or inadequate to solve these problems by themselves. The purpose here is therefore to examine various artificial intelligence (AI) techniques that can assist or replace conventional techniques. The specific tasks performed were as follows: (1) to identify mission planning applications for object-oriented and rule-based programming; (2) to investigate interfacing AI-dedicated hardware (Lisp machines) to VAX hardware; (3) to demonstrate how Lisp may be called from within FORTRAN programs; (4) to investigate and report on programming techniques used in some commercial AI shells, such as the Knowledge Engineering Environment (KEE); and (5) to study and report on algorithmic methods to reduce complexity as related to AI techniques.

  3. Estimation of in-situ bioremediation system cost using a hybrid Extreme Learning Machine (ELM)-particle swarm optimization approach

    NASA Astrophysics Data System (ADS)

    Yadav, Basant; Ch, Sudheer; Mathur, Shashi; Adamowski, Jan

    2016-12-01

    In-situ bioremediation is the most common groundwater remediation procedure used for treating organically contaminated sites. A simulation-optimization approach, which incorporates a simulation model for groundwater flow and transport processes within an optimization program, can help engineers design a remediation system that best satisfies management objectives as well as regulatory constraints. In-situ bioremediation is a highly complex, non-linear process, and modelling such a complex system requires significant computational effort. Soft computing techniques have a flexible mathematical structure that can generalize complex nonlinear processes. In in-situ bioremediation management, a physically-based model is used for the simulation, and the simulated data are utilized by the optimization model to optimize the remediation cost. Repeatedly calling the simulator to satisfy the constraints is extremely tedious and time-consuming, so there is a need for a simulator that reduces this computational burden. This study presents a simulation-optimization approach to achieve an accurate and cost-effective in-situ bioremediation system design for groundwater contaminated with BTEX (benzene, toluene, ethylbenzene, and xylenes) compounds. In this study, the Extreme Learning Machine (ELM) is used as a proxy simulator to replace BIOPLUME III for the simulation. The selection of ELM was based on a comparative analysis with an Artificial Neural Network (ANN) and a Support Vector Machine (SVM), both of which were successfully used in previous studies of in-situ bioremediation system design. Further, a single-objective optimization problem is solved by a coupled Extreme Learning Machine (ELM)-Particle Swarm Optimization (PSO) technique to achieve the minimum cost for the in-situ bioremediation system design. The results indicate that ELM is a faster and more accurate proxy simulator than ANN and SVM. The total cost obtained by the ELM-PSO approach is held to a minimum while successfully satisfying all the regulatory constraints of the contaminated site.
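
    The surrogate idea behind the ELM can be illustrated with a minimal single-hidden-layer sketch in NumPy: the hidden layer is drawn at random and frozen, so training reduces to one linear least-squares solve, which is why an ELM proxy is so much cheaper to train than a back-propagated network. The toy target, layer sizes and data below are invented for the example and do not come from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: a smooth 1-D target standing in for the physical simulator.
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X[:, 0])

# ELM: random, fixed hidden layer; only the output weights are trained.
n_hidden = 50
W = rng.normal(scale=3.0, size=(1, n_hidden))  # random input weights (never trained)
b = rng.normal(size=n_hidden)                  # random biases
H = np.tanh(X @ W + b)                         # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # least-squares output weights

y_hat = H @ beta
rms = float(np.sqrt(np.mean((y_hat - y) ** 2)))
print(rms)  # small training residual
```

    In a simulation-optimization loop, an optimizer such as PSO would then query `H(x) @ beta` in place of the expensive simulator.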

  4. Air pollution engineering

    NASA Astrophysics Data System (ADS)

    Maduna, Karolina; Tomašić, Vesna

    2017-11-01

    Air pollution is an environmental and social problem which leads to a multitude of adverse effects on human health and the standard of human life, the state of ecosystems, and global climate change. Air pollutants are emitted from natural, but mostly from anthropogenic, sources and may be transported over long distances. Some air pollutants are extremely stable in the atmosphere and may accumulate in the environment and in the food chain, affecting human beings, animals and natural biodiversity. Air pollution is thus a complex problem that poses multiple challenges for the management and abatement of pollutant emissions. An effective approach to the problems of air pollution requires a good understanding of the sources that cause it, knowledge of air quality status and future trends, and knowledge of its impact on humans and ecosystems. This chapter deals with the complexities of air pollution and presents an overview of different technical processes and equipment for air pollution control, as well as the basic principles of their operation. The problems of air protection, like those of the protection of other ecosystems, can be solved only by the coordinated endeavors of various scientific and engineering disciplines, such as chemistry, physics, biology, medicine, chemical engineering and the social sciences. The most important engineering contribution is focused on the development, design and operation of equipment for the abatement of harmful emissions into the environment.

  5. A Noise Removal Method for Uniform Circular Arrays in Complex Underwater Noise Environments with Low SNR

    PubMed Central

    Xia, Huijun; Yang, Kunde; Ma, Yuanliang; Wang, Yong; Liu, Yaxiong

    2017-01-01

    Generally, many beamforming methods are derived under the assumption of white noise. In practice, actual underwater ambient noise is complex, and as a result the noise removal capacity of a beamforming method may deteriorate considerably. Beamforming performance degrades further in underwater environments with extremely low signal-to-noise ratio (SNR). To tackle these problems, a noise removal method for uniform circular arrays (UCAs) is proposed to remove the received noise and improve the SNR in complex noise environments with low SNR. First, symmetrical noise sources are defined and their spatial correlation is calculated. Then, based on these results, the noise covariance matrix is decomposed into symmetrical and asymmetrical components. Analysis indicates that the symmetrical component affects only the real part of the noise covariance matrix. Consequently, delay-and-sum (DAS) beamforming is performed using the imaginary part of the covariance matrix to remove the symmetrical component. However, the noise removal method causes two problems. First, it produces a false target. Second, it seriously suppresses the signal when the signal arrives from certain directions. To solve the first problem, two methods to reconstruct the signal covariance matrix are presented: one based on estimation of the signal variance and one based on a constrained optimization algorithm. To solve the second problem, the array configuration and working frequency can be chosen appropriately. Theoretical analysis and experimental results demonstrate that the proposed methods are particularly effective in complex noise environments with low SNR. The proposed method can be extended to any array. PMID:28598386
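
    For orientation, here is a minimal conventional delay-and-sum beamformer for a uniform circular array in NumPy. This sketch shows only the standard DAS baseline that the paper modifies (the paper's contribution is to replace the full covariance matrix with its imaginary part, which is not implemented here); the array geometry, source direction and SNR are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Uniform circular array: M sensors on a circle of radius r (in wavelengths).
M, r = 8, 0.5
phi = 2 * np.pi * np.arange(M) / M

def steering(theta):
    # Narrowband plane-wave steering vector for azimuth theta.
    return np.exp(2j * np.pi * r * np.cos(theta - phi))

# Simulated snapshots: one source at 40 degrees plus strong white noise (low SNR).
theta0 = np.deg2rad(40.0)
T = 2000
s = rng.normal(size=T) + 1j * rng.normal(size=T)
n = 2.0 * (rng.normal(size=(M, T)) + 1j * rng.normal(size=(M, T)))
x = np.outer(steering(theta0), s) + n

R = x @ x.conj().T / T                     # sample covariance matrix

# Conventional delay-and-sum spatial spectrum over a 1-degree azimuth grid.
grid = np.deg2rad(np.arange(0, 360, 1.0))
P = np.array([np.real(steering(t).conj() @ R @ steering(t)) for t in grid])
est = float(np.rad2deg(grid[np.argmax(P)]))
print(est)  # peak near the true 40-degree bearing
```

    The paper's variant would feed a modified covariance (its imaginary part, with the signal covariance later reconstructed) into the same spectrum computation.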

  6. The nonequilibrium quantum many-body problem as a paradigm for extreme data science

    NASA Astrophysics Data System (ADS)

    Freericks, J. K.; Nikolić, B. K.; Frieder, O.

    2014-12-01

    Generating big data pervades much of physics. But some problems, which we call extreme data problems, are too large to be treated within big data science. The nonequilibrium quantum many-body problem on a lattice is just such a problem, where the Hilbert space grows exponentially with system size and rapidly becomes too large to fit on any computer (and can be effectively thought of as an infinite-sized data set). Nevertheless, much progress has been made with computational methods on this problem, which serve as a paradigm for how one can approach and attack extreme data problems. In addition, viewing these physics problems from a computer-science perspective leads to new approaches that can be tried to solve more accurately and for longer times. We review a number of these different ideas here.

  7. Team approach to treatment of the posttraumatic stiff hand. A case report.

    PubMed

    Morey, K R; Watson, A H

    1986-02-01

    Posttraumatic hand stiffness is a common but complex problem treated in many general clinics and in hand treatment centers. Although much information is available regarding various treatment procedures, the use of a team approach to evaluate and treat hand stiffness has not been examined thoroughly in the Journal. The problems of the patient with a stiff hand include both physical and psychological components that must be addressed in a structured manner. The clinical picture of posttraumatic hand stiffness involves edema, immobility, pain, and the inability to incorporate the affected extremity into daily activities. In this case report, we review the purpose and philosophy of the team approach to hand therapy and the clarification of responsibilities for physical therapy and occupational therapy intervention.

  8. Neighboring extremals of dynamic optimization problems with path equality constraints

    NASA Technical Reports Server (NTRS)

    Lee, A. Y.

    1988-01-01

    Neighboring extremals of dynamic optimization problems with path equality constraints and with an unknown parameter vector are considered in this paper. With some simplifications, the problem is reduced to solving a linear, time-varying two-point boundary-value problem with integral path equality constraints. A modified backward sweep method is used to solve this problem. Two example problems are solved to illustrate the validity and usefulness of the solution technique.

  9. Massively parallel support for a case-based planning system

    NASA Technical Reports Server (NTRS)

    Kettler, Brian P.; Hendler, James A.; Anderson, William A.

    1993-01-01

    Case-based planning (CBP), a kind of case-based reasoning, is a technique in which previously generated plans (cases) are stored in memory and can be reused to solve similar planning problems in the future. CBP can save considerable time over generative planning, in which a new plan is produced from scratch. CBP thus offers a potential (heuristic) mechanism for handling intractable problems. One drawback of CBP systems has been the need for a highly structured memory to reduce retrieval times. This approach requires significant domain engineering and complex memory indexing schemes to make these planners efficient. In contrast, our CBP system, CaPER, uses a massively parallel frame-based AI language (PARKA) and can do extremely fast retrieval of complex cases from a large, unindexed memory. The ability to do fast, frequent retrievals has many advantages: indexing is unnecessary; very large case bases can be used; memory can be probed in numerous alternate ways; and queries can be made at several levels, allowing more specific retrieval of stored plans that better fit the target problem with less adaptation. In this paper we describe CaPER's case retrieval techniques and some experimental results showing its good performance, even on large case bases.
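
    The flavor of unindexed case retrieval can be sketched in a few lines: score every stored case against the target problem and keep the best match. This serial brute-force version only mimics, on invented toy data, what CaPER does in parallel over frames; the case features and plans below are hypothetical.

```python
# Each case pairs a set of feature assertions with a stored plan. Retrieval
# scans the whole (unindexed) case memory and returns the best-matching plan.
case_base = [
    {"features": {"goal": "deliver", "cargo": "fuel", "terrain": "road"},
     "plan": "truck-route-A"},
    {"features": {"goal": "deliver", "cargo": "fuel", "terrain": "river"},
     "plan": "barge-route-B"},
    {"features": {"goal": "survey", "cargo": "none", "terrain": "road"},
     "plan": "scout-loop-C"},
]

def retrieve(target, cases):
    """Return the stored plan whose features share the most attribute-value
    pairs with the target problem (exhaustive scan, no index)."""
    def score(case):
        return sum(1 for k, v in target.items()
                   if case["features"].get(k) == v)
    return max(cases, key=score)["plan"]

target = {"goal": "deliver", "cargo": "fuel", "terrain": "river"}
print(retrieve(target, case_base))  # barge-route-B
```

    With massive parallelism, this exhaustive scan becomes cheap enough that indexing (and the domain engineering it requires) is unnecessary, which is the paper's central point.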

  10. Methods and compositions for efficient nucleic acid sequencing

    DOEpatents

    Drmanac, Radoje

    2006-07-04

    Disclosed are novel methods and compositions for rapid and highly efficient nucleic acid sequencing based upon hybridization with two sets of small oligonucleotide probes of known sequences. Extremely large nucleic acid molecules, including chromosomes and non-amplified RNA, may be sequenced without prior cloning or subcloning steps. The methods of the invention also solve various current problems associated with sequencing technology such as, for example, high noise to signal ratios and difficult discrimination, attaching many nucleic acid fragments to a surface, preparing many, longer or more complex probes and labelling more species.

  11. Methods and compositions for efficient nucleic acid sequencing

    DOEpatents

    Drmanac, Radoje

    2002-01-01

    Disclosed are novel methods and compositions for rapid and highly efficient nucleic acid sequencing based upon hybridization with two sets of small oligonucleotide probes of known sequences. Extremely large nucleic acid molecules, including chromosomes and non-amplified RNA, may be sequenced without prior cloning or subcloning steps. The methods of the invention also solve various current problems associated with sequencing technology such as, for example, high noise to signal ratios and difficult discrimination, attaching many nucleic acid fragments to a surface, preparing many, longer or more complex probes and labelling more species.

  12. Astroinformatics, data mining and the future of astronomical research

    NASA Astrophysics Data System (ADS)

    Brescia, Massimo; Longo, Giuseppe

    2013-08-01

    Astronomy, like many other scientific disciplines, is facing a true data deluge which is bound to change both the praxis and the methodology of everyday research work. The emerging field of astroinformatics, while on the one hand crucial for facing these technological challenges, on the other opens exciting new perspectives for astronomical discovery through the implementation of advanced data mining procedures. The complexity of astronomical data and the variety of scientific problems, however, call for innovative algorithms and methods as well as for an extreme usage of ICT technologies.

  13. On the Analysis of Output Information of S-tree Method

    NASA Astrophysics Data System (ADS)

    Bekaryan, Karen M.; Melkonyan, Anahit A.

    2007-08-01

    One of the most popular and effective methods for analyzing the hierarchical structure of N-body gravitating systems is the method of S-tree diagrams. Despite its many interesting features, the method is unfortunately not free of disadvantages, the most important of which is the extreme complexity of analyzing its output information. A number of methods have been suggested to solve this problem. In our view, the most effective approach is to apply all of these methods simultaneously, which yields a more complete and objective picture of the final distribution.

  14. RICIS research

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.

    1987-01-01

    The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large across the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling-direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.

  15. Variations of trends of indicators describing complex systems: Change of scaling precursory to extreme events

    NASA Astrophysics Data System (ADS)

    Keilis-Borok, V. I.; Soloviev, A. A.

    2010-09-01

    Socioeconomic and natural complex systems persistently generate extreme events, also known as disasters, crises, or critical transitions. Here we analyze patterns of background activity preceding extreme events in four complex systems: economic recessions, surges in homicides in a megacity, magnetic storms, and strong earthquakes. We use as a starting point the indicators describing the system's behavior and identify changes in an indicator's trend. Those changes constitute our background events (BEs). We demonstrate a premonitory pattern common to all four systems considered: relatively large-magnitude BEs become more frequent before an extreme event. A premonitory change of scaling has been found in various models and observations. Here we demonstrate this change in the scaling of uniformly defined BEs in four real complex systems, their enormous differences notwithstanding.

  16. Applied extreme-value statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinnison, R.R.

    1983-05-01

    The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognize all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles and to provide the necessary statistical background so that the non-statistician scientist can recognize an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.
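
    A minimal worked example of the kind of applied extreme-value calculation the text has in mind: fitting a Gumbel distribution to block maxima by the method of moments and computing a return level. The true parameters, sample size and estimator choice below are invented for illustration; the text itself does not prescribe them.

```python
import math
import random

random.seed(7)

# Annual-maximum style data: draw block maxima directly from a Gumbel law
# via the inverse CDF, x = mu - beta * ln(-ln(u)).
mu, beta = 10.0, 2.0
data = [mu - beta * math.log(-math.log(random.random())) for _ in range(5000)]

# Method-of-moments fit: mean = mu + gamma*beta, variance = (pi*beta)^2 / 6.
n = len(data)
mean = sum(data) / n
var = sum((x - mean) ** 2 for x in data) / n
beta_hat = math.sqrt(6.0 * var) / math.pi
mu_hat = mean - 0.5772156649 * beta_hat      # gamma = Euler-Mascheroni constant

print(mu_hat, beta_hat)  # close to the true (10, 2)

# A typical design question extreme-value theory answers: the 100-year return
# level, i.e. the value exceeded with probability 1/100 in any one block.
x100 = mu_hat - beta_hat * math.log(-math.log(1.0 - 1.0 / 100.0))
print(x100)
```

    More careful practice would use maximum likelihood and check the fit, which is exactly the kind of subtlety the text warns the non-statistician about.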

  17. Dynamically Reconfigurable Approach to Multidisciplinary Problems

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalie M.; Lewis, Robert Michael

    2003-01-01

    The complexity and autonomy of the constituent disciplines and the diversity of the disciplinary data formats make the task of integrating simulations into a multidisciplinary design optimization problem extremely time-consuming and difficult. We propose a dynamically reconfigurable approach to MDO problem formulation wherein an appropriate implementation of the disciplinary information results in basic computational components that can be combined into different MDO problem formulations and solution algorithms, including hybrid strategies, with relative ease. The ability to re-use the computational components is due to the special structure of the MDO problem. We believe that this structure can and should be used to formulate and solve optimization problems in the multidisciplinary context. The present work identifies the basic computational components in several MDO problem formulations and examines the dynamically reconfigurable approach in the context of a popular class of optimization methods. We show that if the disciplinary sensitivity information is implemented in a modular fashion, the transfer of sensitivity information among the formulations under study is straightforward. This enables not only experimentation with a variety of problem formulations in a research environment, but also the flexible use of formulations in a production design environment.

  18. Extreme events and natural hazards: The complexity perspective

    NASA Astrophysics Data System (ADS)

    Schultz, Colin

    2012-10-01

    Advanced societies have become quite proficient at defending against moderate-size earthquakes, hurricanes, floods, and other natural assaults. What still poses a significant threat, however, is the unknown: the extremes, the natural phenomena encompassed by the upper tail of the probability distribution. Alongside the large or powerful events, truly extreme natural disasters are those that tie different systems together: an earthquake that causes a tsunami, which leads to flooding, which takes down a nuclear reactor. In the geophysical monograph Extreme Events and Natural Hazards: The Complexity Perspective, editors A. Surjalal Sharma, Armin Bunde, Vijay P. Dimri, and Daniel N. Baker present a lens through which such multidisciplinary phenomena can be understood. In this interview, Eos talks to Sharma about complexity science, predicting extreme events and natural hazards, and the push for "big data."

  19. Identifying and characterizing key nodes among communities based on electrical-circuit networks.

    PubMed

    Zhu, Fenghui; Wang, Wenxu; Di, Zengru; Fan, Ying

    2014-01-01

    Complex networks with community structures are ubiquitous in the real world. Despite the many approaches developed for detecting communities, we continue to lack tools for identifying overlapping and bridging nodes that play crucial roles in the interactions and communications among communities in complex networks. Here we develop an algorithm based on local flow conservation to effectively and efficiently identify and distinguish the two types of nodes. Our method is applicable in both undirected and directed networks without a priori knowledge of the community structure. Our method bypasses the extremely challenging problem of partitioning communities in the presence of overlapping nodes that may belong to multiple communities. Because overlapping and bridging nodes are of paramount importance in maintaining the function of many social and biological networks, our tools open new avenues towards understanding and controlling real complex networks with communities and their key nodes.

  20. A generalized complexity measure based on Rényi entropy

    NASA Astrophysics Data System (ADS)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or pattern, far beyond what entropy measures capture. They are intuitively constructed to be minimal at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not in general minimal for maximal randomness. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
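
    One common LMC-Rényi-type construction takes the exponentiated difference of two Rényi entropies; the sketch below implements that form for a discrete distribution. Whether this matches the paper's proposed generalization exactly is an assumption made for illustration, but it does show the property under discussion: the measure equals 1 (its minimum) for the uniform, i.e. maximally random, distribution and exceeds 1 for a structured one.

```python
import math

def renyi(p, alpha):
    """Rényi entropy R_alpha = ln(sum_i p_i^alpha) / (1 - alpha)."""
    if abs(alpha - 1.0) < 1e-12:  # Shannon limit as alpha -> 1
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1.0 - alpha)

def lmc_renyi(p, alpha, beta):
    """LMC-Rényi-type complexity C = exp(R_alpha - R_beta), with alpha < beta.
    Since R_alpha is non-increasing in alpha, C >= 1, with equality for the
    uniform distribution (where R_alpha = ln N for every alpha)."""
    return math.exp(renyi(p, alpha) - renyi(p, beta))

uniform = [0.25] * 4
peaked = [0.97, 0.01, 0.01, 0.01]
print(lmc_renyi(uniform, 0.5, 2.0))  # 1.0: minimal at maximal randomness
print(lmc_renyi(peaked, 0.5, 2.0))   # > 1: structure raises the complexity
```
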

  1. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long-established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes interconnected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably identifying yield-line patterns automatically. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  2. WE-D-303-00: Computational Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John; Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient's anatomy and physiology. Imaging data can be generated from it as if it were a live patient, using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: (1) understand the need and requirements of computational phantoms in medical physics research; (2) discuss the developments and applications of computational phantoms; (3) know the promises and limitations of computational phantoms in solving complex problems.

  3. A frequency dependent preconditioned wavelet method for atmospheric tomography

    NASA Astrophysics Data System (ADS)

    Yudytskiy, Mykhaylo; Helin, Tapio; Ramlau, Ronny

    2013-12-01

    Atmospheric tomography, i.e. the reconstruction of the turbulence in the atmosphere, is a central task for the adaptive optics systems of the next generation of telescopes. For extremely large telescopes, such as the European Extremely Large Telescope, this problem becomes overly complex and an efficient algorithm is needed to reduce numerical costs. Recently, a conjugate gradient method based on a wavelet parametrization of turbulence layers was introduced [5]. An iterative algorithm can only be numerically efficient when the number of iterations required for a sufficient reconstruction is low. One way to achieve this is to design an efficient preconditioner. In this paper we propose a new frequency-dependent preconditioner for the wavelet method. In the context of a multi-conjugate adaptive optics (MCAO) system simulated on OCTOPUS, the official end-to-end simulation tool of the European Southern Observatory, we demonstrate the robustness and speed of the preconditioned algorithm. We show that three iterations are sufficient for a good reconstruction.
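
    The iteration-count argument above can be illustrated with a generic preconditioned conjugate gradient sketch. This is not the paper's frequency-dependent wavelet preconditioner; a simple Jacobi (diagonal) preconditioner and a synthetic ill-conditioned system are assumed here purely to show why a good preconditioner cuts the number of iterations.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-6, max_iter=1000):
    """Conjugate gradient for SPD A, with preconditioner action M_inv ~ A^-1."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for it in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, it
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Synthetic ill-conditioned SPD system (condition number ~ 1e4)
rng = np.random.default_rng(0)
n = 200
B = rng.standard_normal((n, n))
A = np.diag(np.logspace(0, 4, n)) + 0.01 * (B + B.T)
b = rng.standard_normal(n)

x_plain, it_plain = pcg(A, b, lambda r: r)             # no preconditioning
x_prec, it_prec = pcg(A, b, lambda r: r / np.diag(A))  # Jacobi preconditioner
# The preconditioned run needs far fewer iterations than the plain run.
```

    The same principle drives the paper's result: the closer the preconditioner approximates the inverse of the system operator in the relevant frequency bands, the fewer iterations are needed per reconstruction.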

  4. Response of the Vegetation-Climate System to High Temperature (Invited)

    NASA Astrophysics Data System (ADS)

    Berry, J. A.

    2009-12-01

    High temperature extremes may lead to inhibition of photosynthesis and stomatal closure at the leaf scale. When these responses occur over regional scales, they can initiate a positive feedback loop in the coupled vegetation-climate system. The fraction of net radiation that is used by the land surface to evaporate water decreases, leading to deeper, drier boundary layers, fewer clouds, increased solar radiation reaching the surface, and possibly reduced precipitation. These interactions within the vegetation-climate system may amplify natural (or greenhouse gas forced) variations in temperature and further stress the vegetation. Proper modeling of this system depends, among other things, on getting the plant responses to high temperature correct. I will review the current state of this problem and present some studies of the responses of rain forest trees to high temperature and drought conducted in the Biosphere 2 enclosure that illustrate how experiments in controlled systems can contribute to our understanding of the responses of complex systems to extreme events.

  5. [Shoulder injuries in golf].

    PubMed

    Liem, D; Gosheger, G; Schmidt, C

    2014-03-01

    Due to its growing popularity, golf has now come into the focus of orthopedic sports medicine. With a wide range of age groups and playing levels, orthopedic surgeons will encounter a wide range of musculoskeletal problems, which are usually the result of overuse rather than trauma. The shoulder joint plays an important role in the golf swing, whereby not only the muscles around the glenohumeral joint but also the scapula-stabilizing muscles are extremely important for an effective golf swing. Golf is not, strictly speaking, considered an overhead sport; however, the extreme peak positions of the golf swing place the shoulder joint in maximum abduction and adduction positions, which can provoke impingement, lesions of the pulley system or even a special form of posterior shoulder instability. Even after complex shoulder operations, such as rotator cuff repair or shoulder arthroplasty, a return to the golf course at nearly the same level of play can be expected.

  6. Integration of UAV photogrammetry and SPH modelling of fluids to study runoff on real terrains.

    PubMed

    Barreiro, Anxo; Domínguez, Jose M; C Crespo, Alejandro J; González-Jorge, Higinio; Roca, David; Gómez-Gesteira, Moncho

    2014-01-01

    Roads can experience runoff problems due to the intense rain discharge associated with severe storms. Two advanced tools are combined to analyse the interaction of complex water flows with real terrains. UAV (Unmanned Aerial Vehicle) photogrammetry is employed to obtain accurate topographic information on small areas, typically on the order of a few hectares. The Smoothed Particle Hydrodynamics (SPH) technique is applied by means of the DualSPHysics model to compute the trajectory of the water flow during extreme rain events. The use of engineering solutions to palliate flood events is also analysed. The case study simulates how the collected water can flow onto a nearby road and how precautionary measures can be effective in draining water under extreme conditions. The amount of water arriving at the road is calculated under different protection scenarios, and the efficiency of a ditch is observed to decrease when sedimentation reduces its depth.

  7. Integration of UAV Photogrammetry and SPH Modelling of Fluids to Study Runoff on Real Terrains

    PubMed Central

    Barreiro, Anxo; Domínguez, Jose M.; C. Crespo, Alejandro J.; González-Jorge, Higinio; Roca, David; Gómez-Gesteira, Moncho

    2014-01-01

    Roads can experience runoff problems due to the intense rain discharge associated with severe storms. Two advanced tools are combined to analyse the interaction of complex water flows with real terrains. UAV (Unmanned Aerial Vehicle) photogrammetry is employed to obtain accurate topographic information on small areas, typically on the order of a few hectares. The Smoothed Particle Hydrodynamics (SPH) technique is applied by means of the DualSPHysics model to compute the trajectory of the water flow during extreme rain events. The use of engineering solutions to palliate flood events is also analysed. The case study simulates how the collected water can flow onto a nearby road and how precautionary measures can be effective in draining water under extreme conditions. The amount of water arriving at the road is calculated under different protection scenarios, and the efficiency of a ditch is observed to decrease when sedimentation reduces its depth. PMID:25372035

  8. Doppler ultrasonography of the anterior knee tendons in elite badminton players: colour fraction before and after match.

    PubMed

    Koenig, M J; Torp-Pedersen, S; Boesen, M I; Holm, C C; Bliddal, H

    2010-02-01

    Anterior knee tendon problems are seldom reported in badminton players, although the game is obviously stressful to the lower extremities. Painful anterior knee tendons are common among elite badminton players. The anterior knee tendons exhibit colour Doppler activity; this activity increases after a match, and painful tendons have more Doppler activity than tendons without pain. Cohort study. 72 elite badminton players were interviewed about training, pain and injuries. The participants were scanned with high-end ultrasound equipment. Colour Doppler was used to examine the tendons of 64 players before a match and 46 players after a match. Intratendinous colour Doppler flow was measured as the colour fraction (CF). The tendon complex was divided into three loci: the quadriceps tendon, the proximal patellar tendon and the insertion on the tibial tuberosity. Interview: of the 72 players, 62 had problems with 86 tendons in the lower extremity; 48 of these 86 were anterior knee tendons. Ultrasound: at baseline, the majority of players (87%) had colour Doppler flow in at least one scanning position. After a match, the percentage of knee complexes involved did not change. CF increased significantly in the dominant leg at the tibial tuberosity; singles players had a significantly higher CF at the tibial tuberosity after a match and in the patellar tendon both before and after a match. Painful tendons had the highest colour Doppler activity. Most elite badminton players had pain in the anterior knee tendons and intratendinous Doppler activity both before and after a match. High levels of Doppler activity were associated with self-reported ongoing pain.

  9. Shoulder pain in hemiplegia.

    PubMed

    Andersen, L T

    1985-01-01

    Development of a painful shoulder in the hemiplegic patient is a significant and serious problem, because it can limit the patient's ability to reach his or her maximum functional potential. Several etiologies of shoulder pain have been identified, such as immobilization of the upper extremity, trauma to the joint structures, including brachial plexus injuries, and subluxation of the glenohumeral joint. A review of the literature explains the basic anatomy and kinesiology of the shoulder complex, the various etiologies of hemiplegic shoulder pain, and the pros and cons of specific treatment techniques. This knowledge is essential for the occupational therapist to effectively evaluate the techniques used to treat the patient with hemiplegic shoulder pain. More effective management of this problem will facilitate the patient's ability to reach his or her maximum functional potential.

  10. Malaria and Colonialism in Korea, c.1876–c.1945

    PubMed Central

    Kim, Jeong-Ran

    2016-01-01

    Abstract This article considers the problem of malaria in the Korean peninsula from 1876 to 1945, focusing particularly on the impact of Japanese colonial rule. One aspect which receives special attention is malaria in urban contexts. The relationship between malaria and urbanisation is shown to be extremely complex, fluctuating regardless of specific interventions against the disease. In rural and urban areas, Japanese antimalarial measures concentrated on military garrisons, at the expense of both civilian settlers and Koreans. However, it was Koreans who bore the brunt of the malaria problem, which was exacerbated in many areas by agricultural and industrial development and, ultimately, by the war regime introduced from 1938. The worsening of the malaria burden in the final years of Japanese rule left a legacy which lasted long after independence. PMID:29731545

  11. Design loads and uncertainties for the transverse strength of ships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pittaluga, A.

    1995-12-31

    Rational design of ship structures is becoming a reality, and a reliability based approach for the longitudinal strength assessment of ship hulls is close to implementation. Transverse strength of ships is a step behind, mainly due to the complexity of the collapse modes associated with transverse strength. Nevertheless, some investigations are being made and the importance of an acceptable stochastic model for the environmental demand on the transverse structures is widely recognized. In the paper, the problem of the determination of the sea loads on a transverse section of a ship is discussed. The problem of extrapolating the calculated results, which are relevant to the submerged portion of the hull, to areas which are only occasionally wet in extreme conditions is also addressed.

  12. Stress state of rocks with a system of workings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikiforovskii, V.S.; Seryakov, V.M.

    1979-09-01

    An investigation of the state of rocks in undisturbed form, and during disturbance by drivage of development workings and the working of seams or ore beds, is both important and also extremely complex in practice. The complete physical and mathematical formulation of the problem must take into account the complex geological structure (allowing for tectonics) of the region, the mutual influence of the systems of workings, the change in the mechanical characteristics in the vicinity of the workings, etc. All these factors make it necessary to solve spatial problems with inclusions and workings of arbitrary form. The literature gives data on the stress in the rock around a working both remote from the free surface and near it. However, the possibilities of an analytical investigation of the problem are limited to the simplest cases under conditions of plane deformation. Considerable success in the solution of problems of geomechanics has been attained using numerical methods, particularly the finite-element method, which enables us, without altering the algorithm, to change fairly rapidly and simply the outer and inner boundaries of the region and the properties of the medium, or to assign various boundary conditions. In this article we calculate the stress in the rocks around mining-out and development workings during mining of the Talnakh and Oktyabr' deposits by the longwall slicing system with stowing of the worked-out area.

  13. Mechanochemical Preparation of Stable Sub-100 nm γ-Cyclodextrin:Buckminsterfullerene (C60) Nanoparticles by Electrostatic or Steric Stabilization.

    PubMed

    Van Guyse, Joachim F R; de la Rosa, Victor R; Hoogenboom, Richard

    2018-02-21

    The main hurdle preventing buckminsterfullerene (C60) from entering the field of biomedicine is its low bioavailability, which results from its extremely low water solubility. A well-known approach to increasing the water solubility of C60 is complexation with γ-cyclodextrins. However, the complexes formed are not stable over time, as they rapidly aggregate and eventually precipitate due to attractive intermolecular forces, a common problem in inclusion complexes of cyclodextrins. In this study we attempt to overcome the attractive intermolecular forces between the complexes by designing custom γ-cyclodextrin (γCD)-based supramolecular hosts for C60 that inhibit the aggregation found in native γCD-C60 complexes. The approach entails introducing either repulsive electrostatic forces or increased steric hindrance to prevent aggregation, thus enhancing the biomedical application potential of C60. These modifications have led to new sub-100 nm nanostructures that show long-term stability in solution. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. [Problems of work world and its impact on health. Current financial crisis].

    PubMed

    Tomasina, Fernando

    2012-06-01

    Health and work are complex, multifaceted processes that take many forms; the two are closely linked and mutually influential. Accordingly, the world of work is extremely complex and heterogeneous. In this world, "old" or traditional risks coexist with "modern" risks derived from new models of work organization and the incorporation of new technologies. Unemployment, the precariousness of work relationships and the outsourcing of work risks are results of neoliberal strategies. Some negative outcomes of the health-sickness process derived from transformations in the world of work and the current global economic crisis are already evident in present working conditions. Finally, the need to reconstruct policies addressing this situation arising from the world of work is suggested.

  15. A linguistic geometry for 3D strategic planning

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1995-01-01

    This paper is a new step in the development and application of Linguistic Geometry. This formal theory is intended to discover the inner properties of human expert heuristics that have been successful in a certain class of complex control systems, and to apply them to different systems. In this paper we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing Linguistic Geometry tools, the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are illustrated with a new pilot example: the solution of the extremely complex 3D optimization problem of strategic planning for the space combat of autonomous vehicles. This example demonstrates deep and highly selective search in comparison with conventional search algorithms.

  16. Human behaviours in evacuation crowd dynamics: From modelling to "big data" toward crisis management

    NASA Astrophysics Data System (ADS)

    Bellomo, N.; Clarke, D.; Gibelli, L.; Townsend, P.; Vreugdenhil, B. J.

    2016-09-01

    This paper proposes an essay concerning the understanding of human behaviours and crisis management of crowds in extreme situations, such as evacuation through complex venues. The first part focuses on the main features of the crowd viewed as a living, hence complex, system. The second part offers a critical analysis of mathematical models that attempt to capture these features, as far as that is possible. The third part then focuses on the use, for safety problems, of a model derived by the methods of mathematical kinetic theory and theoretical tools of evolutionary game theory. It is shown how this model can depict critical situations and how these can be managed with the aim of minimizing the risk of catastrophic events.

  17. Optimizing Illumina next-generation sequencing library preparation for extremely AT-biased genomes.

    PubMed

    Oyola, Samuel O; Otto, Thomas D; Gu, Yong; Maslen, Gareth; Manske, Magnus; Campino, Susana; Turner, Daniel J; Macinnis, Bronwyn; Kwiatkowski, Dominic P; Swerdlow, Harold P; Quail, Michael A

    2012-01-03

    Massively parallel sequencing technology is revolutionizing approaches to genomic and genetic research. Since its advent, the scale and efficiency of Next-Generation Sequencing (NGS) have rapidly improved. In spite of this success, sequencing genomes or genomic regions with extremely biased base composition is still a great challenge for the currently available NGS platforms. The genomes of some important pathogenic organisms, like Plasmodium falciparum (high AT content) and Mycobacterium tuberculosis (high GC content), display extremes of base composition. The standard library preparation procedures that employ PCR amplification have been shown to cause uneven read coverage, particularly across AT- and GC-rich regions, leading to problems in genome assembly and variation analyses. Alternative library-preparation approaches that omit PCR amplification require large quantities of starting material and hence are not suitable for small amounts of DNA/RNA, such as those from clinical isolates. We have developed and optimized library-preparation procedures suitable for low-quantity starting material and tolerant of extremely high AT content sequences. We used our optimized conditions in parallel with standard methods to prepare Illumina sequencing libraries from a non-clinical and a clinical isolate (containing ~53% host contamination). By analyzing and comparing the quality of the sequence data generated, we show that our optimized conditions, which involve a PCR additive (TMAC), produce amplified libraries with improved coverage of extremely AT-rich regions and reduced bias toward GC-neutral templates. We have developed a robust and optimized Next-Generation Sequencing library amplification method suitable for extremely AT-rich genomes. The new amplification conditions significantly reduce bias and retain the complexity of sequences at either extreme of base composition. This development will greatly benefit the sequencing of clinical samples, which often require amplification due to the low mass of DNA starting material.

  18. NeuroTessMesh: A Tool for the Generation and Visualization of Neuron Meshes and Adaptive On-the-Fly Refinement.

    PubMed

    Garcia-Cantero, Juan J; Brito, Juan P; Mata, Susana; Bayona, Sofia; Pastor, Luis

    2017-01-01

    Gaining a better understanding of the human brain continues to be one of the greatest challenges for science, largely because of the overwhelming complexity of the brain and the difficulty of analyzing the features and behavior of dense neural networks. Regarding analysis, 3D visualization has proven to be a useful tool for the evaluation of complex systems. However, the large number of neurons in non-trivial circuits, together with their intricate geometry, makes the visualization of a neuronal scenario an extremely challenging computational problem. Previous work in this area dealt with the generation of 3D polygonal meshes that approximated the cells' overall anatomy but did not attempt to deal with the extremely high storage and computational cost required to manage a complex scene. This paper presents NeuroTessMesh, a tool specifically designed to cope with many of the problems associated with the visualization of neural circuits that are comprised of large numbers of cells. In addition, this method facilitates the recovery and visualization of the 3D geometry of cells included in databases, such as NeuroMorpho, and provides the tools needed to approximate missing information such as the soma's morphology. This method takes as its only input the available compact, yet incomplete, morphological tracings of the cells as acquired by neuroscientists. It uses a multiresolution approach that combines an initial, coarse mesh generation with subsequent on-the-fly adaptive mesh refinement stages using tessellation shaders. For the coarse mesh generation, a novel approach, based on the Finite Element Method, allows approximation of the 3D shape of the soma from its incomplete description. Subsequently, the adaptive refinement process performed on the graphics card generates meshes that provide good visual quality geometries at a reasonable computational cost, both in terms of memory and rendering time. All the described techniques have been integrated into NeuroTessMesh, available to the scientific community, to generate, visualize, and save the adaptive resolution meshes.

  19. Statistical Modeling of Extreme Values and Evidence of Presence of Dragon King (DK) in Solar Wind

    NASA Astrophysics Data System (ADS)

    Gomes, T.; Ramos, F.; Rempel, E. L.; Silva, S.; C-L Chian, A.

    2017-12-01

    The solar wind constitutes a nonlinear dynamical system, presenting intermittent turbulence, multifractality and chaotic dynamics. One characteristic shared by many such complex systems is the presence of extreme events, which play an important role in several geophysical phenomena; their statistical characterization is a problem of great practical relevance. This work investigates the presence of extreme events in time series of the modulus of the interplanetary magnetic field measured by the Cluster spacecraft on February 2, 2002. One of the main results is that the solar wind near the Earth's bow shock can be modeled by the Generalized Pareto (GP) and Generalized Extreme Value (GEV) distributions. Both models present a statistically significant positive shape parameter, which implies a heavy tail in the probability distribution functions and unbounded growth in return values as return periods become long. There is evidence that current sheets are mainly responsible for positive values of the shape parameter. It is also shown that magnetic reconnection at the interface between two interplanetary magnetic flux ropes in the solar wind can be considered a Dragon King (DK), a class of extreme events whose formation mechanisms are fundamentally different from others. If magnetic reconnection can be classified as a Dragon King, there is the possibility of identifying and even predicting it. Dragon Kings had previously been identified in time series of financial crashes, nuclear power generation accidents, stock markets, and so on. It is believed that they are associated with the occurrence of extreme events in dynamical systems at phase transitions, bifurcations, crises or tipping points.
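
    The block-maxima analysis described above can be sketched in a few lines. This is a hedged illustration, not the authors' analysis: the heavy-tailed synthetic series below stands in for the Cluster |B| measurements, and the point is only to show how a fitted GEV shape parameter diagnoses a heavy tail and yields unbounded return levels.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic heavy-tailed series standing in for |B| (assumption for
# illustration; not Cluster data)
series = 5.0 + rng.pareto(3.0, size=100_000)

# Block maxima: split the series into blocks and keep each block's maximum
block = 500
maxima = series[: len(series) // block * block].reshape(-1, block).max(axis=1)

# Fit the GEV distribution. scipy's convention is c = -xi, so a positive
# shape parameter xi (heavy, Frechet-type tail) appears as a fitted c < 0.
c, loc, scale = genextreme.fit(maxima)
xi = -c

# Return level: the value exceeded on average once every T blocks.
# With xi > 0 it grows without bound as T increases.
T = 100
return_level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
```

    A statistically significant xi > 0, as reported in the abstract, is what separates an unbounded (Frechet-type) tail from the bounded (Weibull-type) and exponential (Gumbel-type) cases.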

  20. Plasma physics of extreme astrophysical environments.

    PubMed

    Uzdensky, Dmitri A; Rightley, Shane

    2014-03-01

    Among the incredibly diverse variety of astrophysical objects, there are some that are characterized by very extreme physical conditions not encountered anywhere else in the Universe. Of special interest are ultra-magnetized systems that possess magnetic fields exceeding the critical quantum field of about 44 TG. There are basically only two classes of such objects: magnetars, whose magnetic activity is manifested, e.g., via their very short but intense gamma-ray flares, and central engines of supernovae (SNe) and gamma-ray bursts (GRBs)--the most powerful explosions in the modern Universe. Figuring out how these complex systems work necessarily requires understanding various plasma processes, both small-scale kinetic and large-scale magnetohydrodynamic (MHD), that govern their behavior. However, the presence of an ultra-strong magnetic field modifies the underlying basic physics to such a great extent that relying on conventional, classical plasma physics is often not justified. Instead, plasma-physical problems relevant to these extreme astrophysical environments call for constructing relativistic quantum plasma (RQP) physics based on quantum electrodynamics (QED). In this review, after briefly describing the astrophysical systems of interest and identifying some of the key plasma-physical problems important to them, we survey the recent progress in the development of such a theory. We first discuss the ways in which the presence of a super-critical field modifies the properties of vacuum and matter and then outline the basic theoretical framework for describing both non-relativistic and RQPs. We then turn to some specific astrophysical applications of relativistic QED plasma physics relevant to magnetar magnetospheres and to central engines of core-collapse SNe and long GRBs. Specifically, we discuss the propagation of light through a magnetar magnetosphere; large-scale MHD processes driving magnetar activity and responsible for jet launching and propagation in GRBs; energy-transport processes governing the thermodynamics of extreme plasma environments; micro-scale kinetic plasma processes important in the interaction of intense electric currents flowing through a magnetar magnetosphere with the neutron star surface; and magnetic reconnection of ultra-strong magnetic fields. Finally, we point out that future progress in applying RQP physics to real astrophysical problems will require the development of suitable numerical modeling capabilities.

  1. Preterm birth and developmental problems in the preschool age. Part I: minor motor problems.

    PubMed

    Ferrari, Fabrizio; Gallo, Claudio; Pugliese, Marisa; Guidotti, Isotta; Gavioli, Sara; Coccolini, Elena; Zagni, Paola; Della Casa, Elisa; Rossi, Cecilia; Lugli, Licia; Todeschini, Alessandra; Ori, Luca; Bertoncelli, Natascia

    2012-11-01

    Nearly half of very preterm (VP) and extremely preterm (EP) infants suffer from minor disabilities. The paper overviews the literature dealing with motor problems other than cerebral palsy (CP) during infancy and preschool age. The term "minor motor problems" indicates a wide spectrum of motor disorders other than CP; "minor" does not mean "minimal", as a relevant proportion of these preterm infants will develop academic and behavioural problems at school age. Early-onset disorders consist of abnormal general movements (GMs), transient dystonia and postural instability; these conditions usually fade during the first months. They were underestimated in the past; recently, qualitative assessment of GMs using Prechtl's method has become a major item of the neurological examination. Late-onset disorders include developmental coordination disorder (DCD) and/or minor neurological dysfunction (MND): both terms cover partly overlapping problems. Simple MND (MND-1) and complex MND (MND-2) can be identified, and MND-2 carries a higher risk of learning and behavioural disorders. A relationship between the quality of GMs and MND in childhood has recently been described. The Touwen infant neurological examination (TINE) can reliably detect neurological signs of MND even in infancy. However, the prognostic value of these disorders requires further investigation.

  2. Probabilistic forecasting of extreme weather events based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Van De Vyver, Hans; Van Schaeybroeck, Bert

    2016-04-01

    Extreme events in weather and climate, such as high wind gusts, heavy precipitation or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X,Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed). More specifically, two problems are addressed: (1) Given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X,Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). References: Coles, S. (2001) An Introduction to Statistical Modelling of Extreme Values. Springer-Verlag. Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter 117, 8-13. Ramos, A., Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B 71, 219-241.
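
    The conditional-probability question at the heart of the abstract can be illustrated with a simple empirical estimator. This is not the Ramos-Ledford parametric tail model; the correlated synthetic forecast/observation pairs below are an assumption for illustration, and the estimator simply counts joint exceedances above high thresholds.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
# Synthetic forecast/observation pairs (assumption for illustration):
# the forecast X is informative about the observation Y, e.g. wind speed.
X = rng.gumbel(loc=10.0, scale=3.0, size=n)
Y = 0.8 * X + rng.gumbel(loc=2.0, scale=2.0, size=n)

def cond_exceedance(X, Y, x0, y):
    """Empirical estimate of Pr(Y > y | X > x0)."""
    sel = X > x0
    return float(np.mean(Y[sel] > y)) if sel.any() else float("nan")

y = np.quantile(Y, 0.99)            # high observation threshold
x0 = np.quantile(X, 0.99)           # high forecast threshold
p_uncond = float(np.mean(Y > y))    # climatological base rate (~0.01)
p_cond = cond_exceedance(X, Y, x0, y)
# A skilful forecast lifts the conditional probability far above the base rate.
```

    Parametric bivariate-tail models like that of Ramos and Ledford exist precisely because this empirical estimator breaks down when the thresholds are pushed so high that few or no joint exceedances remain in the sample.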

  3. The role of order in distributed programs

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Marzullo, Keith

    1989-01-01

    The role of order in building distributed systems is discussed. The thesis is that a principle of event ordering underlies the wide range of operating system mechanisms that have been put forward for building robust distributed software. Stated concisely, this principle achieves correct distributed behavior by ordering classes of distributed events that conflict with one another. By focusing on order, one can obtain simplified descriptions and convincingly correct solutions to problems that might otherwise have looked extremely complex. Moreover, it is observed that there are a limited number of ways to obtain order, and that the choice made impacts greatly on performance.
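
    One classic way to impose such an ordering on distributed events is a Lamport logical clock. The sketch below is a minimal illustration of the ordering principle the abstract discusses, not a reconstruction of the specific mechanisms surveyed in the paper.

```python
# Minimal Lamport logical clock: every event gets a timestamp, and a
# receive jumps past the sender's timestamp, so any message send is
# ordered before its receive (the happens-before relation).
class Process:
    def __init__(self, name):
        self.name = name
        self.clock = 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock          # the timestamp travels with the message

    def receive(self, msg_time):
        # Merge rule: advance past both the local clock and the message stamp
        self.clock = max(self.clock, msg_time) + 1
        return self.clock

p, q = Process("p"), Process("q")
p.local_event()            # p's clock: 1
t = p.send()               # p's clock: 2; message stamped 2
q.local_event()            # q's clock: 1
tq = q.receive(t)          # q's clock: max(1, 2) + 1 = 3
# send (timestamp 2) is ordered before receive (timestamp 3)
```

    Timestamps of this kind give a total order consistent with causality, which is exactly the kind of conflict ordering the abstract argues underlies robust distributed mechanisms.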

  4. Solar activity influences on atmospheric electricity and on some structures in the middle atmosphere

    NASA Technical Reports Server (NTRS)

    Reiter, Reinhold

    1989-01-01

    Only processes in the troposphere and the lower stratosphere are reviewed. General aspects of global atmospheric electricity are summarized in Chapter 3 of NRC (1986); Volland (1984) has outlined the overall problems of atmospheric electrodynamics; and Roble and Hays (1982) published a summary of solar effects on the global circuit. Solar variability and its atmospheric effects (overview by Donelly et al., 1987) and solar-planetary relationships (survey by James et al., 1983) are so extremely complex that only particular results and selected papers of direct relevance or historical importance are compiled herein.

  5. A Review of Computational Intelligence Methods for Eukaryotic Promoter Prediction.

    PubMed

    Singh, Shailendra; Kaur, Sukhbir; Goel, Neelam

    2015-01-01

    In past decades, the prediction of genes in DNA sequences has attracted the attention of many researchers, but due to the complex structure of DNA it is extremely difficult to locate gene positions correctly. A large number of regulatory regions present in DNA help in the transcription of a gene. The promoter is one such region, and finding its location is a challenging problem. Various computational methods for promoter prediction have been developed over the past few years. This paper reviews these promoter prediction methods. Several difficulties and pitfalls encountered by these methods are also detailed, along with future research directions.

  6. A Cellular Automata Model of Infection Control on Medical Implants

    PubMed Central

    Prieto-Langarica, Alicia; Kojouharov, Hristo; Chen-Charpentier, Benito; Tang, Liping

    2011-01-01

    S. epidermidis infections on medically implanted devices are a common problem in modern medicine due to the abundance of the bacteria. Once inside the body, S. epidermidis gather in communities called biofilms and can become extremely hard to eradicate, causing the patient serious complications. We simulate the complex S. epidermidis-Neutrophils interactions in order to determine the optimum conditions for the immune system to be able to contain the infection and avoid implant rejection. Our cellular automata model can also be used as a tool for determining the optimal amount of antibiotics for combating biofilm formation on medical implants. PMID:23543851
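A cellular automata model of this kind can be sketched in a few lines. The grid states, update rules, and probabilities below (`p_divide`, `p_kill`) are hypothetical illustrations, not the calibrated parameters of the paper's S. epidermidis-neutrophil model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Cell states on the lattice.
EMPTY, BACT, NEUT = 0, 1, 2

def step(grid, p_divide=0.3, p_kill=0.8):
    """One synchronous update: a bacterium adjacent to a neutrophil is
    killed with probability p_kill; otherwise it may divide into an
    empty von Neumann (4-)neighbor with probability p_divide."""
    new = grid.copy()
    n, m = grid.shape
    for i in range(n):
        for j in range(m):
            if grid[i, j] != BACT:
                continue
            nbrs = [(i2, j2)
                    for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= i2 < n and 0 <= j2 < m]
            if any(grid[i2, j2] == NEUT for i2, j2 in nbrs) and rng.random() < p_kill:
                new[i, j] = EMPTY          # bacterium cleared by immune cell
            elif rng.random() < p_divide:
                empties = [(i2, j2) for i2, j2 in nbrs if grid[i2, j2] == EMPTY]
                if empties:
                    new[empties[rng.integers(len(empties))]] = BACT  # division
    return new

grid = np.zeros((30, 30), dtype=int)
grid[rng.integers(30, size=40), rng.integers(30, size=40)] = NEUT  # immune cells
grid[15, 15] = BACT                                                # colonizer

for _ in range(50):
    grid = step(grid)
print("bacteria remaining:", int(np.sum(grid == BACT)))
```

Sweeping `p_kill` (or the neutrophil density) in such a sketch is the kind of experiment that identifies conditions under which the immune response contains the infection.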

  7. Robust intelligent flight control for hypersonic vehicles. Ph.D. Thesis - Massachusetts Inst. of Technology

    NASA Technical Reports Server (NTRS)

    Chamitoff, Gregory Errol

    1992-01-01

    Intelligent optimization methods are applied to the problem of real-time flight control for a class of airbreathing hypersonic vehicles (AHSV). The extreme flight conditions that will be encountered by single-stage-to-orbit vehicles, such as the National Aerospace Plane, present a tremendous challenge to the entire spectrum of aerospace technologies. Flight control for these vehicles is particularly difficult due to the combination of nonlinear dynamics, complex constraints, and parametric uncertainty. An approach that utilizes all available a priori and in-flight information to perform robust, real time, short-term trajectory planning is presented.

  8. The laboratory diagnosis of bacterial vaginosis

    PubMed Central

    Money, Deborah

    2005-01-01

    Bacterial vaginosis (BV) is an extremely common health problem for women. In addition to the troublesome symptoms often associated with a disruption in the balance of vaginal flora, BV is associated with adverse gynecological and pregnancy outcomes. Although not technically a sexually transmitted infection, BV is a sexually associated condition. Diagnostic tests include real-time clinical/microbiological diagnosis, and the current gold standard, the standardized evaluation of morphotypes on Gram stain analysis. The inappropriate use of vaginal culture can be misleading. Future developments into molecular-based diagnostics will be important to further understand this complex endogenous flora disruption. PMID:18159532

  9. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

    A phenomenon is considered "complex" when it refers to a system whose phenomenological laws, which describe the global behavior of the system, are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those used for economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation of universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share.
It is demonstrated that the observables of all three dynamical systems can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" preceding the extreme event in each of these systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, indicating that the strength of the long-memory / long-range interactions between the constituents of the system, which characterize its dynamics, increases.

  10. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hukerikar, Saurabh; Engelmann, Christian

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest the prevalence of very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Therefore the resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. Also, due to practical limits on power consumption in HPC systems, future systems are likely to embrace innovative architectures, increasing the levels of hardware and software complexities. As a result the techniques that seek to improve resilience must navigate the complex trade-off space between resilience and the overheads to power consumption and performance. While the HPC community has developed various resilience solutions, application-level techniques as well as system-based solutions, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance & power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software environments that will be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience using the concept of resilience-based design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. 
Each established solution is described in the form of a pattern that addresses concrete problems in the design of resilient systems. The complete catalog of resilience design patterns provides designers with reusable design elements. We also define a framework that enhances a designer's understanding of the important constraints and opportunities for the design patterns to be implemented and deployed at various layers of the system stack. This design framework may be used to establish mechanisms and interfaces to coordinate flexible fault management across hardware and software components. The framework also supports optimization of the cost-benefit trade-offs among performance, resilience, and power consumption. The overall goal of this work is to enable a systematic methodology for the design and evaluation of resilience technologies in extreme-scale HPC systems that keep scientific applications running to a correct solution in a timely and cost-efficient manner in spite of frequent faults, errors, and failures of various types.

  11. WE-D-303-01: Development and Application of Digital Human Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segars, P.

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it were a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: (1) Understand the need and requirements of computational phantoms in medical physics research; (2) Discuss the developments and applications of computational phantoms; (3) Know the promises and limitations of computational phantoms in solving complex problems.

  12. The Erdős-Hajnal problem of hypergraph colouring, its generalizations, and related problems

    NASA Astrophysics Data System (ADS)

    Raigorodskii, Andrei M.; Shabanov, Dmitrii A.

    2011-10-01

    Extremal problems concerned with hypergraph colouring first arose in connection with classical investigations in the 1920-30s which gave rise to Ramsey theory. Since then, this area has assumed a central position in extremal combinatorics. This survey is devoted to one well-known problem of hypergraph colouring, the Erdős-Hajnal problem, initially posed in 1961. It opened a line of research in hypergraph theory whose methods and results are widely used in various domains of discrete mathematics. Bibliography: 109 titles.

  13. The use of interlocking prostheses for both temporary and definitive management of infected periprosthetic femoral fractures.

    PubMed

    Konan, Sujith; Rayan, Faizal; Manketelow, Andrew R J; Haddad, Fares S

    2011-12-01

    Infected periprosthetic fractures around total hip arthroplasties are an extremely challenging problem. We describe our experience of managing infected periprosthetic femoral fractures using interlocking long-stem femoral prostheses either as temporary functional spacers or as definitive implants. The Cannulock (Orthodesign, Christchurch, United Kingdom) uncoated stem was used in 12 cases, and the Kent hip prosthesis (Biomet Merck, Bridgend, United Kingdom), in 5 cases. Satisfactory outcome was noted in all cases, and in 11 cases, revision to a definitive stem has been undertaken after successful control of infection and fracture union. The use of interlocking stems offers a relatively appealing solution for a complex problem and avoids the complications that would be associated with resection of the entire femur or the use of large quantities of bone cement. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Management Knowledge and Skills Required in the Health Care System of the Federation Bosnia and Herzegovina

    PubMed Central

    Slipicevic, Osman; Masic, Izet

    2012-01-01

    Health care organizations are extremely complex in their structure and organization and operate in a constantly changing business environment; such a situation implies and requires complex and demanding health management. Therefore, in order to manage health organizations in a competent manner, health managers must possess various managerial skills and be familiar with problems in health care. Research, identification, analysis, and assessment of health management education and training needs are basic preconditions for the development and implementation of adequate programs to meet those needs. Along with other specific activities, this research helped to determine the nature, profile, and level of top-priority needs for education. The need for knowledge of certain areas in health management, as well as the need for mastering concrete managerial competencies, has been recognized as a top priority requiring additional improvement and upgrading. PMID:23922519

  15. Adjoint-Based Algorithms for Adaptation and Design Optimizations on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2006-01-01

    Schemes based on discrete adjoint algorithms present several exciting opportunities for significantly advancing the current state of the art in computational fluid dynamics. Such methods provide an extremely efficient means for obtaining discretely consistent sensitivity information for hundreds of design variables, opening the door to rigorous, automated design optimization of complex aerospace configurations using the Navier-Stokes equations. Moreover, the discrete adjoint formulation provides a mathematically rigorous foundation for mesh adaptation and systematic reduction of spatial discretization error. Error estimates are also an inherent by-product of an adjoint-based approach, valuable information that is virtually non-existent in today's large-scale CFD simulations. An overview of the adjoint-based algorithm work at NASA Langley Research Center is presented, with examples demonstrating the potential impact on complex computational problems related to design optimization as well as mesh adaptation.
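The efficiency claim above can be illustrated on a small linear model problem: for an objective J(p) = c^T u with state equation A u = b(p), a single adjoint solve A^T λ = c yields the sensitivities dJ/dp_i = λ^T ∂b/∂p_i for all parameters at once. The sketch below is a hypothetical toy problem, not the NASA Langley implementation; it verifies the adjoint gradient against finite differences:

```python
import numpy as np

rng = np.random.default_rng(2)

n, n_params = 6, 4
A = rng.normal(size=(n, n)) + n * np.eye(n)    # well-conditioned state operator
c = rng.normal(size=n)                         # objective weights: J = c^T u
B = rng.normal(size=(n, n_params))             # forcing b(p) = B @ p (linear, for simplicity)

def J(p):
    """Forward problem: solve A u = b(p), then evaluate the objective."""
    u = np.linalg.solve(A, B @ p)
    return c @ u

p = rng.normal(size=n_params)

# Adjoint: ONE extra linear solve gives the whole gradient.
lam = np.linalg.solve(A.T, c)                  # A^T lambda = c
grad_adjoint = B.T @ lam                       # dJ/dp_i = lambda^T (db/dp_i)

# Finite-difference check: one solve PER parameter (the expensive route).
eps = 1e-6
grad_fd = np.array([
    (J(p + eps * np.eye(n_params)[i]) - J(p - eps * np.eye(n_params)[i])) / (2 * eps)
    for i in range(n_params)
])
print(np.max(np.abs(grad_adjoint - grad_fd)))  # agreement to roundoff
```

The cost asymmetry is the whole point: the adjoint route stays at one extra solve as `n_params` grows to hundreds of design variables.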

  16. Extremal Optimization: Methods Derived from Co-Evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boettcher, S.; Percus, A.G.

    1999-07-13

    We describe a general-purpose method for finding high-quality solutions to hard optimization problems, inspired by self-organized critical models of co-evolution such as the Bak-Sneppen model. The method, called Extremal Optimization, successively eliminates extremely undesirable components of sub-optimal solutions, rather than ''breeding'' better components. In contrast to Genetic Algorithms which operate on an entire ''gene-pool'' of possible solutions, Extremal Optimization improves on a single candidate solution by treating each of its components as species co-evolving according to Darwinian principles. Unlike Simulated Annealing, its non-equilibrium approach effects an algorithm requiring few parameters to tune. With only one adjustable parameter, its performance proves competitive with, and often superior to, more elaborate stochastic optimization procedures. We demonstrate it here on two classic hard optimization problems: graph partitioning and the traveling salesman problem.
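The procedure described above can be sketched for balanced graph bipartitioning. This is an illustrative tau-EO implementation under assumed details (vertex fitness taken as the fraction of its edges kept within its own side, swap moves to preserve balance, rank selection with probability proportional to k^(-tau)), not the authors' tuned code:

```python
import random

random.seed(0)

def eo_partition(edges, n, tau=1.4, steps=20_000):
    """tau-EO sketch for balanced bipartitioning; returns best cut found."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = [i % 2 for i in range(n)]            # balanced starting partition

    def cut():
        return sum(side[u] != side[v] for u, v in edges)

    best = cut()
    # Rank-selection probabilities P(k) ~ k^(-tau), k = 1 is the worst vertex.
    weights = [k ** -tau for k in range(1, n + 1)]
    for _ in range(steps):
        # Fitness of vertex i: fraction of its edges kept inside its side.
        fit = sorted(
            (sum(side[j] == side[i] for j in adj[i]) / len(adj[i])
             if adj[i] else 1.0, i)
            for i in range(n)
        )                                       # worst (lowest fitness) first
        k = random.choices(range(n), weights=weights)[0]
        i = fit[k][1]                           # the (k+1)-th worst vertex
        # Unconditionally swap with a random vertex on the other side,
        # preserving balance; no acceptance criterion, no cooling schedule.
        j = random.choice([v for v in range(n) if side[v] != side[i]])
        side[i], side[j] = side[j], side[i]
        best = min(best, cut())
    return best

# Two 8-cliques joined by a single edge; the optimal balanced cut is 1.
n = 16
edges = [(u, v) for u in range(8) for v in range(u + 1, 8)]
edges += [(u, v) for u in range(8, 16) for v in range(u + 1, 16)]
edges.append((0, 8))
best = eo_partition(edges, n)
print("best cut found:", best)
```

Note the contrast with annealing that the abstract draws: the walk never settles into an accept/reject equilibrium, and the single parameter tau controls how greedily the worst components are targeted.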

  17. Application of redundancy in the Saturn 5 guidance and control system

    NASA Technical Reports Server (NTRS)

    Moore, F. B.; White, J. B.

    1976-01-01

    The Saturn launch vehicle's guidance and control system is so complex that the reliability of a simplex system is not adequate to fulfill mission requirements. Thus, to achieve the desired reliability, redundancy encompassing a wide range of types and levels was employed. At one extreme, the lowest level, basic components (resistors, capacitors, relays, etc.) are employed in series, parallel, or quadruplex arrangements to insure continued system operation in the presence of possible failure conditions. At the other extreme, the highest level, complete subsystem duplication is provided so that a backup subsystem can be employed in case the primary system malfunctions. In between these two extremes, many other redundancy schemes and techniques are employed at various levels. Basic redundancy concepts are covered to gain insight into the advantages obtained with various techniques. Points and methods of application of these techniques are included. The theoretical gain in reliability resulting from redundancy is assessed and compared to a simplex system. Problems and limitations encountered in the practical application of redundancy are discussed as well as techniques verifying proper operation of the redundant channels. As background for the redundancy application discussion, a basic description of the guidance and control system is included.

  18. The Extreme Mechanics of Soft Structures

    NASA Astrophysics Data System (ADS)

    Reis, Pedro

    2015-03-01

    I will present a series of experimental investigations of the rich behavior of soft mechanical structures, which, similarly to soft materials, can undergo large deformations under a variety of loading conditions. Soft structures typically comprise slender elements that can readily undergo mechanical instabilities to achieve extreme flexibility and reversible reconfigurations. This field has come to be warmly known as `Extreme Mechanics', where one of the fundamental challenges lies in rationalizing the geometric nonlinearities that arise in the post-buckling regime. I shall focus on problems involving thin elastic rods and shells, through examples ranging from the deployment of submarine cables onto the seabed, locomotion of uniflagellar bacteria, and crystallography of curved wrinkling and its usage for active aerodynamic drag reduction. The main common feature underlying this series of studies is the prominence of geometry, and its interplay with mechanics, in dictating complex mechanical behavior that is relevant and applicable over a wide range of length scales. Moreover, our findings suggest that we rethink our relationship with mechanical instabilities, which, rather than modes of failure, can be embraced as opportunities for functionality that are scalable, reversible, and robust. The author acknowledges financial support from the National Science Foundation, CMMI-1351449 (CAREER).

  19. Learning Problems in Kindergarten Students with Extremely Preterm Birth

    PubMed Central

    Taylor, H. Gerry; Klein, Nancy; Anselmo, Marcia G.; Minich, Nori; Espy, Kimberly A.; Hack, Maureen

    2012-01-01

    Objective To assess learning problems in extremely preterm children in kindergarten and identify risk factors. Design Cohort study. Setting Children’s hospital. Participants A cohort of extremely preterm children born January 2001 – December 2003 (n=148), defined as <28 weeks gestation and/or <1000 g birth weight, and term-born normal birth weight classmate controls (n=111). Main Interventions The children were enrolled during their first year in kindergarten and assessed on measures of learning progress. Main Outcome Measures Achievement testing, teacher ratings of learning progress, and individual educational assistance. Results The extremely preterm children had lower mean standard scores than controls on tests of spelling (8.52 points, 95% CI: 4.58, 12.46) and applied mathematics (11.02 points, 95% CI: 6.76, 15.28). They also had higher rates of substandard learning progress by teacher report in written language (OR = 4.23, 95% CI: 2.32, 7.73) and mathematics (OR = 7.08, 95% CI: 2.79, 17.95). Group differences on mathematics achievement and in teacher ratings of learning progress were significant even in children without neurosensory deficits or low global cognitive ability. Neonatal risk factors, early childhood neurodevelopmental impairment, and socioeconomic status predicted learning problems in extremely preterm children, yet many of the children with problems were not in a special education program. Conclusion Learning problems in extremely preterm children are evident in kindergarten and are associated with neonatal and early childhood risk factors. The findings support efforts to provide more extensive monitoring and interventions both prior to and during the first year in school. PMID:21893648

  20. Scheduling Future Water Supply Investments Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2014-12-01

    Uncertain hydrological impacts of climate change, population growth, and institutional changes pose a major challenge to the planning of water supply systems. Planners seek optimal portfolios of supply and demand management schemes, but must also decide when to activate assets while considering many system goals and plausible futures. Incorporating scheduling into the planning-under-uncertainty problem strongly increases its complexity. We investigate some approaches to scheduling with many-objective heuristic search. We apply a multi-scenario many-objective scheduling approach to the Thames River basin water supply system planning problem in the UK. Decisions include which new supply and demand schemes to implement, at what capacity, and when. The impact of different system uncertainties on scheme implementation schedules is explored, i.e., how the choice of future scenarios affects the search process and its outcomes. The activation of schemes is influenced by the occurrence of extreme hydrological events in the ensemble of plausible scenarios, among other factors. The approach and results are compared with a previous study in which only the portfolio problem was addressed (without scheduling).

  1. Application of powder densification models to the consolidation processing of composites

    NASA Technical Reports Server (NTRS)

    Wadley, H. N. G.; Elzey, D. M.

    1991-01-01

    Unidirectional fiber-reinforced metal matrix composite tapes (containing a single layer of parallel fibers) can now be produced by plasma deposition. These tapes can be stacked and subjected to a thermomechanical treatment that results in a fully dense, near net shape component. The mechanisms by which this consolidation step occurs are explored, and models to predict the effect of different thermomechanical conditions (during consolidation) upon the kinetics of densification are developed. The approach is based upon a methodology developed by Ashby and others for the simpler problem of HIP of spherical powders. The complex problem is divided into six much simpler subproblems, and their predicted contributions to densification are then added. The initial problem decomposition is to treat the two extreme geometries encountered: contact deformation occurring between foils and shrinkage of isolated, internal pores. Deformation of these two geometries is modelled for plastic, power law creep and diffusional flow. The results are reported in the form of a densification map.

  2. A prediction model to forecast the cost impact from a break in the production schedule

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1977-01-01

    The losses experienced after a break or stoppage in the sequence of a production cycle present an extremely complex situation involving numerous variables, some of uncertain quantity and quality. There are no discrete formulas to define the losses during a gap in production. The techniques employed are therefore related to a prediction or forecast of the losses that take place, based on the conditions that exist in the production environment. Such parameters as learning curve slope, number of predecessor units, and length of time the production sequence is halted are utilized in formulating a prediction model. The pertinent current publications related to this subject are few in number, but are reviewed to provide an understanding of the problem. Example problems are illustrated together with appropriate trend curves to show the approach. Solved problems are also given to show the application of the models to actual production breaks in the real world.

  3. Evaluating the performance of parallel subsurface simulators: An illustrative example with PFLOTRAN

    PubMed Central

    Hammond, G E; Lichtner, P C; Mills, R T

    2014-01-01

    To better inform the subsurface scientist on the expected performance of parallel simulators, this work investigates performance of the reactive multiphase flow and multicomponent biogeochemical transport code PFLOTRAN as it is applied to several realistic modeling scenarios run on the Jaguar supercomputer. After a brief introduction to the code's parallel layout and code design, PFLOTRAN's parallel performance (measured through strong and weak scalability analyses) is evaluated in the context of conceptual model layout, software and algorithmic design, and known hardware limitations. PFLOTRAN scales well (with regard to strong scaling) for three realistic problem scenarios: (1) in situ leaching of copper from a mineral ore deposit within a 5-spot flow regime, (2) transient flow and solute transport within a regional doublet, and (3) a real-world problem involving uranium surface complexation within a heterogeneous and extremely dynamic variably saturated flow field. Weak scalability is discussed in detail for the regional doublet problem, and several difficulties with its interpretation are noted. PMID:25506097

  4. Evaluating the performance of parallel subsurface simulators: An illustrative example with PFLOTRAN.

    PubMed

    Hammond, G E; Lichtner, P C; Mills, R T

    2014-01-01

    To better inform the subsurface scientist on the expected performance of parallel simulators, this work investigates performance of the reactive multiphase flow and multicomponent biogeochemical transport code PFLOTRAN as it is applied to several realistic modeling scenarios run on the Jaguar supercomputer. After a brief introduction to the code's parallel layout and code design, PFLOTRAN's parallel performance (measured through strong and weak scalability analyses) is evaluated in the context of conceptual model layout, software and algorithmic design, and known hardware limitations. PFLOTRAN scales well (with regard to strong scaling) for three realistic problem scenarios: (1) in situ leaching of copper from a mineral ore deposit within a 5-spot flow regime, (2) transient flow and solute transport within a regional doublet, and (3) a real-world problem involving uranium surface complexation within a heterogeneous and extremely dynamic variably saturated flow field. Weak scalability is discussed in detail for the regional doublet problem, and several difficulties with its interpretation are noted.

  5. Expert systems for superalloy studies

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kaukler, William F.

    1990-01-01

    There are many areas in science and engineering which require knowledge of an extremely complex foundation of experimental results in order to design methodologies for developing new materials or products. Superalloys fit well into this discussion in the sense that they are complex combinations of elements which exhibit certain characteristics. Obviously the use of superalloys in high-performance, high-temperature systems such as the Space Shuttle Main Engine is of interest to NASA. The superalloy manufacturing process is complex, and the implementation of an expert system within the design process requires some thought as to how and where it should be implemented. A major motivation is to develop a methodology to assist metallurgists in the design of superalloy materials using current expert systems technology. Hydrogen embrittlement is disastrous to rocket engines, and the heuristics can be very complex. Attacking this problem as one module in the overall design process represents a significant step forward. In order to describe the objectives of the first phase implementation, the expert system was designated the Hydrogen Environment Embrittlement Expert System (HEEES).

  6. Detection of time delays and directional interactions based on time series from complex dynamical systems

    NASA Astrophysics Data System (ADS)

    Ma, Huanfei; Leng, Siyang; Tao, Chenyang; Ying, Xiong; Kurths, Jürgen; Lai, Ying-Cheng; Lin, Wei

    2017-07-01

    Data-based and model-free accurate identification of intrinsic time delays and directional interactions is an extremely challenging problem in complex dynamical systems and their networks reconstruction. A model-free method with new scores is proposed to be generally capable of detecting single, multiple, and distributed time delays. The method is applicable not only to mutually interacting dynamical variables but also to self-interacting variables in a time-delayed feedback loop. Validation of the method is carried out using physical, biological, and ecological models and real data sets. Especially, applying the method to air pollution data and hospital admission records of cardiovascular diseases in Hong Kong reveals the major air pollutants as a cause of the diseases and, more importantly, it uncovers a hidden time delay (about 30-40 days) in the causal influence that previous studies failed to detect. The proposed method is expected to be universally applicable to ascertaining and quantifying subtle interactions (e.g., causation) in complex systems arising from a broad range of disciplines.
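As a point of comparison for the scores described above, the classical way to recover a single fixed delay between two coupled series is to scan the cross-correlation over candidate lags. The sketch below uses synthetic data and this standard baseline, not the paper's model-free scoring method; it recovers a known 7-step delay:

```python
import numpy as np

rng = np.random.default_rng(3)

n, true_delay = 2000, 7
x = rng.normal(size=n)
y = np.roll(x, true_delay) + 0.2 * rng.normal(size=n)   # y lags x by 7 steps

def estimate_delay(x, y, max_lag=50):
    """Return the lag of y relative to x that maximizes the
    Pearson correlation between the overlapping segments."""
    n = len(x)
    def corr(lag):
        if lag >= 0:
            a, b = x[:n - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:n + lag]
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr)

lag = estimate_delay(x, y)
print(lag)   # recovers the 7-step delay
```

This baseline fails for multiple or distributed delays and says nothing about causal direction in feedback loops, which is precisely where the model-free scores proposed in the paper are aimed.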

  7. Model Checking - My 27-Year Quest to Overcome the State Explosion Problem

    NASA Technical Reports Server (NTRS)

    Clarke, Ed

    2009-01-01

    Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state explosion problem.
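The core loop of an explicit-state safety checker, including the counterexample reconstruction described above, fits in a short sketch. The two-process protocol below is a hypothetical toy (deliberately missing its mutual-exclusion guard); the visited set is exactly the structure that grows combinatorially in the state-explosion problem:

```python
from collections import deque

def check_safety(init, transitions, is_bad):
    """Breadth-first search over a finite state-transition system.
    Returns None if no reachable state satisfies is_bad, otherwise
    the shortest counterexample trace init -> ... -> bad state."""
    parent = {init: None}          # visited set doubling as trace links
    queue = deque([init])
    while queue:
        s = queue.popleft()
        if is_bad(s):
            trace = []
            while s is not None:   # walk parent links back to init
                trace.append(s)
                s = parent[s]
            return list(reversed(trace))
        for t in transitions(s):
            if t not in parent:
                parent[t] = s
                queue.append(t)
    return None

# Toy system: each process cycles idle -> trying -> critical -> idle,
# with NO guard checking the other process (the bug to be found).
def transitions(state):
    step = {"idle": "trying", "trying": "critical", "critical": "idle"}
    p1, p2 = state
    return [(step[p1], p2), (p1, step[p2])]

bad = lambda s: s == ("critical", "critical")   # mutual exclusion violated
trace = check_safety(("idle", "idle"), transitions, bad)
print(trace)   # shortest execution reaching the violation
```

Adding a third process multiplies the reachable state space by another factor of three; symbolic representations and abstraction, the techniques the talk surveys, exist to tame exactly that growth.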

  8. Mental health and social competencies of 10- to 12-year-old children born at 23 to 25 weeks of gestation in the 1990s: a Swedish national prospective follow-up study.

    PubMed

    Farooqi, Aijaz; Hägglöf, Bruno; Sedin, Gunnar; Gothefors, Leif; Serenius, Fredrik

    2007-07-01

    We investigated a national cohort of extremely immature children with respect to behavioral and emotional problems and social competencies, from the perspectives of parents, teachers, and children themselves. We examined 11-year-old children who were born before 26 completed weeks of gestation in Sweden between 1990 and 1992. All had been evaluated at a corrected age of 36 months. At 11 years of age, 86 of 89 survivors were studied and compared with an equal number of control subjects, matched with respect to age and gender. Behavioral and emotional problems, social competencies, and adaptive functioning at school were evaluated with standardized, well-validated instruments, including parent and teacher report questionnaires and a child self-report, administered by mail. Compared with control subjects, parents of extremely immature children reported significantly more problems with internalizing behaviors (anxiety/depression, withdrawn, and somatic problems) and attention, thought, and social problems. Teachers reported a similar pattern. Reports from children showed a trend toward increased depression symptoms compared with control subjects. Multivariate analysis of covariance of parent-reported behavioral problems revealed no interactions, but significant main effects emerged for group status (extremely immature versus control), family function, social risk, and presence of a chronic medical condition, with all effect sizes being medium and accounting for 8% to 12% of the variance. Multivariate analysis of covariance of teacher-reported behavioral problems showed significant effects for group status and gender but not for the covariates mentioned above. According to the teachers' ratings, extremely immature children were less well adjusted to the school environment than were control subjects. However, a majority of extremely immature children (85%) were functioning in mainstream schools without major adjustment problems. 
Despite favorable outcomes for many children born at the limit of viability, these children are at risk for mental health problems, with poorer school results.

  9. A compliant mechanism for inspecting extremely confined spaces

    NASA Astrophysics Data System (ADS)

    Mascareñas, David; Moreu, Fernando; Cantu, Precious; Shields, Daniel; Wadden, Jack; El Hadedy, Mohamed; Farrar, Charles

    2017-11-01

    We present a novel, compliant mechanism that provides the capability to navigate extremely confined spaces for the purpose of infrastructure inspection. Extremely confined spaces are commonly encountered during infrastructure inspection. Examples of such spaces can include pipes, conduits, and ventilation ducts. Often these infrastructure features go uninspected simply because there is no viable way to access their interior. In addition, it is not uncommon for extremely confined spaces to possess a maze-like architecture that must be selectively navigated in order to properly perform an inspection. Efforts by the imaging sensor community have resulted in the development of imaging sensors on the millimeter length scale. Due to their compact size, they are able to inspect many extremely confined spaces of interest; however, the means to deliver these sensors to the proper location to obtain the desired images are lacking. To address this problem, we draw inspiration from the field of endoscopic surgery. Specifically, we consider the work that has already been done to create long flexible needles that are capable of being steered through the human body. These devices are typically referred to as ‘steerable needles.’ Steerable needle technology is not directly applicable to the problem of navigating maze-like arrangements of extremely confined spaces, but it does provide guidance on how this problem should be approached. Specifically, the super-elastic nitinol tubing material that allows steerable needles to operate is also appropriate for the problem of navigating maze-like arrangements of extremely confined spaces. Furthermore, the portion of the mechanism that enters the extremely confined space is completely mechanical in nature. The mechanical nature of the device is an advantage when the extremely confined space features environmental hazards such as radiation that could degrade an electromechanically operated mechanism. 
Here, we present a compliant mechanism developed to navigate maze-like arrangements of extremely confined spaces. The mechanism is shown to be able to selectively navigate past three 90° bends. The ability to selectively navigate extremely confined spaces opens up new possibilities to use emerging miniature imaging technology for infrastructure inspection.

  10. Nonparametric Regression Subject to a Given Number of Local Extreme Value

    DTIC Science & Technology

    2001-07-01

    Fragment of a compilation report (ADP013708 thru ADP013761, UNCLASSIFIED): Nonparametric regression subject to a given number of local extreme values, by A. Majidi and L. Davies. The recoverable text concerns supplying the locations of the local extremes to the smoothing algorithm and stating the smoothing problem precisely as a quadratic program (QP3); the remainder of the fragment is figure-caption residue.

  11. On the dimension of complex responses in nonlinear structural vibrations

    NASA Astrophysics Data System (ADS)

    Wiebe, R.; Spottswood, S. M.

    2016-07-01

    The ability to accurately model engineering systems under extreme dynamic loads would prove a major breakthrough in many aspects of aerospace, mechanical, and civil engineering. Extreme loads frequently induce both nonlinearities and coupling which increase the complexity of the response and the computational cost of finite element models. Dimension reduction has recently gained traction and promises the ability to distill dynamic responses down to a minimal dimension without sacrificing accuracy. In this context, the dimensionality of a response is related to the number of modes needed in a reduced order model to accurately simulate the response. Thus, an important step is characterizing the dimensionality of complex nonlinear responses of structures. In this work, the dimensionality of the nonlinear response of a post-buckled beam is investigated. Significant detail is dedicated to carefully introducing the experiment, the verification of a finite element model, and the dimensionality estimation algorithm, as it is hoped that this system may help serve as a benchmark test case. It is shown that with minor modifications, the method of false nearest neighbors can quantitatively distinguish between the response dimension of various snap-through, non-snap-through, random, and deterministic loads. The state-space dimension of the nonlinear system in question increased from 2 to 10 as the system response moved from simple, low-level harmonic to chaotic snap-through. Beyond the problem studied herein, the techniques developed will serve as a prescriptive guide in developing fast and accurate dimensionally reduced models of nonlinear systems, and eventually as a tool for adaptive dimension-reduction in numerical modeling. 
The results are especially relevant in the aerospace industry for the design of thin structures such as beams, panels, and shells, which are all capable of spatio-temporally complex dynamic responses that are difficult and computationally expensive to model.
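    The method of false nearest neighbors mentioned above can be sketched in its textbook form: a delay embedding of dimension d is judged sufficient when adding a (d+1)-th coordinate no longer separates apparent neighbors. The modifications the authors made for structural responses are not reproduced in this minimal sketch; the threshold value and test signal below are illustrative choices.

```python
import numpy as np

def false_nearest_fraction(x, dim, tau=1, rtol=10.0):
    """Fraction of false nearest neighbors in a dim-dimensional
    delay embedding of the scalar series x.

    Textbook form of the test: a neighbor is "false" if the extra
    coordinate gained in dimension dim+1 moves it apart by more than
    rtol times its distance in dimension dim.
    """
    n = len(x) - dim * tau                 # points that still have a future sample
    emb = np.column_stack([x[k * tau : k * tau + n] for k in range(dim)])
    false = 0
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                      # exclude the point itself
        j = int(np.argmin(d))
        extra = abs(x[i + dim * tau] - x[j + dim * tau])
        if extra / max(d[j], 1e-12) > rtol:
            false += 1
    return false / n

# A clean sine needs only a 2-dimensional embedding, so the fraction of
# false neighbors should collapse between dim=1 and dim=2.
x = np.sin(0.3 * np.arange(400))
f1 = false_nearest_fraction(x, 1)
f2 = false_nearest_fraction(x, 2)
```

    The dimension estimate is the smallest embedding dimension at which the false-neighbor fraction stays near zero, which is how a 2-to-10 growth in dimension with increasing response complexity can be quantified.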

  12. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.

  13. Risk factors, management and primary prevention of thrombotic complications related to the use of central venous catheters.

    PubMed

    Linnemann, Birgit; Lindhoff-Last, Edelgard

    2012-09-01

    An adequate vascular access is of importance for the treatment of patients with cancer and complex illnesses in the intensive, perioperative or palliative care setting. Deep vein thrombosis and thrombotic occlusion are the most common complications attributed to central venous catheters in short-term and, especially, in long-term use. In this review we will focus on the risk factors, management and prevention strategies of catheter-related thrombosis and occlusion. Due to the lack of randomised controlled trials, there is still controversy about the optimal treatment of catheter-related thrombotic complications, and therapy has been widely adopted using the evidence concerning lower extremity deep vein thrombosis. Given the increasing use of central venous catheters in patients that require long-term intravenous therapy, the problem of upper extremity deep venous thrombosis can be expected to increase in the future. We provide data for establishing a more uniform strategy for preventing, diagnosing and treating catheter-related thrombotic complications.

  14. Species interactions of the alewife in the Great Lakes

    USGS Publications Warehouse

    Smith, Stanford H.

    1970-01-01

    The alewife (Alosa pseudoharengus) has caused serious problems in the Great Lakes for almost 100 years. It entered Lake Ontario in abundance via the Erie Canal during the 1860's when major piscivores were declining, and became the dominant species in the lake during the 1870's. The alewife subsequently spread throughout the Great Lakes and became the dominant species in Lakes Huron and Michigan as major piscivores declined. In lakes where it became extremely abundant, the shallow-water planktivores declined in the first decade after alewife establishment, the minor piscivores increased then declined in the second decade, and the deep-water planktivores declined in the third decade. The consequence has been a general reduction in fishery productivity. Rehabilitation will require extreme reduction of the alewife, and restoration of an interacting complex of deep- and shallow-water forage species, and minor and major piscivores, either by reestablishing species affected by the alewife, or by the introduction of new species that can thrive under the new ecological conditions of the lakes.

  15. Countering the Pedagogy of Extremism: Reflective Narratives and Critiques of Problem-Based Learning

    ERIC Educational Resources Information Center

    Woo, Chris W. H.; Laxman, Kumar

    2013-01-01

    This paper is a critique against "purist" pedagogies found in the literature of student-centred learning. The article reproves extremism in education and questions the absolutism and teleological truths expounded in exclusive problem-based learning. The paper articulates the framework of a unifying pedagogical practice through Eve…

  16. A simple encoding method for Sigma-Delta ADC based biopotential acquisition systems.

    PubMed

    Guerrero, Federico N; Spinelli, Enrique M

    2017-10-01

    Sigma-Delta analogue-to-digital converters allow acquiring the full dynamic range of biomedical signals at the electrodes, resulting in less complex hardware and increased measurement robustness. However, the increased data size per sample (typically 24 bits) demands the transmission of extremely large volumes of data across the isolation barrier, thus increasing power consumption on the patient side. This problem is accentuated when a large number of channels is used, as in current 128-256-electrode biopotential acquisition systems, which usually opt for an optic fibre link to the computer. An analogous problem occurs for simpler low-power acquisition platforms that transmit data through a wireless link to a computing platform. In this paper, a low-complexity encoding method is presented that decreases sample data size without losses, while preserving the full DC-coupled signal. The method achieved a 2.3 average compression ratio evaluated over an ECG and EMG signal bank acquired with equipment based on Sigma-Delta converters. It demands a very low processing load: a C language implementation is presented that achieved an average execution time of 110 clock cycles on an 8-bit microcontroller.
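    The abstract does not specify the encoding itself. The following is a hypothetical sketch of the general kind of low-complexity lossless scheme it describes, exploiting the small sample-to-sample differences of an oversampled DC-coupled signal; the tag layout and thresholds are invented for illustration and are not the authors' method.

```python
def delta_encode(samples):
    """Losslessly pack signed 24-bit samples as first differences.

    Invented tag layout: 1 byte for small deltas (tag 0 + 7-bit value),
    2 bytes for medium deltas (tag 10 + 14-bit value), and 4 bytes for
    a raw 24-bit sample (tag 11) when the delta is large.
    """
    out = bytearray()
    prev = 0
    for s in samples:
        d = s - prev
        prev = s
        if -64 <= d < 64:
            out.append(d & 0x7F)
        elif -8192 <= d < 8192:
            out.append(0x80 | ((d >> 8) & 0x3F))
            out.append(d & 0xFF)
        else:
            out.append(0xC0)
            out += s.to_bytes(3, "big", signed=True)
    return bytes(out)

def delta_decode(data):
    """Inverse of delta_encode: recover the exact sample values."""
    samples, prev, i = [], 0, 0
    while i < len(data):
        b = data[i]
        if b & 0x80 == 0:                          # 1-byte delta
            prev += b if b < 64 else b - 128
            i += 1
        elif b & 0x40 == 0:                        # 2-byte delta
            d = ((b & 0x3F) << 8) | data[i + 1]
            prev += d - 16384 if d >= 8192 else d
            i += 2
        else:                                      # raw 24-bit sample
            prev = int.from_bytes(data[i + 1 : i + 4], "big", signed=True)
            i += 4
        samples.append(prev)
    return samples

# Mostly-small steps with occasional large jumps, as in a DC-coupled biosignal.
samples = [0, 5, -3, 100, 20000, 20010, -8000000]
encoded = delta_encode(samples)
decoded = delta_decode(encoded)
```

    Because most deltas fit in one byte instead of three, a scheme of this shape compresses losslessly while keeping per-sample work to a few comparisons and shifts, the kind of load an 8-bit microcontroller can sustain.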

  17. From phase transitions to the topological renaissance. Comment on "Topodynamics of metastable brains" by Arturo Tozzi et al.

    NASA Astrophysics Data System (ADS)

    Somogyvári, Zoltán; Érdi, Péter

    2017-07-01

    The neural topodynamics theory of Tozzi et al. [13] has two main foci: metastable brain dynamics and the topological approach based on the Borsuk-Ulam theorem (BUT). Briefly, metastable brain dynamics theory hypothesizes that the temporary stable synchronization and desynchronization of a large number of individual dynamical systems, formed by local neural circuits, are responsible for the coding of complex concepts in the brain, and that sudden changes of these synchronization patterns correspond to operational steps. But what dynamical network could form the substrate for this metastable dynamics, capable of entering into a combinatorially high number of metastable synchronization patterns and of exhibiting rapid transient changes between them? The general problem is related to the discrimination between "Black Swans" and "Dragon Kings". While Black Swans are related to the theory of self-organized criticality, which suggests that high-impact extreme events are unpredictable, Dragon Kings are associated with the occurrence of a phase transition whose emergent organization is based on intermittent criticality [9]. Widening the limits of predictability is one of the big open problems in the theory and practice of complex systems (Sect. 9.3 of Érdi [2]).

  18. Planet Formation in Binaries

    NASA Astrophysics Data System (ADS)

    Thebault, P.; Haghighipour, N.

    Spurred by the discovery of numerous exoplanets in multiple systems, binaries have become in recent years one of the main topics in planet formation research. Numerous studies have investigated to what extent the presence of a stellar companion can affect the planet formation process. Such studies have implications that can reach beyond the sole context of binaries, as they allow testing certain aspects of the planet formation scenario by submitting them to extreme environments. We review here the current understanding of this complex problem. We show in particular how each of the different stages of the planet-formation process is affected differently by binary perturbations. We focus especially on the intermediate stage of kilometre-sized planetesimal accretion, which has proven to be the most sensitive to binarity and for which the presence of some exoplanets observed in tight binaries is difficult to explain by in-situ formation following the "standard" planet-formation scenario. Some tentative solutions to this apparent paradox are presented. The last part of our review presents a thorough description of the problem of planet habitability, for which the binary environment creates a complex situation because of the presence of two irradiation sources of varying distance.

  19. Additional adjoint Monte Carlo studies of the shielding of concrete structures against initial gamma radiation. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beer, M.; Cohen, M.O.

    1975-02-01

    The adjoint Monte Carlo method previously developed by MAGI has been applied to the calculation of initial radiation dose due to air secondary gamma rays and fission product gamma rays at detector points within buildings for a wide variety of problems. These provide an in-depth survey of structure shielding effects as well as many new benchmark problems for matching by simplified models. Specifically, elevated ring source results were obtained in the following areas: doses at on- and off-centerline detectors in four concrete blockhouse structures; doses at detector positions along the centerline of a high-rise structure without walls; dose mapping at basement detector positions in the high-rise structure; doses at detector points within a complex concrete structure containing exterior windows and walls and interior partitions; modeling of the complex structure by replacing interior partitions by additional material at exterior walls; effects of elevation angle changes; effects on the dose of changes in fission product ambient spectra; and modeling of mutual shielding due to external structures. In addition, point source results yielding dose extremes about the ring source average were obtained. (auth)

  20. Two problems in multiphase biological flows: Blood flow and particulate transport in microvascular network, and pseudopod-driven motility of amoeboid cells

    NASA Astrophysics Data System (ADS)

    Bagchi, Prosenjit

    2016-11-01

    In this talk, two problems in multiphase biological flows will be discussed. The first is the direct numerical simulation of whole blood and drug particulates in microvascular networks. Blood in microcirculation behaves as a dense suspension of heterogeneous cells. The erythrocytes are extremely deformable, while inactivated platelets and leukocytes are nearly rigid. A significant progress has been made in recent years in modeling blood as a dense cellular suspension. However, many of these studies considered the blood flow in simple geometry, e.g., straight tubes of uniform cross-section. In contrast, the architecture of a microvascular network is very complex with bifurcating, merging and winding vessels, posing a further challenge to numerical modeling. We have developed an immersed-boundary-based method that can consider blood cell flow in physiologically realistic and complex microvascular network. In addition to addressing many physiological issues related to network hemodynamics, this tool can be used to optimize the transport properties of drug particulates for effective organ-specific delivery. Our second problem is pseudopod-driven motility as often observed in metastatic cancer cells and other amoeboid cells. We have developed a multiscale hydrodynamic model to simulate such motility. We study the effect of cell stiffness on motility as the former has been considered as a biomarker for metastatic potential. Funded by the National Science Foundation.

  1. Advanced Computational Aeroacoustics Methods for Fan Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane (Technical Monitor); Tam, Christopher

    2003-01-01

    Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence the intensity of the radiated noise. The precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry. One way is to use unstructured grids. The other is to use body-fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance. For acoustic computation, it is not clear that the currently used data transfer methods are sufficiently accurate so as not to contaminate the very small amplitude acoustic disturbances. In CFD, low order schemes are, invariably, used in conjunction with unstructured grids. However, low order schemes are known to be numerically dispersive and dissipative. Dispersive and dissipative errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high order unstructured grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A DRP scheme on an unstructured grid has been developed; it is constructed in the wave number space. The characteristics of the scheme can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated, and can be improved by adopting an upwinding strategy.

  2. Patterns of precipitation and soil moisture extremes in Texas, US: A complex network analysis

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Xia, Youlong; Caldwell, Todd G.; Hao, Zengchao

    2018-02-01

    Understanding of the spatial and temporal dynamics of extreme precipitation not only improves prediction skills, but also helps to prioritize hazard mitigation efforts. This study seeks to enhance the understanding of spatiotemporal covariation patterns embedded in precipitation (P) and soil moisture (SM) by using an event-based, complex-network-theoretic approach. Event concurrences are quantified using a nonparametric event synchronization measure, and spatial patterns of hydroclimate variables are analyzed by using several network measures and a community detection algorithm. SM-P coupling is examined using a directional event coincidence analysis measure that takes the order of event occurrences into account. The complex network approach is demonstrated for Texas, US, a region possessing a rich set of hydroclimate features and frequented by catastrophic flooding. Gridded daily observed P data and simulated SM data are used to create complex networks of P and SM extremes. The uncovered high-degree-centrality regions and community structures are qualitatively in agreement with the overall existing knowledge of hydroclimate extremes in the study region. Our analyses provide new visual insights on the propagation, connectivity, and synchronicity of P extremes, as well as the SM-P coupling, in this flood-prone region, and can be readily used as a basis for event-driven predictive analytics for other regions.
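    The directional event coincidence idea named above can be illustrated with a simplified fixed-window coincidence rate; the study itself uses a nonparametric synchronization measure with adaptive windows, which this sketch does not reproduce, and the event times below are invented.

```python
import numpy as np

def event_coincidence(t_a, t_b, window):
    """Fraction of events in series A followed within `window` time
    units by at least one event in series B (a precursor-style rate).

    A simplified, fixed-window stand-in for the adaptive nonparametric
    synchronization measure used in the study; the asymmetry between
    A and B is what makes the measure directional.
    """
    t_b = np.asarray(t_b, dtype=float)
    hits = sum(bool(np.any((t_b >= t) & (t_b <= t + window))) for t in t_a)
    return hits / len(t_a)

# Hypothetical event times (e.g., days with extreme P and extreme SM).
rate = event_coincidence([10, 50, 90], [12, 52, 200], window=5)
```

    Computing this rate for every pair of grid cells and thresholding it yields the adjacency matrix of an event network, on which degree centrality and community detection can then be applied.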

  3. [A method for reproducing amnesia in mice by the complex extremal exposure].

    PubMed

    Iasnetsov, V V; Provornova, N A

    2003-01-01

    A method is proposed for reproducing retrograde amnesia in mice by means of a complex extremal action: an exhausting swim in cold water with simultaneous wheel rotation. It was found that nootropics such as piracetam, mexidol, semax, nooglutil, acephen, and noopept fully or partially prevented the development of amnesia.

  4. A computational model of the human hand 93-ERI-053

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Axelrod, T.

    1996-03-01

    The objectives of the Computational Hand Modeling project were to prove the feasibility of applying the Laboratory's NIKE3D finite element code to orthopaedic problems. Because of the great complexity of anatomical structures and the nonlinearity of their behavior, we have focused on a subset of joints of the hand and lower extremity and have developed algorithms to model their behavior. The algorithms developed here solve fundamental problems in computational biomechanics and can be expanded to describe any other joints of the human body. This kind of computational modeling has never successfully been attempted before, due in part to a lack of biomaterials data and a lack of computational resources. With the computational resources available at the National Laboratories and the collaborative relationships we have established with experimental and other modeling laboratories, we have been in a position to pursue our innovative approach to biomechanical and orthopedic modeling.

  5. A case of Rocky Mountain spotted fever.

    PubMed

    Rubel, Barry S

    2007-01-01

    Rocky Mountain spotted fever is a serious, generalized infection that is spread to humans through the bite of infected ticks. It can be lethal but it is curable. The disease gets its name from the Rocky Mountain region where it was first identified in 1896. The fever is caused by the bacterium Rickettsia rickettsii and is maintained in nature in a complex life cycle involving ticks and mammals. Humans are considered to be accidental hosts and are not involved in the natural transmission cycle of this pathogen. The author examined a 47-year-old woman during a periodic recall appointment. The patient had no dental problems other than the need for routine prophylaxis but mentioned a recent problem with swelling of her extremities with an accompanying rash and general malaise and soreness in her neck region. Tests were conducted and a diagnosis of Rocky Mountain spotted fever was made.

  6. Combining convolutional neural networks and Hough Transform for classification of images containing lines

    NASA Astrophysics Data System (ADS)

    Sheshkus, Alexander; Limonova, Elena; Nikolaev, Dmitry; Krivtsov, Valeriy

    2017-03-01

    In this paper, we propose an expansion of the input features of a convolutional neural network (CNN) based on the Hough Transform. We perform morphological contrasting of the source image followed by the Hough Transform, and then use the result as input for some convolutional filters. Thus, the CNN's computational complexity and the number of units are not affected. Morphological contrasting and the Hough Transform are the only additional computational expenses of the introduced expansion of CNN input features. The proposed approach was demonstrated on the example of a CNN with a very simple structure. We considered two image recognition problems: object classification on CIFAR-10 and printed character recognition on a private dataset with symbols taken from Russian passports. Our approach allowed us to reach a noticeable accuracy improvement without much computational effort, which can be extremely important in industrial recognition systems or in difficult problems utilising CNNs, like pressure ridge analysis and classification.
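    As a hedged sketch of the input-feature expansion, a plain line Hough Transform of an edge map can be computed and supplied to the network as an extra channel. The morphological contrasting step and the exact accumulator parameters used by the authors are omitted; the resolution choices below are illustrative.

```python
import numpy as np

def hough_lines(edge_img, n_theta=64, n_rho=64):
    """Accumulator of the line Hough Transform of a binary edge image.

    Each edge pixel votes for every (rho, theta) line passing through
    it; lines in the image appear as peaks. The accumulator can then be
    fed to a CNN as an additional input channel alongside the raw image.
    """
    h, w = edge_img.shape
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = np.hypot(h, w)                  # bound on |rho|
    acc = np.zeros((n_rho, n_theta))
    ys, xs = np.nonzero(edge_img)
    for k, theta in enumerate(thetas):
        rho = xs * np.cos(theta) + ys * np.sin(theta)
        bins = ((rho + diag) / (2 * diag) * n_rho).astype(int)
        np.add.at(acc, (bins, k), 1)       # accumulate votes per rho bin
    return acc

# A single vertical line produces one dominant peak at theta = 0.
img = np.zeros((32, 32))
img[:, 16] = 1.0
acc = hough_lines(img)
```

    Because the transform is computed once per image and the network itself is unchanged, the extra cost is exactly the accumulator computation, consistent with the abstract's claim that the CNN's complexity is unaffected.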

  7. One Health in food safety and security education: A curricular framework.

    PubMed

    Angelos, J; Arens, A; Johnson, H; Cadriel, J; Osburn, B

    2016-02-01

    The challenges of producing and distributing the food necessary to feed an anticipated 9 billion people in developed and developing societies by 2050 without destroying Earth's finite soil and water resources present extremely complex problems that lack simple solutions. The ability of modern societies to adequately address these and other food-related problems will require an educated workforce trained not only in traditional food safety, security, and public health, but also in other areas including food production, sustainable practices, and ecosystem health. To help address the need for such an educated workforce, a curricular framework was developed to assist those tasked with designing education and training for future food systems workers. One sentence summary: A curricular framework for education and training in food safety and security was developed that incorporates One Health concepts. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Review of complex networks application in hydroclimatic extremes with an implementation to characterize spatio-temporal drought propagation in continental USA

    NASA Astrophysics Data System (ADS)

    Konapala, Goutam; Mishra, Ashok

    2017-12-01

    The quantification of spatio-temporal hydroclimatic extreme events is a key task in water resources planning, disaster mitigation, and preparing a climate-resilient society. However, quantification of these extreme events has always been a great challenge, which is further compounded by climate variability and change. Recently, complex network theory has been applied in the earth science community to investigate spatial connections among hydrologic fluxes (e.g., rainfall and streamflow) in the water cycle. However, there are limited applications of complex network theory to investigating hydroclimatic extreme events. This article attempts to provide an overview of complex networks and extreme events, the event synchronization method, the construction of networks, their statistical significance, and the associated network evaluation metrics. For illustration purposes, we apply the complex network approach to study the spatio-temporal evolution of droughts in the Continental USA (CONUS). A different drought threshold defines a different drought event and carries different socio-economic implications; it is therefore interesting to explore the role of thresholds in the spatio-temporal evolution of drought through network analysis. In this study, the long-term (1900-2016) Palmer drought severity index (PDSI) was selected for spatio-temporal drought analysis using three network-based metrics (strength, direction, and distance). The results indicate that drought events propagate differently at the different thresholds associated with the initiation of drought events. The direction metric indicates that onsets of mild drought events usually propagate in a more spatially clustered and uniform manner than onsets of moderate droughts. The distance metric shows that drought events propagate over longer distances in the western part of CONUS than in the eastern part. 
We believe that the network-aided metrics utilized in this study can be an important tool in advancing our knowledge of drought propagation as well as other hydroclimatic extreme events. Although the propagation of droughts is investigated here using the network approach, process-based (physics-based) approaches are essential to further understand the dynamics of hydroclimatic extreme events.
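    Two of the network metrics named in this record, node strength and link distance, can be sketched directly from a weighted adjacency matrix. In this illustration plain Euclidean distance stands in for the great-circle distance a lat/lon grid would call for, and the direction metric based on event-onset order is not reproduced.

```python
import numpy as np

def network_metrics(adj, coords):
    """Node strength and weighted mean link distance of a network.

    adj is a symmetric weighted adjacency matrix; coords holds one
    position per node. Euclidean distance keeps the sketch simple
    (a lat/lon grid would call for great-circle distance instead).
    """
    adj = np.asarray(adj, dtype=float)
    coords = np.asarray(coords, dtype=float)
    strength = adj.sum(axis=1)                         # total link weight per node
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))           # pairwise node distances
    total = adj.sum()
    mean_dist = (adj * dist).sum() / total if total else 0.0
    return strength, mean_dist

# Three nodes in a chain: links 0-1 (length 3) and 1-2 (length 4).
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
coords = [[0, 0], [0, 3], [4, 3]]
strength, mean_dist = network_metrics(adj, coords)
```

    Maps of strength highlight hub regions where drought onsets synchronize with many other cells, while the mean link distance summarizes how far drought influence reaches, the east-west contrast reported above.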

  9. I-BIEM calculations of the frequency dispersion and ac current distribution at disk and ring-disk electrodes

    NASA Technical Reports Server (NTRS)

    Cahan, Boris D.

    1991-01-01

    The Iterative Boundary Integral Equation Method (I-BIEM) has been applied to the problem of frequency dispersion at a disk electrode in a finite geometry. The I-BIEM permits the direct evaluation of the AC potential (a complex variable) using complex boundary conditions. The point spacing was made highly nonuniform, to give extremely high resolution in those regions where the variables change most rapidly, i.e., in the vicinity of the edge of the disk. Results are analyzed with respect to IR correction, equipotential surfaces, and reference electrode placement. The current distribution is also examined for a ring-disk configuration, with the ring and the disk at the same AC potential. It is shown that the apparent impedance of the disk is inductive at higher frequencies. The results are compared to analytic calculations from the literature, and usually agree to better than 0.001 percent.

  11. Scheduling Software for Complex Scenarios

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Preparing a vehicle and its payload for a single launch is a complex process that involves thousands of operations. Because the equipment and facilities required to carry out these operations are extremely expensive and limited in number, optimal assignment and efficient use are critically important. Overlapping missions that compete for the same resources, ground rules, safety requirements, and the unique needs of processing vehicles and payloads destined for space impose numerous constraints that, when combined, require advanced scheduling. Traditional scheduling systems use simple algorithms and criteria when selecting activities and assigning resources and times to each activity. Schedules generated by these simple decision rules are, however, frequently far from optimal. To resolve mission-critical scheduling issues and predict possible problem areas, NASA historically relied upon expert human schedulers who used their judgment and experience to determine where things should happen, whether they would happen on time, and whether the requested resources were truly necessary.

  12. A curative treatment option for Complex Regional Pain Syndrome (CRPS) Type I: dorsal root entry zone operation (report of two cases).

    PubMed

    Kanpolat, Yucel; Al-Beyati, Eyyub; Ugur, Hasan Caglar; Akpinar, Gokhan; Kahilogullari, Gokmen; Bozkurt, Melih

    2014-01-01

    Complex Regional Pain Syndrome Type I (CRPS-I) is a health problem whose pathophysiology and treatment strategies remain debated. A 12-year-old boy and a 35-year-old woman were diagnosed with CRPS-I at different times. They had previously undergone various types of interventions without success. After one year of follow-up and observation, a dorsal root entry zone (DREZ) lesioning operation was performed. Afterwards, both cases had transient lower extremity ataxia. The first case was followed for 60 months with no recurrence and total cure. The second case was pain-free until the 6th month, when she required psychological support; she was followed for 33 months with a partially satisfactory outcome. Although not a first-line option, the DREZ lesioning procedure can be chosen and may be curative in selected cases of CRPS-I that are unresponsive to conventional therapies.

  13. Early Reconstructions of Complex Lower Extremity Battlefield Soft Tissue Wounds

    PubMed Central

    Ebrahimi, Ali; Nejadsarvari, Nasrin; Ebrahimi, Azin; Rasouli, Hamid Reza

    2017-01-01

    BACKGROUND Severe lower extremity trauma as a devastating combat-related injury is on the rise, and this presents reconstructive surgeons with significant challenges in reaching optimal cosmetic and functional outcomes. This study assessed early reconstructions of complex lower extremity battlefield soft tissue wounds. METHODS This was a prospective case series study of battlefield-injured patients conducted in the Department of Plastic Surgery, Baqiyatallah University of Medical Sciences hospitals, Tehran, Iran between 2013-2015. In this survey, 73 patients were operated on for reconstruction of lower extremity soft tissue defects due to battlefield injuries. RESULTS Seventy-three patients (65 men, 8 women) ranging from 21-48 years old (mean: 35 years) were enrolled. Our study showed that early debridement and bone stabilization, followed by coverage of complex battlefield soft tissue wounds of the lower extremity with suitable flaps and grafts, was an effective method for managing difficult wounds, with fewer amputations and infections. CONCLUSION Serial debridement and bone stabilization before early soft tissue reconstruction according to the reconstructive ladder were shown to be essential steps. PMID:29218283

  14. Self-Regulation in Children Born with Extremely Low Birth Weight at 2 Years Old: A Comparison Study

    ERIC Educational Resources Information Center

    Lynn, Lisa N.; Cuskelly, Monica; Gray, Peter H.; O'Callaghan, Michael J.

    2012-01-01

    Survival rates for children born with extremely low birth weight (ELBW) are increasing; however, many of these children experience later problems with learning. This study adopted an integrated approach to these problems, involving the self-regulatory tasks of inhibition and delay of gratification and relevant individual factors including…

  15. Inflationary dynamics for matrix eigenvalue problems

    PubMed Central

    Heller, Eric J.; Kaplan, Lev; Pollmann, Frank

    2008-01-01

    Many fields of science and engineering require finding eigenvalues and eigenvectors of large matrices. The solutions can represent oscillatory modes of a bridge, a violin, the disposition of electrons around an atom or molecule, the acoustic modes of a concert hall, or hundreds of other physical quantities. Often only the few eigenpairs with the lowest or highest frequency (extremal solutions) are needed. Methods that have been developed over the past 60 years to solve such problems include the Lanczos algorithm, Jacobi–Davidson techniques, and the conjugate gradient method. Here, we present a way to solve the extremal eigenvalue/eigenvector problem, turning it into a nonlinear classical mechanical system with a modified Lagrangian constraint. The constraint induces exponential inflationary growth of the desired extremal solutions. PMID:18511564
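As a point of comparison for the dynamical scheme described above, here is a minimal sketch of extracting the lowest eigenpair of a symmetric matrix by gradient descent on the Rayleigh quotient. This flow likewise damps every non-extremal component of the vector, but it is a standard textbook iteration, not the modified-Lagrangian "inflationary" method of the paper; the step size eta, tolerance, and seed are illustrative choices.

```python
import numpy as np

def lowest_eigenpair(A, eta=0.1, tol=1e-10, max_iter=10000):
    """Lowest eigenvalue/eigenvector of a symmetric matrix A via gradient
    descent on the Rayleigh quotient R(v) = v^T A v / v^T v. For a unit
    vector, the (half-)gradient of R is Av - R(v) v, which vanishes
    exactly at an eigenvector."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(max_iter):
        Av = A @ v
        lam = v @ Av                 # Rayleigh quotient (v has unit norm)
        g = Av - lam * v             # residual / gradient direction
        if np.linalg.norm(g) < tol:
            break
        v = v - eta * g              # descend toward the lowest eigenpair
        v /= np.linalg.norm(v)       # re-impose the unit-norm constraint
    return lam, v
```

The relative decay rate of each unwanted component is set by its eigenvalue gap to the extremal one, which is why such schemes, like Lanczos or conjugate-gradient methods, converge quickly when the extremal eigenvalue is well separated.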

  16. Novices and Experts in Geoinformatics: the Cognitive Gap.

    NASA Astrophysics Data System (ADS)

    Zhilin, M.

    2012-04-01

    Modern geoinformatics is an extremely powerful tool for problem analysis and decision making in various fields. Currently, the general public uses geoinformatics predominantly for navigation (GPS) and for sharing information about particular places (GoogleMaps, Wikimapia). Communities also use geoinformatics for particular purposes: history enthusiasts use it to match historical and present-day maps (www.retromap.ru), birdwatchers mark places where they have observed birds (geobirds.com/rangemaps), etc. However, the majority of stakeholders, such as local authorities, are not aware of the advantages and possibilities of geoinformatics. The same problem is observed among students. At the same time, many professional geoinformatics tools have been developed, but sometimes even the experts cannot explain their purpose to non-experts. So the question is how to shrink the gap between experts and non-experts in the understanding and application of geoinformatics. We think that this gap has a cognitive basis. According to modern cognitive theories (Shiffrin-Atkinson and its successors), incoming information first has to pass through the perceptual filter, which cuts off information that seems to be irrelevant. The mind estimates relevance implicitly (unconsciously), based on previous knowledge and judgments of what is important. The information then enters working memory, which is used (a) for processing and (b) for problem solving. Working memory has limited capacity and can operate with only about seven objects simultaneously. Information then passes to long-term memory, which has effectively unlimited capacity. There it is stored as more or less complex structures with associative links, and when necessary it is extracted back into working memory. If a great amount of information is linked ("chunked"), working memory operates with it as one object of the seven, thus overcoming the limitations of working memory capacity. 
To be adopted, any information should (a) pass through the perceptual filter, (b) not overload working memory, and (c) be structured in long-term memory. Experts easily adopt domain-specific information because they (a) understand the terminology and consider the information important, thus passing it through the perceptual filter, and (b) have many complex domain-specific chunks that are processed by working memory as a whole, thus avoiding overload. Novices (students and the general public) have neither this understanding and sense of importance nor the necessary chunks. The following measures should be taken to bridge experts' and novices' understanding of geoinformatics. The expert community should popularize geoscientific problems, developing understandable language and accessible tools for solving them. This requires close collaboration with the educational system (especially secondary education). If students understand a problem, they can find and apply an appropriate tool for it. Geoscientific problems and models are extremely complex; in cognitive terms, they require a hierarchy of chunks. This hierarchy should develop coherently, beginning with simple chunks and later joining them into complex ones, which requires an appropriate sequence of learning tasks. Correct solutions are not essential: students should understand how the problems are solved and recognize the limitations of the models. We think that tasks such as weather forecasting and global climate modeling are suitable. The first step in bridging experts and novices is the elaboration of a set and sequence of learning tasks, as well as tools for their solution. The tools should be easy for everybody who understands the task, and as versatile as possible; otherwise students will waste a lot of time mastering them. This development requires close collaboration between geoscientists and educators.

  17. A Bayesian Approach to Real-Time Earthquake Phase Association

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

    Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach, based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity, has been in use for many years. One of the most significant problems that has emerged over time with this approach is related to the extreme variations in seismic station density throughout the global seismic network. To address this problem we have developed a novel Bayesian association algorithm, which treats the association problem as a dynamically evolving complex system of "many-to-many relationships". While the end result must be an array of one-to-many relations (one earthquake, many phases), during the association process the situation is quite different: both the evolving possible hypocenters and the relationships between phases and all nascent hypocenters are many-to-many (many earthquakes, many phases). The computational framework we are using to address this is a responsive, NoSQL graph database in which the earthquake-phase associations are represented as intersecting Bayesian Learning Networks. The approach directly addresses the network inhomogeneity issue while at the same time allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on the estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.
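The core stacking idea behind phase association can be illustrated with a toy example: score trial source positions and origin times by the joint Gaussian likelihood of the observed picks. Everything below (the 1-D geometry, wave speed, station coordinates, pick times, and uncertainty) is invented for illustration and is far simpler than the Bayesian graph approach described in the abstract.

```python
import numpy as np

# Toy 1-D association: stack Gaussian pick likelihoods over trial sources.
V = 6.0                                   # assumed wave speed (km/s)
stations = np.array([0.0, 30.0, 80.0])    # station coordinates (km, invented)
picks = np.array([8.67, 3.67, 8.67])      # observed arrival times (s, invented)
sigma = 0.5                               # pick-time uncertainty (s)

# With these synthetic picks the best-fitting source sits near x = 40 km.
x_grid = np.linspace(-50.0, 150.0, 2001)  # trial source positions
best = None
for x in x_grid:
    tt = np.abs(stations - x) / V         # predicted travel times
    # try one candidate origin time per pick, scoring all picks jointly
    for t0 in picks - tt:
        resid = picks - (t0 + tt)
        logL = -0.5 * np.sum((resid / sigma) ** 2)  # Gaussian log-likelihood
        if best is None or logL > best[0]:
            best = (logL, x, t0)

logL, x_hat, t0_hat = best
```

A real associator must additionally decide which picks belong to which of several simultaneous candidate sources, which is exactly the many-to-many bookkeeping the abstract's graph-database framework is designed to handle.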

  18. Flooding experience at Veracruz: not only a natural disaster

    NASA Astrophysics Data System (ADS)

    Welsh-Rodriguez, C. M.; Nava Bringas, M.; Ochoa Martinez, C.

    2013-05-01

    The state of Veracruz lies at the middle of the Gulf of Mexico coast of the Mexican Republic; it has a surface area of 72,815 km2, representing almost 4% of Mexico. Due to its complex topography, rainfall, runoff, and extreme weather, 33% of Mexico's water flows through Veracruz, and every year tropical depressions, tropical storms, and hurricanes impact its inhabitants (7.5 million). The Sierra Madre forms the natural border of Veracruz on the west and the Gulf of Mexico on the east; the state extends from 17°10' to 23°38' (N) and between 93° and 99° (W). We try to identify the primary information sources for the floods of 2005 and 2010 and to correlate them with the environmental and civil protection laws of Veracruz. In 1999 a tropical depression affected more than 200,000 persons and more than 20 died; in 2005 Hurricane Stan affected more than a million persons but no one died. In 2010 the effects of Hurricane Karl were similar, but a few days later Tropical Depression Matthew affected 150,000 more persons and 15 people died. Settlement patterns in Veracruz since the middle of the 20th century have followed oil industry development in southeastern Mexico, so risk increased as population density increased; this is a critical reason to conclude that flooding in Veracruz is not only a cause-and-effect issue. If extreme events increase as a consequence of climate variability and climate change, and vulnerability in this region is not addressed in prevention policies, future adaptation will be a deeply complex problem to solve from all perspectives. Reported impacts and extreme events data are from the Veracruz Government.

  19. Joint Interpretation of Insar and GPS Data Related To The Eruptive Event of July 2001 At Mt. Etna

    NASA Astrophysics Data System (ADS)

    Ferretti, A.; Colesanti, C.; Basilico, M.; Locatelli, R.; Novali, F.; Bonforte, A.; Coltelli, M.; Guglielmino, F.; Palano, M.; Puglisi, G.

    The eruptive background of the July 2001 eruption at Mt. Etna proved extremely complex and dynamic from the very beginning. The development of the ground deformation pattern due to the eruptive event was monitored through continuous GPS measurements on a network of permanent stations, and through daily static and kinematic GPS measurements made by INGV-CT on a geodetic network. These measurements show diffuse and intense ground deformation over a large part of the volcanic area. After the ERS-2 gyroscope problems in January 2001, the attitude accuracy of the platform was compromised, leading to large variability of the baseline and Doppler centroid values. Since January, a dedicated and passionate ESA team has carried out a complex recovery procedure aimed at improving the satellite stability. The results obtained are extremely promising. In fact, the POLIMI team, in cooperation with TRE (a POLIMI commercial spin-off), was able to obtain, albeit with a very simple ad hoc processing, a clear surface deformation map for the 11 July-15 August 2001 passages. Further work, beyond this preliminary interferogram, could be carried out to unwrap the very crowded fringe pattern on the top of the volcano. A preliminary analysis of the differential product shows an extremely interesting pattern that appears to be associated with decimetre-scale ground deformation at the summit area of the volcano and in the Valle del Bove area. The GPS data and the preliminary results of the SAR interferogram are in agreement with the deformation pattern expected in such an event, where the displacements are caused by deep magmatic sources and locally modulated by major structural features.

  20. Surgical Management of Complex Lower-Extremity Trauma With a Long Hindfoot Fusion Nail: A Case Report.

    PubMed

    Jain, Nickul S; Lopez, Gregory D; Bederman, S Samuel; Wirth, Garrett A; Scolaro, John A

    2016-08-01

    High-energy injuries can result in complete or partial loss of the talus. Ipsilateral fractures to the lower limb increase the complexity of surgical management, and treatment is guided by previous case reports of similar injuries. A case of complex lower-extremity trauma with extruded and missing talar body and ipsilateral type IIIB open tibia fracture is presented. Surgical limb reconstruction and salvage was performed successfully with a single orthopaedic implant in a manner not described previously in the literature. The purpose of this case report is to present the novel use of a single orthopaedic implant for treatment of a complex, open traumatic injury. Previous case reports in the literature have described the management of complete or partial talar loss. We describe the novel use of a long hindfoot fusion nail and staged bone grafting to achieve tibiocalcaneal arthrodesis for the treatment of complex lower-extremity trauma. Therapeutic, Level IV: Case study. © 2015 The Author(s).

  1. Three-Dimensional Printing Based Hybrid Manufacturing of Microfluidic Devices.

    PubMed

    Alapan, Yunus; Hasan, Muhammad Noman; Shen, Richang; Gurkan, Umut A

    2015-05-01

    Microfluidic platforms offer revolutionary and practical solutions to challenging problems in biology and medicine. Even though traditional micro/nanofabrication technologies expedited the emergence of the microfluidics field, recent advances in additive manufacturing hold significant potential for single-step, stand-alone microfluidic device fabrication. One such technology, which holds significant promise for next-generation microsystem fabrication, is three-dimensional (3D) printing. Presently, building 3D printed stand-alone microfluidic devices with fully embedded microchannels for applications in biology and medicine has the following challenges: (i) limitations in achievable design complexity, (ii) need for a wider variety of transparent materials, (iii) limited z-resolution, (iv) absence of extremely smooth surface finish, and (v) limitations in precision fabrication of hollow and void sections with extremely high surface area to volume ratio. We developed a new way to fabricate stand-alone microfluidic devices with integrated manifolds and embedded microchannels by utilizing a 3D printing and laser micromachined lamination based hybrid manufacturing approach. In this new fabrication method, we exploit the minimized fabrication steps enabled by 3D printing, and reduced assembly complexities facilitated by laser micromachined lamination method. The new hybrid fabrication method enables key features for advanced microfluidic system architecture: (i) increased design complexity in 3D, (ii) improved control over microflow behavior in all three directions and in multiple layers, (iii) transverse multilayer flow and precisely integrated flow distribution, and (iv) enhanced transparency for high resolution imaging and analysis. Hybrid manufacturing approaches hold great potential in advancing microfluidic device fabrication in terms of standardization, fast production, and user-independent manufacturing.

  2. Symptoms and Needs of Head and Neck Cancer Patients at Diagnosis of Incurability - Prevalences, Clinical Implications, and Feasibility of a Prospective Longitudinal Multicenter Cohort Study.

    PubMed

    Alt-Epping, Bernd; Seidel, Wiebke; Vogt, Jeannette; Mehnert, Anja; Thomas, Michael; van Oorschot, Birgitt; Wolff, Hendrik; Schliephake, Henning; Canis, Martin; Dröge, Leif H; Nauck, Friedemann; Lordick, Florian

    2016-01-01

    Little is known about the physical symptoms and psychosocial burden of patients at the time of diagnosis of an incurable situation, although cancer treatment guidelines demand early assessment and integration of palliative care concepts, beginning from the diagnosis of incurability. Therefore, we initiated a prospective longitudinal multicenter cohort study assessing the symptoms and needs of patients suffering from incurable cancer (various entities), from the time of diagnosing incurability (i.e., before palliative anticancer treatment was initiated) and at 3-monthly intervals thereafter, using validated self-reporting tools. Here, we focus on patients with head and neck cancer and present preliminary results on symptom and need prevalences, on clinical implications, and on the feasibility of a methodologically complex assessment procedure in a particularly vulnerable study population. 22 patients completed the first visit. The Eastern Cooperative Oncology Group (ECOG) performance scores and most physical symptoms and psychosocial items varied between the extremes, from a virtually uncompromised condition to extremely pronounced symptoms and needs. When intense face-to-face study support was provided, the study concept proved to be feasible, despite the complexity of assessment, problems in interdisciplinary and patient communication, comorbidities, and early death from complications. The striking variability in the perceived symptom and need intensities requires a highly individualized approach. For clinical purposes, a less complex screening procedure would be desirable, in order to enable routine, early, and comprehensive support, including palliative care services. © 2016 S. Karger GmbH, Freiburg.

  3. Three-Dimensional Printing Based Hybrid Manufacturing of Microfluidic Devices

    PubMed Central

    Shen, Richang; Gurkan, Umut A.

    2016-01-01

    Microfluidic platforms offer revolutionary and practical solutions to challenging problems in biology and medicine. Even though traditional micro/nanofabrication technologies expedited the emergence of the microfluidics field, recent advances in additive manufacturing hold significant potential for single-step, stand-alone microfluidic device fabrication. One such technology, which holds significant promise for next-generation microsystem fabrication, is three-dimensional (3D) printing. Presently, building 3D printed stand-alone microfluidic devices with fully embedded microchannels for applications in biology and medicine has the following challenges: (i) limitations in achievable design complexity, (ii) need for a wider variety of transparent materials, (iii) limited z-resolution, (iv) absence of extremely smooth surface finish, and (v) limitations in precision fabrication of hollow and void sections with extremely high surface area to volume ratio. We developed a new way to fabricate stand-alone microfluidic devices with integrated manifolds and embedded microchannels by utilizing a 3D printing and laser micromachined lamination based hybrid manufacturing approach. In this new fabrication method, we exploit the minimized fabrication steps enabled by 3D printing, and reduced assembly complexities facilitated by laser micromachined lamination method. The new hybrid fabrication method enables key features for advanced microfluidic system architecture: (i) increased design complexity in 3D, (ii) improved control over microflow behavior in all three directions and in multiple layers, (iii) transverse multilayer flow and precisely integrated flow distribution, and (iv) enhanced transparency for high resolution imaging and analysis. Hybrid manufacturing approaches hold great potential in advancing microfluidic device fabrication in terms of standardization, fast production, and user-independent manufacturing. PMID:27512530

  4. "Complex" Posttraumatic Stress Disorder/Disorders of Extreme Stress (CP/DES) in Sexually Abused Children: An Exploratory Study.

    ERIC Educational Resources Information Center

    Hall, Darlene Kordich

    1999-01-01

    Compares three groups of young sexually abused children on seven "Complex" Posttraumatic Stress Disorder/Disorders of Extreme Stress (CP/DES) indices. As cumulative number of types of trauma increased, the number of CP/DES symptoms rose. Results suggest that CP/DES also characterizes sexually abused children, especially those who have…

  5. The Hunger Stones: a new source for more objective identification of historical droughts

    NASA Astrophysics Data System (ADS)

    Elleder, Libor

    2016-04-01

    Extreme droughts, recorded more frequently in recent years in different parts of the world, represent a most serious environmental problem. Our contribution identifies periods of hydrological drought. The extreme drought period in summer 2015 enabled the levelling of historical watermarks on the "Hunger Stone" (Hungerstein) in the Elbe River in the Czech town of Děčín. The comparison of the obtained levels of earlier palaeographic records with systematic measurements in the Děčín profile confirmed the hypothesis that the old watermarks represent minimum water levels. Moreover, we present a review of the so far known Hunger Stones in the Elbe River with their low-level watermarks. To identify the durations of drought periods we used the oldest water level records from the Czech Hydrometeorological Institute (CHMI) database archive: the Magdeburg (since 1727), Dresden (since 1801), Prague (since 1825), and Děčín (since 1851) time series. We obtained more objective and comprehensive information on all historical droughts between 1727 and 2015. The low watermarks on the Hunger Stones give us a possibility of augmenting the systematic records and extend our knowledge back to 1616. The Hunger Stones in the Elbe River with their old watermarks are a unique testimony for the study of hydrological extremes and, not least, of anthropogenic changes in the riverbed of the Elbe.

  6. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  7. Eating Problems at Age 6 Years in a Whole Population Sample of Extremely Preterm Children

    ERIC Educational Resources Information Center

    Samara, Muthanna; Johnson, Samantha; Lamberts, Koen; Marlow, Neil; Wolke, Dieter

    2010-01-01

    Aim: The aim of this study was to investigate the prevalence of eating problems and their association with neurological and behavioural disabilities and growth among children born extremely preterm (EPC) at age 6 years. Method: A standard questionnaire about eating was completed by parents of 223 children (125 males [56.1%], 98 females [43.9%])…

  8. Floods in a changing climate: a review.

    PubMed

    Hunt, J C R

    2002-07-15

    This paper begins with an analysis of flooding as a natural disaster for which the solutions to the environmental, social and economic problems are essentially those of identifying and overcoming hazards and vulnerability, reducing risk and damaging consequences. Long-term solutions to flooding problems, especially in a changing climate, should be sought in the wider context of developing more sustainable social organization, economics and technology. Then, developments are described of how scientific understanding, supported by practical modelling, is leading to predictions of how human-induced changes to climatic and geological conditions are likely to influence flooding over at least the next 300 years, through their influences on evaporation, precipitation, run-off, wind storm and sea-level rise. Some of the outstanding scientific questions raised by these problems are highlighted, such as the statistical and deterministic prediction of extreme events, the understanding and modelling of mechanisms that operate on varying length- and time-scales, and the complex interactions between biological, ecological and physical problems. Some options for reducing the impact of flooding by new technology include both improved prediction and monitoring with computer models, and remote sensing, flexible and focused warning systems, and permanent and temporary flood-reduction systems.

  9. Magnetic storms and solar flares: can they be analysed within a similar mathematical framework as other extreme events?

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Potirakis, Stelios M.; Papadimitriou, Constantinos; Zitis, Pavlos I.; Eftaxias, Konstantinos

    2015-04-01

    The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We apply concepts of nonextensive statistical physics to time-series data of observable manifestations of the underlying complex processes that culminate in different extreme events, in order to support the suggestion that a dynamical analogy characterizes the generation of a single magnetic storm, solar flare, earthquake (in terms of pre-seismic electromagnetic signals), epileptic seizure, and economic crisis. The analysis reveals that all the above-mentioned different extreme events can be analyzed within a similar mathematical framework. More precisely, we show that the populations of magnitudes of fluctuations included in all the above-mentioned pulse-like time series follow the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar nonextensive q-parameter values. Moreover, based on a multidisciplinary statistical analysis, we show that the extreme events are characterized by crucial common symptoms, namely: (i) high organization, high compressibility, low complexity, and high information content; (ii) strong persistency; and (iii) the existence of a clear preferred direction of emerged activities. These symptoms clearly discriminate the appearance of the extreme events under study from the corresponding background noise.
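For the traditional Gutenberg-Richter part of such an analysis, the slope (b-value) of the magnitude-frequency distribution is commonly estimated with Aki's maximum-likelihood formula. The sketch below is that standard estimator applied above a completeness threshold; it is not the nonextensive model or any code from the paper, and the binning correction is optional and simplified.

```python
import math

def b_value_mle(magnitudes, m_min, dm=0.0):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b-value, b = log10(e) / (mean(M) - M_min), from magnitudes at or above
    the completeness threshold m_min. dm optionally corrects for magnitude
    binning (dm = bin width), shifting the reference to m_min - dm/2."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))
```

Because the Gutenberg-Richter law implies exponentially distributed magnitude excesses above m_min, the estimator reduces to the inverse of the mean excess scaled by log10(e), which is why it is both simple and statistically efficient.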

  10. NeuroTessMesh: A Tool for the Generation and Visualization of Neuron Meshes and Adaptive On-the-Fly Refinement

    PubMed Central

    Garcia-Cantero, Juan J.; Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis

    2017-01-01

    Gaining a better understanding of the human brain continues to be one of the greatest challenges for science, largely because of the overwhelming complexity of the brain and the difficulty of analyzing the features and behavior of dense neural networks. Regarding analysis, 3D visualization has proven to be a useful tool for the evaluation of complex systems. However, the large number of neurons in non-trivial circuits, together with their intricate geometry, makes the visualization of a neuronal scenario an extremely challenging computational problem. Previous work in this area dealt with the generation of 3D polygonal meshes that approximated the cells’ overall anatomy but did not attempt to deal with the extremely high storage and computational cost required to manage a complex scene. This paper presents NeuroTessMesh, a tool specifically designed to cope with many of the problems associated with the visualization of neural circuits that are comprised of large numbers of cells. In addition, this method facilitates the recovery and visualization of the 3D geometry of cells included in databases, such as NeuroMorpho, and provides the tools needed to approximate missing information such as the soma’s morphology. This method takes as its only input the available compact, yet incomplete, morphological tracings of the cells as acquired by neuroscientists. It uses a multiresolution approach that combines an initial, coarse mesh generation with subsequent on-the-fly adaptive mesh refinement stages using tessellation shaders. For the coarse mesh generation, a novel approach, based on the Finite Element Method, allows approximation of the 3D shape of the soma from its incomplete description. Subsequently, the adaptive refinement process performed in the graphic card generates meshes that provide good visual quality geometries at a reasonable computational cost, both in terms of memory and rendering time. 
All the described techniques have been integrated into NeuroTessMesh, available to the scientific community, to generate, visualize, and save the adaptive resolution meshes. PMID:28690511

  11. Towards communication-efficient quantum oblivious key distribution

    NASA Astrophysics Data System (ADS)

    Panduranga Rao, M. V.; Jakobi, M.

    2013-01-01

Symmetrically private information retrieval, a fundamental problem in the field of secure multiparty computation, is defined as follows: A database D of N bits held by Bob is queried by a user Alice who is interested in the bit Db in such a way that (1) Alice learns Db and only Db and (2) Bob does not learn anything about Alice's choice b. While solutions to this problem in the classical domain rely largely on unproven computational complexity theoretic assumptions, it is also known that perfect solutions that guarantee both database and user privacy are impossible in the quantum domain. Jakobi et al. [Phys. Rev. A 83, 022301 (2011)] proposed a protocol for oblivious transfer using well-known quantum key distribution (QKD) techniques to establish an oblivious key to solve this problem. Their solution provided a good degree of database and user privacy (using physical principles like the impossibility of perfectly distinguishing nonorthogonal quantum states and the impossibility of superluminal communication) while being loss-resistant and implementable with commercial QKD devices (due to the use of the Scarani-Acin-Ribordy-Gisin 2004 protocol). However, their quantum oblivious key distribution (QOKD) protocol requires a communication complexity of O(N log N). Since modern databases can be extremely large, it is important to reduce this communication as much as possible. In this paper, we first suggest a modification of their protocol wherein the number of qubits that need to be exchanged is reduced to O(N). A subsequent generalization reduces the quantum communication complexity even further, such that only a few hundred qubits need to be transferred even for very large databases.

  12. Computational dynamic approaches for temporal omics data with applications to systems medicine.

    PubMed

    Liang, Yulan; Kelemen, Arpad

    2017-01-01

Modeling and predicting biological dynamic systems while simultaneously estimating their kinetic, structural, and functional parameters is extremely important in systems and computational biology. It is key to understanding the complexity of human health, drug response, disease susceptibility, and pathogenesis for systems medicine. Temporal omics data used to measure dynamic biological systems are essential for discovering complex biological interactions and clinical mechanisms and causation. However, delineating the possible associations and causalities of genes, proteins, metabolites, cells, and other biological entities from high-throughput time-course omics data is challenging, and conventional experimental techniques are not suited to it in the big-omics era. In this paper, we present various recently developed dynamic trajectory and causal network approaches for temporal omics data, which are extremely useful for researchers who want to start working in this challenging research area. Moreover, applications to various biological systems, health conditions, and disease states are presented, along with examples that summarize state-of-the-art performance on different specific mining tasks. We critically discuss the merits, drawbacks, and limitations of the approaches, and the associated main challenges for the years ahead. The most recent computing tools and software for analyzing specific problem types, the associated platform resources, and other potential uses of the dynamic trajectory and interaction methods are also presented and discussed in detail.

  13. Sexuality in persons with lower extremity amputations.

    PubMed

    Bodenheimer, C; Kerrigan, A J; Garber, S L; Monga, T N

    2000-06-15

There is a paucity of information regarding sexual functioning in persons with lower extremity amputations. The purpose of this study was to describe sexual and psychological functioning and health status in persons with lower extremity amputation. Self-report surveys assessed sexual functioning (Derogatis Inventory), depression (Beck Depression Inventory), anxiety (State-Trait Anxiety Inventory), and health status (Health Status Questionnaire) in a convenience sample of 30 men with lower extremity amputations. Mean age of the participants was 57 years (range 32-79). Mean duration since amputation was 23 months (range 3-634 months). Twenty-one subjects (70%) had trans-tibial and seven subjects (23%) had trans-femoral amputations. A majority of subjects were experiencing problems in several domains of sexual functioning. Fifty-three percent (n = 16) of the subjects engaged in sexual intercourse or oral sex at least once a month. Twenty-seven percent (n = 8) masturbated at least once a month. Nineteen subjects (63%) reported orgasmic problems and 67% were experiencing erectile difficulties. Despite these problems, interest in sex was high in over 90% of the subjects. There was no evidence of increased prevalence of depression or anxiety in these subjects when compared to other outpatient adult populations. Sexual problems were common in the subjects studied. Despite these problems, interest in sex remained high. Few investigations have been directed toward identifying the psychological and social factors that may contribute to these problems, and more research with a larger population is needed in this area.

  14. Fluctuating residual limb volume accommodated with an adjustable, modular socket design: A novel case report.

    PubMed

    Mitton, Kay; Kulkarni, Jai; Dunn, Kenneth William; Ung, Anthony Hoang

    2017-10-01

This novel case report describes the problems of prescribing a prosthetic socket in a left transfemoral amputee secondary to chronic patellofemoral instability compounded by complex regional pain syndrome. Case Description and Methods: Following the amputation, complex regional pain syndrome symptoms recurred in the residual limb, presenting mainly with oedema. Due to extreme daily volume fluctuations of the residual limb, a conventional, laminated thermoplastic socket fitting was not feasible. Findings and Outcomes: An adjustable, modular socket design was trialled. The residual limb volume fluctuations were accommodated within the socket. Amputee rehabilitation could be continued, and the rehabilitation goals were achieved. The patient was able to wear the prosthesis for 8 h daily and to walk unaided indoors and outdoors. An adjustable, modular socket design accommodated the daily residual limb volume fluctuations and provided a successful outcome in this case. It demonstrates the complexities of socket fitting and design with volume fluctuations. Clinical relevance: Ongoing complex regional pain syndrome symptoms within the residual limb can lead to fitting difficulties in a conventional, laminated thermoplastic socket due to volume fluctuations. An adjustable, modular socket design can accommodate this and provide a successful outcome.

  15. Extremes in ecology: Avoiding the misleading effects of sampling variation in summary analyses

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    1996-01-01

Surveys such as the North American Breeding Bird Survey (BBS) produce large collections of parameter estimates. One's natural inclination when confronted with lists of parameter estimates is to look for the extreme values: in the BBS, these correspond to the species that appear to have the greatest changes in population size through time. Unfortunately, extreme estimates are liable to correspond to the most poorly estimated parameters. Consequently, the most extreme parameters may not match up with the most extreme parameter estimates. Ranking parameter values on the basis of their estimates is a difficult statistical problem. We use data from the BBS and simulations to illustrate the potentially misleading effects of sampling variation on rankings of parameters. We describe empirical Bayes and constrained empirical Bayes procedures which provide partial solutions to the problem of ranking in the presence of sampling variation.
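The shrinkage idea behind such empirical Bayes procedures can be illustrated with a small simulation (a hypothetical sketch, not the authors' BBS analysis): extreme raw estimates tend to belong to the poorly estimated parameters, and shrinking each estimate toward the prior mean in proportion to its sampling variance corrects the rankings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 200 species with true trends drawn from N(0, tau^2);
# half are surveyed on many routes (small SE), half on few (large SE).
n = 200
theta = rng.normal(0.0, 1.0, n)                      # true population trends
se = np.where(np.arange(n) < n // 2, 0.3, 2.0)       # heterogeneous sampling error
y = theta + rng.normal(0.0, se)                      # raw trend estimates

# Naive ranking: the most extreme *estimates* tend to be the poorly
# estimated (large-SE) parameters, not the most extreme *parameters*.
naive_top = np.argsort(-np.abs(y))[:10]

# Empirical Bayes: estimate tau^2 by method of moments, then shrink each
# estimate toward the prior mean in proportion to its sampling variance.
tau2_hat = max(np.var(y) - np.mean(se**2), 1e-6)
shrunk = (tau2_hat / (tau2_hat + se**2)) * y
eb_top = np.argsort(-np.abs(shrunk))[:10]

print("large-SE fraction, naive top 10:", np.mean(se[naive_top] > 1.0))
print("large-SE fraction, EB top 10:   ", np.mean(se[eb_top] > 1.0))
```

Under this setup the naive top 10 is dominated by the noisy large-SE group, while the shrunken ranking favors well-estimated species.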

  16. Forecasts and Warnings of Extreme Solar Storms at the Sun

    NASA Astrophysics Data System (ADS)

    Lundstedt, H.

    2015-12-01

The most pressing space weather forecasts and warnings are those of the most intense solar flares and coronal mass ejections. However, in trying to develop these forecasts and warnings, we are confronted with many fundamental questions. Some of those are: How can an observable measure of an extreme solar storm be defined? How extreme can a solar storm become, and how long is the build-up time? How should forecasts and warnings be made? Many have contributed to clarifying these general questions. In this presentation we will describe our latest results on the topological complexity of magnetic fields and the use of SDO SHARP parameters. The complexity concept will then be used to discuss the second question. Finally, we will describe probability estimates of extreme solar storms.

  17. A two steps solution approach to solving large nonlinear models: application to a problem of conjunctive use.

    PubMed

    Vieira, J; Cunha, M C

    2011-01-01

This article describes a method for solving large nonlinear problems in two steps. The two-step solution approach takes advantage of handling smaller and simpler models, and of having better starting points, to improve solution efficiency. The set of nonlinear constraints (named complicating constraints) that makes the solution of the model complex and time consuming is eliminated from step one. The complicating constraints are added only in the second step, so that a solution of the complete model is then found. The solution method is applied to a large-scale problem of conjunctive use of surface water and groundwater resources. The results obtained are compared with solutions determined by solving the complete model directly in a single step. In all examples, the two-step solution approach allowed a significant reduction in computation time. This gain in efficiency can be extremely important for work in progress, and it can be particularly useful for cases where computation time is a critical factor in obtaining an optimized solution in due time.
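The two-step idea can be sketched on a toy nonlinear program (a hypothetical stand-in, not the conjunctive-use model; `scipy.optimize` replaces whatever solver the authors used): solve the model without the complicating constraint, then warm-start the complete model from that solution.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem standing in for a conjunctive-use model: minimize a
# smooth cost; the nonlinear "complicating" constraint x*y <= 2 is what
# makes the full model expensive.
def cost(v):
    return (v[0] - 3.0) ** 2 + (v[1] - 2.0) ** 2

complicating = {"type": "ineq", "fun": lambda v: 2.0 - v[0] * v[1]}  # x*y <= 2

x0 = np.array([0.0, 0.0])

# Step 1: solve the simplified model (complicating constraint dropped).
step1 = minimize(cost, x0, method="SLSQP")

# Step 2: solve the complete model, warm-started from the step-1 solution.
step2 = minimize(cost, step1.x, method="SLSQP", constraints=[complicating])

print("step 1 solution:", step1.x)   # unconstrained optimum near (3, 2)
print("step 2 solution:", step2.x)   # feasible point on the boundary x*y = 2
```

The step-1 solution is infeasible for the full model, but it is a much better starting point than the origin, which is the source of the reported speed-up.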

  18. Label-Free Biosensing with High Selectivity in Complex Media using Microtoroidal Optical Resonators

    NASA Astrophysics Data System (ADS)

    Ozgur, Erol; Toren, Pelin; Aktas, Ozan; Huseyinoglu, Ersin; Bayindir, Mehmet

    2015-08-01

Although label-free biosensors comprised of optical microcavities inherently possess the capability of resolving molecular interactions at the individual level, this extreme sensitivity restricts their convenience for large-scale applications by inducing vulnerability towards the non-specific interactions that readily occur within complex media. Therefore, the use of optical microresonators for biosensing is mostly limited to strictly defined laboratory conditions, rather than field applications such as early detection of cancer markers in blood or identification of contamination in food. Here, we propose a novel surface modification strategy suitable for, but not limited to, optical microresonator based biosensors, enabling highly selective biosensing with considerable sensitivity as well. Using a robust, silane-based surface coating which is simultaneously protein-resistant and bioconjugable, we demonstrate that it becomes possible to perform biosensing within complex media without compromising the sensitivity or reliability of the measurement. Functionalized microtoroids are successfully shown to resist non-specific interactions while simultaneously being used as sensitive biological sensors. This strategy could pave the way for important applications in terms of extending the use of state-of-the-art biosensors to problems similar to the aforementioned.

  19. Characterizing heterogeneous cellular responses to perturbations.

    PubMed

    Slack, Michael D; Martinez, Elisabeth D; Wu, Lani F; Altschuler, Steven J

    2008-12-09

    Cellular populations have been widely observed to respond heterogeneously to perturbation. However, interpreting the observed heterogeneity is an extremely challenging problem because of the complexity of possible cellular phenotypes, the large dimension of potential perturbations, and the lack of methods for separating meaningful biological information from noise. Here, we develop an image-based approach to characterize cellular phenotypes based on patterns of signaling marker colocalization. Heterogeneous cellular populations are characterized as mixtures of phenotypically distinct subpopulations, and responses to perturbations are summarized succinctly as probabilistic redistributions of these mixtures. We apply our method to characterize the heterogeneous responses of cancer cells to a panel of drugs. We find that cells treated with drugs of (dis-)similar mechanism exhibit (dis-)similar patterns of heterogeneity. Despite the observed phenotypic diversity of cells observed within our data, low-complexity models of heterogeneity were sufficient to distinguish most classes of drug mechanism. Our approach offers a computational framework for assessing the complexity of cellular heterogeneity, investigating the degree to which perturbations induce redistributions of a limited, but nontrivial, repertoire of underlying states and revealing functional significance contained within distinct patterns of heterogeneous responses.
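The core idea of characterizing a heterogeneous population as a mixture of phenotypically distinct subpopulations can be illustrated with a generic expectation-maximization fit of a two-component Gaussian mixture (a one-dimensional toy sketch under invented data, not the authors' image-based colocalization method):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical marker readout: a heterogeneous population drawn from two
# phenotypic subpopulations; EM recovers the mixture weights that a
# perturbation would redistribute.
x = np.concatenate([rng.normal(0.0, 1.0, 700), rng.normal(5.0, 1.0, 300)])

# EM for a two-component 1D Gaussian mixture.
w, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 6.0]), np.array([1.0, 1.0])
for _ in range(100):
    # E-step: responsibility of each component for each cell
    pdf = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: update weights, means, and standard deviations
    nk = r.sum(axis=0)
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("mixture weights:", np.round(w, 2))   # close to [0.7, 0.3]
print("component means:", np.round(mu, 2))  # close to [0.0, 5.0]
```

A perturbation response would then be summarized as a shift in the fitted weights `w` rather than as a single population average.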

  20. Learning to Read Against All Odds: Using Precision Reading to Enhance Literacy in Students with Cognitive Impairments, Extreme Academic Deficits, and Severe Social, Emotional, and Psychiatric Problems

    ERIC Educational Resources Information Center

    Freeze, Rick; Cook, Paula

    2005-01-01

    The purpose of this study was to assess the efficacy and practicality of precision reading, a constructive reading intervention, with students with cognitive impairments, extreme academic deficits in reading, and severe social, emotional, and psychiatric problems. As precision reading had shown promise with students with low achievement, learning…

  1. Frontal Electroencephalogram Asymmetry, Salivary Cortisol, and Internalizing Behavior Problems in Young Adults Who Were Born at Extremely Low Birth Weight

    ERIC Educational Resources Information Center

    Schmidt, Louis A.; Miskovic, Vladimir; Boyle, Michael; Saigal, Saroj

    2010-01-01

    The authors examined internalizing behavior problems at middle childhood, adolescence, and young adulthood and brain-based measures of stress vulnerability in 154 right-handed, nonimpaired young adults (M age = 23 years): 71 (30 males, 41 females) born at extremely low birth weight (ELBW; less than 1,000 g) and 83 (35 males, 48 females) controls…

  2. Oral bioavailability of curcumin: problems and advancements.

    PubMed

    Liu, Weidong; Zhai, Yingjie; Heng, Xueyuan; Che, Feng Yuan; Chen, Wenjun; Sun, Dezhong; Zhai, Guangxi

    2016-09-01

Curcumin is a natural compound of Curcuma longa L. and has shown many pharmacological activities, such as anti-inflammatory and anti-oxidant effects, in both preclinical and clinical studies. Moreover, curcumin has hepatoprotective and neuroprotective activities and protects against myocardial infarction. In particular, curcumin has also demonstrated favorable anticancer efficacy. But limiting factors, such as its extremely low oral bioavailability, hamper its application as a therapeutic agent. Therefore, many technologies have been developed and applied to overcome this limitation. This review describes the main physicochemical properties of curcumin and summarizes recent studies in the design and development of oral delivery systems for curcumin to enhance its solubility and oral bioavailability, including liposomes, nanoparticles and polymeric micelles, phospholipid complexes, and microemulsions.

  3. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

The National Space Transportation System (NSTS) is comprised of many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as pen-based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.

  4. [Occupation-specific illnesses in musicians].

    PubMed

    Schuppert, M; Altenmüller, E

    1999-12-01

Performance-related disorders in musicians are most often caused by multiple risk factors. They stem from chronically complex, rapid, and forceful movements that require the highest precision, as well as from poor ergonomic conditions and psychological strain. Predominantly, the musculoskeletal system of the upper extremity and the spine is affected by acute or chronic pain syndromes and neurological disorders. Stage fright and psychological tension frequently generate somatoform disorders and may contribute to the chronification of physical disabilities in musicians. Depending on individual characteristics, the actual professional activity, and the specific instrument, the performance-related risk factors and disorders differ widely. Early and regular prevention clearly contributes to a reduction of medical problems in musicians.

  5. The Need for Hydrologists in the Third World.

    NASA Astrophysics Data System (ADS)

    Sedlar, F.

    2014-12-01

The United Nations estimates that by 2040 there will be 2 billion people living in slums around the world. Though the problems surrounding slums are varied and unique to each location, water is almost always the major concern. From physical issues such as water scarcity, treatment, and flooding to political issues including water privatization and distribution, water manifests itself in many complex, negative, and unconventional ways in the global slums. Although many human rights and aid organizations are already doing important work in slums around the world, I argue that the technical knowledge and perspectives that hydrologists offer, particularly early-career hydrologists, are rare and desperately needed in slums the world over.

  6. Gender, Education, Extremism and Security

    ERIC Educational Resources Information Center

    Davies, Lynn

    2008-01-01

    This paper examines the complex relationships between gender, education, extremism and security. After defining extremism and fundamentalism, it looks first at the relationship of gender to violence generally, before looking specifically at how this plays out in more extremist violence and terrorism. Religious fundamentalism is also shown to have…

  7. Extremal problems for topological indices in combinatorial chemistry.

    PubMed

    Tichy, Robert F; Wagner, Stephan

    2005-09-01

    Topological indices of molecular graphs are related to several physicochemical characteristics; recently, the inverse problem for some of these indices has been studied, and it has some applications in the design of combinatorial libraries for drug discovery. It is thus very natural to study also extremal problems for these indices, i.e., finding graphs having minimal or maximal index. In this paper, these questions will be discussed for three different indices, namely the sigma-index, the c-index and the Z-index, with emphasis on the sigma-index.

  8. Extreme value problems without calculus: a good link with geometry and elementary maths

    NASA Astrophysics Data System (ADS)

    Ganci, Salvatore

    2016-11-01

Some classical examples of problem solving, where an extreme value condition is required, are here considered and/or revisited. The search for non-calculus solutions appears pedagogically useful and intriguing, as shown by a rich literature. A teacher who teaches both maths and physics (as happens in Italian high schools) can find in these kinds of problems a mind-stimulating exercise compared with the standard solution obtained by differential calculus. A good link between the geometric and analytical explanations is thus established.

  9. A place for marriage and family services in employee assistance programs (EAPs): a survey of EAP client problems and needs.

    PubMed

    Shumway, Sterling T; Wampler, Richard S; Dersch, Charette; Arredondo, Rudy

    2004-01-01

    Marriage and family services have not been widely recognized as part of employee assistance programs (EAP), although family and relational problems are widely cited as sources of problems on the job. EAP clients (N = 800, 97% self-referred) indicated how much family, psychological/emotional, drug, alcohol, employment-related, legal, and medical problems troubled them and the need for services in each area. Psychological/emotional (66%) and family (65%) problem areas frequently were rated "considerable" or "extreme." Both areas were rated as "considerable" or "extreme" by 48.6% of participants. In view of the evidence that marriage and family services can be effective with both family and psychological/emotional problems, professionals who are competent to provide such services have much to offer EAP programs.

  10. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    NASA Astrophysics Data System (ADS)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events, as captured by so-called event synchronization, have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floods. Rooted in the analysis of spike train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically - or even dynamically - interrelated) or not, without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in the case of climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctly different type of distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes.
In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two events to be considered potentially related. Both measures are then used to generate climate networks from parts of the satellite-based TRMM precipitation data set at daily resolution covering the Indian and East Asian monsoon domains, respectively, thereby reanalysing previously published results. The obtained spatial patterns of degree densities and local clustering coefficients exhibit marked differences between both similarity measures. Specifically, we demonstrate that there exists a strong relationship between the fraction of extremes occurring at subsequent days and the degree density in the event synchronization based networks, suggesting that the spatial patterns obtained using this approach are strongly affected by the presence of serial dependencies between events. Given that a manual selection of the maximally tolerable delay between two events can be guided by a priori climatological knowledge and even used for systematic testing of different hypotheses on climatic processes underlying the emergence of spatio-temporal patterns of extreme precipitation, our results provide evidence that event coincidence rates are a more appropriate statistical characteristic for similarity assessment and network construction for climate extremes, while results based on event synchronization need to be interpreted with great caution.
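The event coincidence rate used above, which counts an event pair as related only when it falls within a manually chosen maximum delay, can be sketched as follows (a simplified, hypothetical illustration with invented event days, not the TRMM analysis):

```python
import numpy as np

def event_coincidence_rate(a, b, delta_t):
    """Fraction of events in series `a` that have at least one event in
    series `b` within +/- delta_t days (a simple symmetric variant)."""
    a, b = np.asarray(a), np.asarray(b)
    if len(a) == 0:
        return 0.0
    hits = [np.any(np.abs(b - t) <= delta_t) for t in a]
    return float(np.mean(hits))

# Hypothetical extreme-rainfall event days at two grid points; delta_t is
# the climatologically motivated maximum tolerable delay.
site_a = [3, 10, 11, 40, 41, 95]
site_b = [4, 12, 39, 88]
print(event_coincidence_rate(site_a, site_b, delta_t=2))  # 5 of 6 events coincide
```

The explicit `delta_t` parameter is exactly the knob that, per the study, lets a priori climatological knowledge enter the similarity assessment, whereas event synchronization sets the tolerable delay adaptively from the inter-event intervals.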

  11. The application of the statistical theory of extreme values to gust-load problems

    NASA Technical Reports Server (NTRS)

    Press, Harry

    1950-01-01

An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities, both for specific test conditions and for commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given, along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)
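The approach, fitting the Type I (Gumbel) extreme-value distribution to per-flight maxima and extrapolating to rare loads, can be sketched with synthetic data (`scipy.stats.gumbel_r` stands in for the 1950 fitting methods; the load record is invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical gust-load record: 50 flights, each contributing the maximum
# incremental load factor observed on that flight (block maxima).
per_flight_max = rng.gumbel(loc=1.5, scale=0.3, size=50)

# Fit the Gumbel (Type I extreme-value) distribution to the block maxima.
loc, scale = stats.gumbel_r.fit(per_flight_max)

# Predicted load exceeded on average once every 1000 flights.
load_1000 = stats.gumbel_r.ppf(1.0 - 1.0 / 1000.0, loc, scale)
print(f"fitted loc={loc:.2f}, scale={scale:.2f}, 1000-flight load={load_1000:.2f}")
```

The key property exploited in the paper is that maxima of many underlying gust encounters follow this analytic form, so rare-load frequencies can be extrapolated beyond the observed record.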

  12. A Flux-Corrected Transport Based Hydrodynamic Model for the Plasmasphere Refilling Problem following Geomagnetic Storms

    NASA Astrophysics Data System (ADS)

    Chatterjee, K.; Schunk, R. W.

    2017-12-01

The refilling of the plasmasphere following a geomagnetic storm remains one of the longstanding problems in the area of ionosphere-magnetosphere coupling. Both diffusion and hydrodynamic approximations have been adopted for the modeling and solution of this problem. The diffusion approximation neglects the nonlinear inertial term in the momentum equation, so this approximation is not rigorously valid immediately after the storm. Over the last few years, we have developed a hydrodynamic refilling model using the flux-corrected transport method, a numerical method that is extremely well suited to handling nonlinear problems with shocks and discontinuities. The plasma transport equations are solved along 1D closed magnetic field lines that connect conjugate ionospheres, and the model currently includes three ion (H+, O+, He+) and two neutral (O, H) species. In this work, each ion species under consideration has been modeled as two separate streams emanating from the conjugate hemispheres, and the model correctly predicts supersonic ion speeds and the presence of high levels of helium during the early hours of refilling. The ultimate objective of this research is the development of a 3D model for the plasmasphere refilling problem; with additional development, the same methodology can potentially be applied to the study of other complex space plasma coupling problems in closed flux tube geometries. Index Terms: 2447 Modeling and forecasting [IONOSPHERE]; 2753 Numerical modeling [MAGNETOSPHERIC PHYSICS]; 7959 Models [SPACE WEATHER]

  13. Deciphering landscape complexity to predict (non)linear responses to extreme climatic events

    USDA-ARS?s Scientific Manuscript database

    Extreme events are increasing in frequency and magnitude for many landscapes globally. Ecologically, most of the focus on extreme climatic events has been on effects of either short-term pulses (floods, freezes) or long-term drought. Multi-year increases in precipitation are also occurring with litt...

  14. Doing Solar Science With Extreme-ultraviolet and X-ray High Resolution Imaging Spectroscopy

    NASA Astrophysics Data System (ADS)

    Doschek, G. A.

    2005-12-01

    In this talk I will demonstrate how high resolution extreme-ultraviolet (EUV) and/or X-ray imaging spectroscopy can be used to provide unique information for solving several current key problems of the solar atmosphere, e.g., the morphology and reconnection site of solar flares, the structure of the transition region, and coronal heating. I will describe the spectra that already exist relevant to these problems and what the shortcomings of the data are, and how an instrument such as the Extreme-ultraviolet Imaging Spectrometer (EIS) on Solar-B as well as other proposed spectroscopy missions such as NEXUS and RAM will improve on the existing observations. I will discuss a few particularly interesting properties of the spectra and atomic data for highly ionized atoms that are important for the science problems.

  15. An efficient abnormal cervical cell detection system based on multi-instance extreme learning machine

    NASA Astrophysics Data System (ADS)

    Zhao, Lili; Yin, Jianping; Yuan, Lihuan; Liu, Qiang; Li, Kuan; Qiu, Minghui

    2017-07-01

Automatic detection of abnormal cells from cervical smear images is in high demand for the annual diagnosis of women's cervical cancer. For this medical cell recognition problem, there are three different feature sections, namely cytological morphology, nuclear chromatin pathology, and region intensity. The challenges of this problem lie in combining the features and classifying accurately and efficiently. Thus, we propose an efficient abnormal cervical cell detection system based on a multi-instance extreme learning machine (MI-ELM) to deal with the above two questions in one unified framework. MI-ELM is one of the most promising supervised learning classifiers, able to deal with several feature sections and realistic classification problems analytically. Experimental results on the Herlev dataset demonstrate that the proposed method outperforms three traditional methods for two-class classification in terms of higher accuracy and lower runtime.

  16. Inter-examiner classification reliability of Mechanical Diagnosis and Therapy for extremity problems - Systematic review.

    PubMed

    Takasaki, Hiroshi; Okuyama, Kousuke; Rosedale, Richard

    2017-02-01

    Mechanical Diagnosis and Therapy (MDT) is used in the treatment of extremity problems. Classifying clinical problems is one method of providing effective treatment to a target population. Classification reliability is a key factor to determine the precise clinical problem and to direct an appropriate intervention. To explore inter-examiner reliability of the MDT classification for extremity problems in three reliability designs: 1) vignette reliability using surveys with patient vignettes, 2) concurrent reliability, where multiple assessors decide a classification by observing someone's assessment, 3) successive reliability, where multiple assessors independently assess the same patient at different times. Systematic review with data synthesis in a quantitative format. Agreement of MDT subgroups was examined using the Kappa value, with the operational definition of acceptable reliability set at ≥ 0.6. The level of evidence was determined considering the methodological quality of the studies. Six studies were included and all studies met the criteria for high quality. Kappa values for the vignette reliability design (five studies) were ≥ 0.7. There was data from two cohorts in one study for the concurrent reliability design and the Kappa values ranged from 0.45 to 1.0. Kappa values for the successive reliability design (data from three cohorts in one study) were < 0.6. The current review found strong evidence of acceptable inter-examiner reliability of MDT classification for extremity problems in the vignette reliability design, limited evidence of acceptable reliability in the concurrent reliability design and unacceptable reliability in the successive reliability design. Copyright © 2017 Elsevier Ltd. All rights reserved.
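The Kappa values reported above are Cohen's kappa, which corrects raw inter-examiner agreement for the agreement expected by chance; a minimal sketch with hypothetical MDT subgroup labels (the labels and counts are invented for illustration):

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two examiners' classifications."""
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    labels = np.unique(np.concatenate([a, b]))
    p_o = np.mean(a == b)                            # observed agreement
    p_e = sum(np.mean(a == lab) * np.mean(b == lab)  # chance agreement
              for lab in labels)
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical MDT subgroup labels assigned by two examiners to 10 patients.
ex1 = ["derangement", "dysfunction", "derangement", "other", "derangement",
       "dysfunction", "derangement", "other", "derangement", "dysfunction"]
ex2 = ["derangement", "dysfunction", "derangement", "derangement", "derangement",
       "dysfunction", "other", "other", "derangement", "dysfunction"]
print(round(cohens_kappa(ex1, ex2), 2))  # 0.68: above the 0.6 acceptability cutoff
```

Here the observed agreement is 0.8, the chance agreement is 0.38, and kappa = (0.8 - 0.38) / (1 - 0.38) ≈ 0.68, which would count as acceptable under the review's operational definition of kappa ≥ 0.6.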

  17. Cascade phenomenon against subsequent failures in complex networks

    NASA Astrophysics Data System (ADS)

    Jiang, Zhong-Yuan; Liu, Zhi-Quan; He, Xuan; Ma, Jian-Feng

    2018-06-01

    Cascade phenomena may lead to catastrophic disasters which severely imperil network safety or security in various complex systems such as communication networks, power grids, social networks and so on. In some flow-based networks, the load of failed nodes can be redistributed locally to their neighboring nodes to maximally prevent traffic oscillations or large-scale cascading failures. However, in such a local flow redistribution model, a small set of key nodes attacked in succession can result in network collapse. It is therefore a critical problem to effectively find the set of key nodes in the network. To the best of our knowledge, this work is the first to study this problem comprehensively. We first introduce an extra capacity for every node to absorb flow fluctuations from neighbors, and two extra capacity distributions, a degree-based distribution and an average distribution, are employed. Four heuristic key-node discovery methods are presented: High-Degree-First (HDF), Low-Degree-First (LDF), Random, and a Greedy Algorithm (GA). Extensive simulations are performed on both scale-free networks and random networks. The results show that the greedy algorithm can efficiently find the set of key nodes in both scale-free and random networks. Our work studies network robustness against cascading failures from a novel perspective, and the methods and results are useful for network robustness evaluation and protection.
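
    A minimal sketch of a local load-redistribution cascade and a greedy key-node search along the lines described above; the graph, loads, and capacities are hypothetical, and this illustrates the general scheme rather than the authors' exact model.

```python
def cascade(adj, load, capacity, seeds):
    """Fail `seeds`, redistribute each failed node's load to its alive
    neighbors, and return the final set of failed nodes."""
    load = dict(load)                         # work on a copy
    failed, frontier = set(), list(seeds)
    while frontier:
        v = frontier.pop()
        if v in failed:
            continue
        failed.add(v)
        alive = [u for u in adj[v] if u not in failed]
        for u in alive:
            load[u] += load[v] / len(alive)   # local redistribution
        frontier.extend(u for u in alive if load[u] > capacity[u])
    return failed

def greedy_key_nodes(adj, load, capacity, k):
    """Greedily grow the attack set that maximizes the final cascade size."""
    chosen = []
    for _ in range(k):
        best = max((v for v in adj if v not in chosen),
                   key=lambda v: len(cascade(adj, load, capacity, chosen + [v])))
        chosen.append(best)
    return chosen

# Hypothetical star network: unit loads, 20% extra capacity on every node.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
load = {v: 1.0 for v in adj}
capacity = {v: 1.2 for v in adj}
print(greedy_key_nodes(adj, load, capacity, 1))  # → [0]: one attack collapses the star
```

    The greedy step re-simulates the cascade for every candidate node, which is expensive but, as the abstract reports, finds damaging node sets reliably.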

  18. Translation Analysis on Civil Engineering Text Produced by Machine Translator

    NASA Astrophysics Data System (ADS)

    Sutopo, Anam

    2018-02-01

    Translation is essential in communication when people do not share a language. Translation is usually carried out by a person in charge of translating the material, but it can also be done by machine. Machine translation is reflected in programs developed by programmers; one of them is Transtool. Many people have used Transtool to help them solve problems related to translation activities. This paper examines how important the Transtool program is, how effective it is, and what function Transtool serves in human affairs. This study applies qualitative research. The sources of data were documents and informants. This study used documentation and in-depth interviewing as the techniques for collecting data. The collected data were analyzed by using interactive analysis. The results of the study show that, first, the Transtool program is helpful for people in translating civil engineering text and functions as an aid or helper; second, the working of the Transtool software program is effective enough; and third, the translation produced by Transtool is good for short and simple sentences but not readable, not understandable and not accurate for long sentences (compound, complex and compound-complex), though the result is informative. The translated material must be edited by a professional translator.

  19. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  20. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE PAGES

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...

    2015-07-14

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  1. [Rehabilitation of the patients following the endoprosthetic replacement of the joints of the lower extremities].

    PubMed

    Rud, I M; Melnikova, E A; Rassulova, M A; Razumov, A N; Gorelikov, A E

    2017-12-28

    The present article is the analytical review of the literature pertaining to the problem of rehabilitation of the patients following the endoprosthetic replacement of joints of the lower extremities. The relevance of the problem of interest for medical rehabilitation is beyond any doubt. The traditional methods for the rehabilitation of the patients do not always lead to the desired results. The authors discuss in detail the need for and the contemporary approaches to the rehabilitation of the patients who had undergone reconstructive surgery and arthroplasty of the joints of the lower extremities. The pathogenetically-based three-stage algorithm for medical rehabilitation is proposed.

  2. Wildlife as valuable natural resources vs. intolerable pests: A suburban wildlife management model

    USGS Publications Warehouse

    DeStefano, S.; Deblinger, R.D.

    2005-01-01

    Management of wildlife in suburban environments involves a complex set of interactions between both human and wildlife populations. Managers need additional tools, such as models, that can help them assess the status of wildlife populations, devise and apply management programs, and convey this information to other professionals and the public. We present a model that conceptualizes how some wildlife populations can fluctuate between extremely low (rare, threatened, or endangered status) and extremely high (overabundant) numbers over time. Changes in wildlife abundance can induce changes in human perceptions, which continually redefine species as a valuable resource to be protected versus a pest to be controlled. Management programs that incorporate a number of approaches and promote more stable populations of wildlife avoid the problems of the resource versus pest transformation, are less costly to society, and encourage more positive and less negative interactions between humans and wildlife. We present a case example of the beaver Castor canadensis in Massachusetts to illustrate how this model functions and can be applied. © 2005 Springer Science + Business Media, Inc.

  3. Intermittent and sustained periodic windows in networked chaotic Rössler oscillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Zhiwei; Sun, Yong; University of the Chinese Academy of Sciences, Beijing 100049

    The route to chaos (or periodicity) in dynamical systems is one of the fundamental problems. Here, the dynamical behaviors of coupled chaotic Rössler oscillators on complex networks are investigated, and two different types of periodic windows under variation of the coupling strength are found. Under a moderate coupling, the periodic window is intermittent, and the attractors within the window depend extremely sensitively on the initial conditions, coupling parameter, and topology of the network. Therefore, after adding or removing one edge of the network, the periodic attractor can be destroyed and substituted by a chaotic one, or vice versa. In contrast, under an extremely weak coupling, another type of periodic window appears, which depends insensitively on the initial conditions, coupling parameter, and network. It is sustained and unchanged for different types of network structure. It is also found that the phase differences of the oscillators are almost discrete and randomly distributed, except that directly linked oscillators are more likely to have different phases. These dynamical behaviors have also been generally observed in other networked chaotic oscillators.

  4. Current trends in nanobiosensor technology

    PubMed Central

    Wu, Diana; Langer, Robert S

    2014-01-01

    The development of tools and processes used to fabricate, measure, and image nanoscale objects has led to a wide range of work devoted to producing sensors that interact with extremely small numbers (or an extremely small concentration) of analyte molecules. These advances are particularly exciting in the context of biosensing, where the demands for low-concentration detection and high specificity are great. Nanoscale biosensors, or nanobiosensors, provide researchers with an unprecedented level of sensitivity, often down to the single-molecule level. The use of biomolecule-functionalized surfaces can dramatically boost the specificity of the detection system, but can also yield reproducibility problems and increased complexity. Several nanobiosensor architectures based on mechanical devices, optical resonators, functionalized nanoparticles, nanowires, nanotubes, and nanofibers have been demonstrated in the lab. As nanobiosensor technology becomes more refined and reliable, it is likely to make its way from the lab to the clinic, where future lab-on-a-chip devices incorporating an array of nanobiosensors could be used for rapid screening of a wide variety of analytes at low cost using small samples of patient material. PMID:21391305

  5. New perspective on single-radiator multiple-port antennas for adaptive beamforming applications.

    PubMed

    Byun, Gangil; Choo, Hosung

    2017-01-01

    One of the most challenging problems in current antenna engineering is to achieve highly reliable beamforming capabilities in the extremely restricted space of small handheld devices. In this paper, we introduce a new perspective on the single-radiator multiple-port (SRMP) antenna to alter the traditional approach of multiple-antenna arrays for improving beamforming performance with reduced aperture sizes. The major contribution of this paper is to demonstrate the beamforming capability of the SRMP antenna for use as an extremely miniaturized front-end component in more sophisticated beamforming applications. To examine the beamforming capability, the radiation properties and the array factor of the SRMP antenna are theoretically formulated for electromagnetic characterization and are used as complex weights to form adaptive array patterns. Then, its fundamental performance limits are rigorously explored through enumerative studies by varying the dielectric constant of the substrate, and field tests are conducted using beamforming hardware to confirm feasibility. The results demonstrate that the new perspective of the SRMP antenna allows for improved beamforming performance with the ability to maintain consistently smaller aperture sizes compared to traditional multiple-antenna arrays.

  6. A statistical mechanics approach to computing rare transitions in multi-stable turbulent geophysical flows

    NASA Astrophysics Data System (ADS)

    Laurie, J.; Bouchet, F.

    2012-04-01

    Many turbulent flows undergo sporadic random transitions after long periods of apparent statistical stationarity. For instance, paths of the Kuroshio [1], the Earth's magnetic field reversals, atmospheric flows [2], MHD experiments [3], 2D turbulence experiments [4,5], and 3D flows [6] show this kind of behavior. Understanding this phenomenon is extremely difficult due to the complexity, the large number of degrees of freedom, and the non-equilibrium nature of these turbulent flows. It is, however, a key issue for many geophysical problems. A straightforward study of these transitions, through a direct numerical simulation of the governing equations, is nearly always impracticable. This is mainly a complexity problem, due to the large number of degrees of freedom involved for genuine turbulent flows, and the extremely long time between two transitions. In this talk, we consider two-dimensional and geostrophic turbulent models with stochastic forces. We consider regimes where two or more attractors coexist. As an alternative to direct numerical simulation, we propose a non-equilibrium statistical mechanics approach to the computation of this phenomenon. Our strategy is based on large deviation theory [7], derived from a path integral representation of the stochastic process. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable one. Moreover, we also determine the transition rates, and in which cases this most probable trajectory is a typical one. Interestingly, we prove that in the class of models we consider, a mechanism exists for diffusion over sets of connected attractors. For the type of stochastic forces that allows this diffusion, the transition between attractors is not a rare event. It is then very difficult to characterize the flow as bistable. However, for another class of stochastic forces, this diffusion mechanism is prevented, and genuine bistability or multi-stability is observed.
We discuss how these results are probably connected to the long debated existence of multi-stability in the atmosphere and oceans.

  7. Predictability of extremes in non-linear hierarchically organized systems

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Soloviev, A.

    2011-12-01

    Understanding the complexity of the non-linear dynamics of hierarchically organized systems has led to new approaches for assessing the hazard and risk of extreme catastrophic events. In particular, a series of interrelated step-by-step studies of the seismic process, along with its non-stationary though self-organized behaviors, has already led to a reproducible intermediate-term middle-range earthquake forecast/prediction technique that has passed control in forward real-time applications during the last two decades. The observed seismic dynamics prior to and after many mega, great, major, and strong earthquakes demonstrate common features of predictability and diverse behavior in the course of durable phase transitions in the complex hierarchical non-linear system of blocks and faults of the Earth's lithosphere. The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable analytical models, which leads to the widespread practice of their deceptive application. The consequences of underestimation of seismic hazard propagate non-linearly into inflicted underestimation of risk and, eventually, into unexpected societal losses due to earthquakes and associated phenomena (i.e., collapse of buildings, landslides, tsunamis, liquefaction, etc.). The studies aimed at forecast/prediction of extreme events (interpreted as critical transitions) in geophysical and socio-economic systems include: (i) large earthquakes in geophysical systems of the lithosphere's blocks and faults, (ii) starts and ends of economic recessions, (iii) episodes of a sharp increase in the unemployment rate, (iv) surges of homicides in socio-economic systems. These studies are based on a heuristic search of phenomena preceding critical transitions and application of methodologies of pattern recognition of infrequent events.
Any study of rare phenomena of highly complex origin, by its nature, implies using problem-oriented methods whose design breaks the limits of classical statistical or econometric applications. The unambiguously designed forecast/prediction algorithms of the "yes or no" variety analyze the observable quantitative integrals and indicators available up to a given date, then provide an unambiguous answer to the question of whether a critical transition should be expected or not in the next time interval. Since the predictability of an originating non-linear dynamical system is limited in principle, the probabilistic component of forecast/prediction algorithms is represented by the empirical probabilities of alarms, on one side, and failures-to-predict, on the other, estimated on control sets achieved in the retro- and prospective experiments. Predicting in advance is the only decisive test of forecast/predictions, and the relevant on-going experiments are conducted in the case of seismic extremes, recessions, and increases of the unemployment rate. The results achieved in real-time testing continue to be encouraging and confirm the predictability of the extremes.

  8. High-resolution, submicron particle size distribution analysis using gravitational-sweep sedimentation.

    PubMed Central

    Mächtle, W

    1999-01-01

    Sedimentation velocity is a powerful tool for the analysis of complex solutions of macromolecules. However, sample turbidity imposes an upper limit to the size of molecular complexes currently amenable to such analysis. Furthermore, the breadth of the particle size distribution, combined with possible variations in the density of different particles, makes it difficult to analyze extremely complex mixtures. These same problems are faced in the polymer industry, where dispersions of latices, pigments, lacquers, and emulsions must be characterized. There is a rich history of methods developed for the polymer industry finding use in the biochemical sciences. Two such methods are presented. These use analytical ultracentrifugation to determine the density and size distributions for submicron-sized particles. Both methods rely on Stokes' equations to estimate particle size and density, whereas turbidity, corrected using Mie's theory, provides the concentration measurement. The first method uses the sedimentation time in dispersion media of different densities to evaluate the particle density and size distribution. This method works provided the sample is chemically homogeneous. The second method splices together data gathered at different sample concentrations, thus permitting the high-resolution determination of the size distribution of particle diameters ranging from 10 to 3000 nm. By increasing the rotor speed exponentially from 0 to 40,000 rpm over a 1-h period, size distributions may be measured for extremely broadly distributed dispersions. Presented here is a short history of particle size distribution analysis using the ultracentrifuge, along with a description of the newest experimental methods. Several applications of the methods are provided that demonstrate the breadth of their utility, including extensions to samples containing nonspherical and chromophoric particles. PMID:9916040
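
    The role of Stokes' equations in converting sedimentation data to particle size can be sketched as follows; the material constants, velocity, and rotor geometry below are hypothetical illustration values, not those of the described methods.

```python
import math

def stokes_diameter(v, rho_p, rho_f, eta, omega, r):
    """Stokes diameter of a sphere sedimenting at radial velocity v (m/s)
    in a centrifugal field omega**2 * r (which replaces g in Stokes' law)."""
    return math.sqrt(18 * eta * v / ((rho_p - rho_f) * omega ** 2 * r))

omega = 2 * math.pi * 40000 / 60   # rad/s at the 40,000 rpm end of the sweep
d = stokes_diameter(v=1.0e-5,      # hypothetical measured radial velocity, m/s
                    rho_p=1100.0,  # particle density, kg/m^3
                    rho_f=1000.0,  # dispersion medium density, kg/m^3
                    eta=1.0e-3,    # medium viscosity, Pa s
                    omega=omega,
                    r=0.065)       # radial position in the cell, m
print(round(d * 1e9, 1), "nm")    # lands inside the 10-3000 nm range above
```

    The dependence on (rho_p - rho_f) is why the first method, run in media of different densities, can separate the density and size contributions.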

  9. Morphological similarity and ecological overlap in two rotifer species.

    PubMed

    Gabaldón, Carmen; Montero-Pau, Javier; Serra, Manuel; Carmona, María José

    2013-01-01

    Co-occurrence of cryptic species raises theoretically relevant questions regarding their coexistence and ecological similarity. Given their great morphological similitude and close phylogenetic relationship (i.e., niche retention), these species will have similar ecological requirements and are expected to have strong competitive interactions. This raises the problem of finding the mechanisms that may explain the coexistence of cryptic species and challenges the conventional view of coexistence based on niche differentiation. The cryptic species complex of the rotifer Brachionus plicatilis is an excellent model to study these questions and to test hypotheses regarding ecological differentiation. Rotifer species within this complex are filtering zooplankters commonly found inhabiting the same ponds across the Iberian Peninsula and exhibit an extremely similar morphology-some of them being even virtually identical. Here, we explore whether subtle differences in body size and morphology translate into ecological differentiation by comparing two extremely morphologically similar species belonging to this complex: B. plicatilis and B. manjavacas. We focus on three key ecological features related to body size: (1) functional response, expressed by clearance rates; (2) tolerance to starvation, measured by growth and reproduction; and (3) vulnerability to copepod predation, measured by the number of preyed upon neonates. No major differences between B. plicatilis and B. manjavacas were found in the response to these features. Our results demonstrate the existence of a substantial niche overlap, suggesting that the subtle size differences between these two cryptic species are not sufficient to explain their coexistence. 
This lack of evidence for ecological differentiation in the studied biotic niche features is in agreement with the phylogenetic limiting similarity hypothesis but requires a mechanistic explanation of the coexistence of these species not based on differentiation related to biotic niche axes.

  10. Efficient statistically accurate algorithms for the Fokker-Planck equation in large dimensions

    NASA Astrophysics Data System (ADS)

    Chen, Nan; Majda, Andrew J.

    2018-02-01

    Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat tailed highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. Particularly, the parametric method provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace and is therefore computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems. 
It is shown in a stringent set of test problems that the method requires only on the order of O(100) ensembles to successfully recover the highly non-Gaussian transient PDFs in up to 6 dimensions with only small errors.
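
    The Gaussian-mixture idea can be illustrated in one dimension: each ensemble member carries an analytic Gaussian, and the full PDF is their equal-weight mixture. The numbers below are synthetic, and the sketch omits the paper's conditional Gaussian machinery.

```python
import math, random

def mixture_pdf(x, means, variances):
    """Equal-weight Gaussian mixture density at x."""
    k = len(means)
    return sum(math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
               for m, v in zip(means, variances)) / k

random.seed(0)
# Hypothetical conditional means/variances carried by 100 ensemble members;
# the spread of the means gives the mixture fatter-than-Gaussian tails.
means = [random.gauss(0.0, 2.0) for _ in range(100)]
variances = [0.5 + random.random() for _ in range(100)]

# The mixture is a proper PDF: it integrates to one (coarse Riemann check).
grid = [i * 0.05 for i in range(-400, 401)]
mass = sum(mixture_pdf(x, means, variances) for x in grid) * 0.05
print(round(mass, 3))
```

    Because each component Gaussian covers a broad region, a hundred components can already resolve the full density, which is the intuition behind the O(100)-ensemble result.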

  11. Morphological Similarity and Ecological Overlap in Two Rotifer Species

    PubMed Central

    Gabaldón, Carmen; Montero-Pau, Javier; Serra, Manuel; Carmona, María José

    2013-01-01

    Co-occurrence of cryptic species raises theoretically relevant questions regarding their coexistence and ecological similarity. Given their great morphological similitude and close phylogenetic relationship (i.e., niche retention), these species will have similar ecological requirements and are expected to have strong competitive interactions. This raises the problem of finding the mechanisms that may explain the coexistence of cryptic species and challenges the conventional view of coexistence based on niche differentiation. The cryptic species complex of the rotifer Brachionus plicatilis is an excellent model to study these questions and to test hypotheses regarding ecological differentiation. Rotifer species within this complex are filtering zooplankters commonly found inhabiting the same ponds across the Iberian Peninsula and exhibit an extremely similar morphology—some of them being even virtually identical. Here, we explore whether subtle differences in body size and morphology translate into ecological differentiation by comparing two extremely morphologically similar species belonging to this complex: B. plicatilis and B. manjavacas. We focus on three key ecological features related to body size: (1) functional response, expressed by clearance rates; (2) tolerance to starvation, measured by growth and reproduction; and (3) vulnerability to copepod predation, measured by the number of preyed upon neonates. No major differences between B. plicatilis and B. manjavacas were found in the response to these features. Our results demonstrate the existence of a substantial niche overlap, suggesting that the subtle size differences between these two cryptic species are not sufficient to explain their coexistence. 
This lack of evidence for ecological differentiation in the studied biotic niche features is in agreement with the phylogenetic limiting similarity hypothesis but requires a mechanistic explanation of the coexistence of these species not based on differentiation related to biotic niche axes. PMID:23451154

  12. Capturing rogue waves by multi-point statistics

    NASA Astrophysics Data System (ADS)

    Hadjihosseini, A.; Wächter, Matthias; Hoffmann, N. P.; Peinke, J.

    2016-01-01

    As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which, for the first time, allows extreme rogue wave events to be grasped in a highly satisfactory statistical manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities as well as the Fokker-Planck equation itself can be estimated directly from the available observational data. With this stochastic description, surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics.
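
    Estimating a Fokker-Planck equation directly from data, as the abstract describes, amounts to computing conditional moments of increments. The sketch below does this for the drift term on a synthetic Ornstein-Uhlenbeck series; it illustrates the general estimation idea only, not the authors' scale-dependent cascade analysis.

```python
import random

random.seed(1)
dt, theta, sigma = 0.01, 1.0, 0.5

# Synthetic Ornstein-Uhlenbeck series: dx = -theta * x * dt + sigma * dW.
x, series = 0.0, []
for _ in range(200000):
    series.append(x)
    x += -theta * x * dt + sigma * dt ** 0.5 * random.gauss(0, 1)

def drift_at(x0, series, dt, tol=0.05):
    """First Kramers-Moyal coefficient: D1(x0) ~ <dx | x ~ x0> / dt,
    estimated by averaging increments whose starting point lies near x0."""
    incs = [series[i + 1] - series[i]
            for i in range(len(series) - 1) if abs(series[i] - x0) < tol]
    return sum(incs) / (len(incs) * dt)

# The estimate recovers the true drift -theta * x0 = -0.5 up to sampling noise.
print(drift_at(0.5, series, dt))
```

    The diffusion term is obtained the same way from the conditional second moment of the increments; with both coefficients in hand, surrogate series can be generated by integrating the estimated stochastic process.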

  13. Milne problem for non-absorbing medium with extremely anisotropic scattering kernel in the case of specular and diffuse reflecting boundaries

    NASA Astrophysics Data System (ADS)

    Güleçyüz, M. Ç.; Şenyiğit, M.; Ersoy, A.

    2018-01-01

    The Milne problem is studied in one-speed neutron transport theory using the linearly anisotropic scattering kernel which combines forward and backward scatterings (extremely anisotropic scattering) for a non-absorbing medium with specular and diffuse reflection boundary conditions. In order to calculate the extrapolated endpoint for the Milne problem, the Legendre polynomial approximation (PN method) is applied and numerical results are tabulated for selected cases as a function of different degrees of anisotropic scattering. Finally, some results are discussed and compared with existing results in the literature.

  14. Dynamical properties and extremes of Northern Hemisphere climate fields over the past 60 years

    NASA Astrophysics Data System (ADS)

    Faranda, Davide; Messori, Gabriele; Alvarez-Castro, M. Carmen; Yiou, Pascal

    2017-12-01

    Atmospheric dynamics are described by a set of partial differential equations yielding an infinite-dimensional phase space. However, the actual trajectories followed by the system appear to be constrained to a finite-dimensional phase space, i.e. a strange attractor. The dynamical properties of this attractor are difficult to determine due to the complex nature of atmospheric motions. A first step to simplify the problem is to focus on observables which affect - or are linked to phenomena which affect - human welfare and activities, such as sea-level pressure, 2 m temperature, and precipitation frequency. We make use of recent advances in dynamical systems theory to estimate two instantaneous dynamical properties of the above fields for the Northern Hemisphere: local dimension and persistence. We then use these metrics to characterize the seasonality of the different fields and their interplay. We further analyse the large-scale anomaly patterns corresponding to phase-space extremes - namely time steps at which the fields display extremes in their instantaneous dynamical properties. The analysis is based on the NCEP/NCAR reanalysis data, over the period 1948-2013. The results show that (i) despite the high dimensionality of atmospheric dynamics, the Northern Hemisphere sea-level pressure and temperature fields can on average be described by roughly 20 degrees of freedom; (ii) the precipitation field has a higher dimensionality; and (iii) the seasonal forcing modulates the variability of the dynamical indicators and affects the occurrence of phase-space extremes. We further identify a number of robust correlations between the dynamical properties of the different variables.
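
    The instantaneous local dimension used in such analyses can be estimated from exceedances of negative log-distances to a given state: above a high threshold they are approximately exponential, with rate equal to the dimension. The sketch below applies this estimator to a synthetic two-dimensional point cloud; it illustrates the estimator only, not the study's reanalysis computation.

```python
import math, random

def local_dimension(point, states, quantile=0.98):
    """Local dimension at `point`: the reciprocal mean exceedance of
    -log(distance) over a high quantile threshold."""
    g = sorted(-math.log(math.dist(point, s)) for s in states if s != point)
    thresh = g[int(quantile * len(g))]
    exceed = [v - thresh for v in g if v > thresh]
    return len(exceed) / sum(exceed)

random.seed(2)
# For points filling the plane uniformly, the local dimension should be near 2.
cloud = [(random.random(), random.random()) for _ in range(20000)]
print(round(local_dimension((0.5, 0.5), cloud), 2))
```

    Applied to daily sea-level pressure maps, the "point" is one day's field and the "cloud" is the rest of the record, which is how an average dimension of roughly 20 degrees of freedom can emerge from a formally infinite-dimensional system.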

  15. Risk Factors for Lower-Extremity Injuries Among Contemporary Dance Students.

    PubMed

    van Seters, Christine; van Rijn, Rogier M; van Middelkoop, Marienke; Stubbe, Janine H

    2017-10-10

    To determine whether student characteristics, lower-extremity kinematics, and strength are risk factors for sustaining lower-extremity injuries in preprofessional contemporary dancers. Prospective cohort study. Codarts University of the Arts. Forty-five first-year students of Bachelor Dance and Bachelor Dance Teacher. At the beginning of the academic year, the injury history (lower-extremity only) and student characteristics (age, sex, educational program) were assessed using a questionnaire. In addition, lower-extremity kinematics [single-leg squat (SLS)], strength (countermovement jump), and height and weight (body mass index) were measured during a physical performance test. Substantial lower-extremity injuries during the academic year were defined as any problems leading to moderate or severe reductions in training volume or in performance, or complete inability to participate in dance at least once during follow-up, as measured with the Oslo Sports Trauma Research Center (OSTRC) Questionnaire on Health Problems. Injuries were recorded on a monthly basis using a questionnaire. Analyses at the leg level were performed using generalized estimating equations to test the associations between substantial lower-extremity injuries and potential risk factors. The 1-year incidence of lower-extremity injuries was 82.2%. Of these, 51.4% were substantial lower-extremity injuries. Multivariate analyses identified ankle dorsiflexion during the SLS (OR 1.25; 95% confidence interval, 1.03-1.52) as a risk factor for a substantial lower-extremity injury. The findings indicate that contemporary dance students are at high risk for lower-extremity injuries. Therefore, the identified risk factor (ankle dorsiflexion) should be considered for prevention purposes.

  16. Efficient methods and readily customizable libraries for managing complexity of large networks.

    PubMed

    Dogrusoz, Ugur; Karacelik, Alper; Safarli, Ilkin; Balci, Hasan; Dervishi, Leonard; Siper, Metin Can

    2018-01-01

    One common problem in visualizing real-life networks, including biological pathways, is the large size of these networks. Oftentimes, users find themselves facing slow, non-scaling operations due to network size, if not a "hairball" network, hindering effective analysis. One extremely useful method for reducing the complexity of large networks is hierarchical clustering and nesting, with expand-collapse operations applied on demand during analysis. Another such method is hiding currently unnecessary details, to be gradually revealed later on demand. Major challenges when applying complexity reduction operations on large networks include efficiency and maintaining the user's mental map of the drawing. We developed specialized incremental layout methods for preserving a user's mental map while managing the complexity of large networks through expand-collapse and hide-show operations. We also developed open-source JavaScript libraries as plug-ins to the web-based graph visualization library Cytoscape.js to implement these methods as complexity management operations. Through efficient specialized algorithms provided by these extensions, one can collapse or hide desired parts of a network, yielding potentially much smaller networks that are more suitable for interactive visual analysis. This work fills an important gap by making efficient implementations of some already known complexity management techniques freely available to tool developers through a couple of open-source, customizable software libraries, and by introducing heuristics that can be applied with such complexity management techniques to help preserve the user's mental map.
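
    The collapse operation at the heart of such complexity management can be sketched in Python (the libraries themselves are JavaScript plug-ins for Cytoscape.js; the function and node names here are illustrative):

```python
def collapse(edges, cluster, super_node):
    """Collapse every node in `cluster` into `super_node`: boundary
    edges are rerouted to the super node, while internal edges (and
    any duplicates created by rerouting) are dropped."""
    new_edges = set()
    for u, v in edges:
        u = super_node if u in cluster else u
        v = super_node if v in cluster else v
        if u != v:                                  # drop now-internal edges
            new_edges.add((min(u, v), max(u, v)))   # undirected, de-duplicated
    return new_edges

edges = {("a", "b"), ("b", "c"), ("c", "d"), ("a", "d"), ("d", "e")}
collapsed = collapse(edges, cluster={"a", "b", "c"}, super_node="C1")
```

    Expanding is the inverse: the super node is replaced by its stored members and the original boundary edges are restored.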

  17. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions.

    PubMed

    Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith

    2018-01-02

    Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high-dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high-dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history-matching procedure: an efficient method to search high-dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models.
It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
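
    The history-matching step can be sketched with the standard implausibility measure I(x) = |z − f(x)| / sqrt(total variance), discarding parameter values with I(x) ≥ 3. In the sketch below a toy one-parameter model stands in for the emulated Arabidopsis model, and all numbers are arbitrary:

```python
import math

def implausibility(x, z, f, model_var, obs_var):
    """I(x) = |z - f(x)| / sqrt(model variance + observation variance).
    In real history matching, f would be a fast Bayesian emulator and an
    extra code-uncertainty term would enter the denominator."""
    return abs(z - f(x)) / math.sqrt(model_var + obs_var)

f = lambda x: 3.0 * x + 1.0            # toy one-parameter "simulator"
true_x = 0.4
z = f(true_x) + 0.05                   # noisy observation of the output

# One wave of a global grid search: keep only non-implausible values.
candidates = [i / 100 for i in range(101)]
keep = [x for x in candidates
        if implausibility(x, z, f, model_var=0.01, obs_var=0.01) < 3.0]
```

    Successive waves refit the emulator over the surviving region and cut again, shrinking the 32-dimensional space to the acceptable sets described above.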

  18. Phase retrieval from intensity-only data by relative entropy minimization.

    PubMed

    Deming, Ross W

    2007-11-01

    A recursive algorithm, which appears to be new, is presented for estimating the amplitude and phase of a wave field from intensity-only measurements on two or more scan planes at different axial positions. The problem is framed as a nonlinear optimization, in which the angular spectrum of the complex field model is adjusted in order to minimize the relative entropy, or Kullback-Leibler divergence, between the measured and reconstructed intensities. The most common approach to this so-called phase retrieval problem is a variation of the well-known Gerchberg-Saxton algorithm devised by Misell (J. Phys. D6, L6, 1973), which is efficient and extremely simple to implement. The new algorithm has a computational structure that is very similar to Misell's approach, despite the fundamental difference in the optimization criteria used for each. Based upon results from noisy simulated data, the new algorithm appears to be more robust than Misell's approach and to produce better results from low signal-to-noise ratio data. The convergence of the new algorithm is examined.
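
    For comparison, the Gerchberg-Saxton/Misell-style baseline mentioned above alternately imposes the measured magnitude in each plane. A sketch with an FFT standing in for inter-plane propagation (this illustrates the Misell baseline, not the paper's relative-entropy criterion):

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground truth: a random complex field; the FFT stands in for
# propagation between the two scan planes.
truth = rng.normal(size=32) + 1j * rng.normal(size=32)
mag1 = np.abs(truth)                      # plane-1 amplitude (from intensity)
mag2 = np.abs(np.fft.fft(truth))          # plane-2 amplitude

def rel_err(field):
    return np.linalg.norm(np.abs(np.fft.fft(field)) - mag2) / np.linalg.norm(mag2)

# Start from the measured plane-1 amplitude with random phases.
field = mag1 * np.exp(1j * rng.uniform(0, 2 * np.pi, 32))
err0 = rel_err(field)
for _ in range(200):
    spec = np.fft.fft(field)
    spec = mag2 * np.exp(1j * np.angle(spec))    # impose plane-2 magnitude
    field = np.fft.ifft(spec)
    field = mag1 * np.exp(1j * np.angle(field))  # impose plane-1 magnitude
err = rel_err(field)
```

    The paper's algorithm keeps this computational structure but replaces the magnitude-mismatch criterion with the Kullback-Leibler divergence between measured and reconstructed intensities.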

  19. Black holes and fundamental fields: Hair, kicks, and a gravitational Magnus effect

    NASA Astrophysics Data System (ADS)

    Okawa, Hirotada; Cardoso, Vitor

    2014-11-01

    Scalar fields pervade theoretical physics and are a fundamental ingredient in proposed solutions to the dark matter problem, in the Peccei-Quinn mechanism in QCD, and in the string-axiverse scenario. They are also a useful proxy for more complex matter interactions, such as accretion disks or matter in extreme conditions. Here, we study the collision between scalar "clouds" and rotating black holes. For the first time we are able to compare analytic estimates and strong-field, nonlinear numerical calculations for this problem. As the black hole pierces through the cloud it accretes according to the Bondi-Hoyle prediction, but is deflected through a purely kinematic gravitational "anti-Magnus" effect, which we predict to be present also during the interaction of black holes with accretion disks. After the interaction is over, we find large recoil velocities in the transverse direction. The end state of the process belongs to the vacuum Kerr family if the scalar is massless, but can be a hairy black hole when the scalar is massive.

  20. On the matter of synovial fluid lubrication: implications for Metal-on-Metal hip tribology.

    PubMed

    Myant, Connor; Cann, Philippa

    2014-06-01

    Artificial articular joints present an interesting, and difficult, tribological problem. These bearing contacts undergo complex transient loading and multi-axis kinematic cycles over extremely long periods of time (>10 years). Despite extensive research, wear of the bearing surfaces, particularly of metal-on-metal hips, remains a major problem. Comparatively little is known about the prevailing lubrication mechanism in artificial joints; this is a serious gap in our knowledge, as the lubrication mechanism determines film formation and hence wear. In this paper we review the accepted lubrication models for artificial hips and present a new concept to explain film formation with synovial fluid. This model, recently proposed by the authors, suggests that interfacial film formation is determined by rheological changes local to the contact and is driven by aggregation of synovial fluid proteins. The implications of this new mechanism for the tribological performance of new implant designs and the effect of patient synovial fluid properties are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. The infection algorithm: an artificial epidemic approach for dense stereo correspondence.

    PubMed

    Olague, Gustavo; Fernández, Francisco; Pérez, Cynthia B; Lutton, Evelyne

    2006-01-01

    We present a new bio-inspired approach to the problem of stereo image matching, based on an artificial epidemic process that we call the infection algorithm. The problem at hand is a basic one in computer vision for 3D scene reconstruction; it has many complex aspects and is known to be extremely difficult. The aim is to match the contents of two images in order to obtain 3D information that allows the generation of simulated projections from a viewpoint different from those of the initial photographs. This process is known as view synthesis. The algorithm we propose exploits the image contents in order to produce only the necessary 3D depth information, while saving computational time. It is based on a set of distributed rules, which propagate like an artificial epidemic over the images. Experiments on a pair of real images are presented, and realistic reprojected images have been generated.
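
    The propagation idea can be sketched as a breadth-first "epidemic" over the pixel grid, where only seed pixels are matched explicitly and every other pixel inherits a propagated estimate (an illustrative caricature of the rule set, not the authors' algorithm, which refines inherited disparities and computes extra pixels where needed):

```python
from collections import deque

def spread(width, height, seeds):
    """Breadth-first 'epidemic': seed pixels are matched explicitly
    (expensive), every other pixel catches its value from an already
    processed neighbour (cheap)."""
    origin = {s: "computed" for s in seeds}
    frontier = deque(seeds)
    while frontier:
        x, y = frontier.popleft()
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < width and 0 <= nxt[1] < height and nxt not in origin:
                origin[nxt] = "inherited"
                frontier.append(nxt)
    return origin

origin = spread(20, 20, seeds=[(0, 0), (19, 19)])
explicit = sum(1 for v in origin.values() if v == "computed")
```

    The computational saving comes from the ratio of inherited to explicitly computed pixels.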

  2. Multiple applications of ion chromatography oligosaccharide fingerprint profiles to solve a variety of sugar and sugar-biofuel industry problems.

    PubMed

    Eggleston, Gillian; Borges, Eduardo

    2015-03-25

    Sugar crops contain a broad variety of carbohydrates used for human consumption and the production of biofuels and bioproducts. Ion chromatography with integrated pulsed amperometric detection (IC-IPAD) can be used to simultaneously detect mono-, di-, and oligosaccharides, oligosaccharide isomers, mannitol, and ethanol in complex matrices from sugar crops. By utilizing a strong NaOH/NaOAc gradient method over 45 min, oligosaccharides of at least degrees of polymerization (dp) 2-12 can be detected. Fingerprint IC oligosaccharide profiles are extremely selective, sensitive, and reliable and can detect deterioration product metabolites from as few as 100 colony-forming units/mL of lactic acid bacteria. The IC fingerprints can also be used to (i) monitor freeze deterioration, (ii) optimize harvesting methods and cut-to-crush times, (iii) differentiate white refined sugar made from sugar cane from that made from sugar beets, (iv) verify the activities of carbohydrate enzymes, (v) select yeasts for ethanol fermentations, and (vi) isolate and diagnose infections and processing problems in sugar factories.

  3. Generation of intervention strategy for a genetic regulatory network represented by a family of Markov Chains.

    PubMed

    Berlow, Noah; Pal, Ranadip

    2011-01-01

    Genetic Regulatory Networks (GRNs) are frequently modeled as Markov Chains providing the transition probabilities of moving from one state of the network to another. The inverse problem of inferring the Markov Chain from noisy and limited experimental data is ill-posed and often generates multiple model possibilities instead of a unique one. In this article, we address the issue of intervention in a genetic regulatory network represented by a family of Markov Chains. The purpose of intervention is to alter the steady-state probability distribution of the GRN, as the steady states are considered to be representative of the phenotypes. We consider robust stationary control policies with the best expected behavior. The extreme computational complexity involved in the search for robust stationary control policies is mitigated by using a sequential approach to control policy generation and by utilizing computationally efficient techniques for updating the stationary probability distribution of a Markov chain following a rank-one perturbation.
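
    The stationary-distribution machinery underlying such intervention studies can be sketched with power iteration, warm-starting the perturbed chain from the previous solution (a sketch of the efficiency idea only, not the paper's robust control synthesis; the transition matrices are arbitrary):

```python
def stationary(P, start=None, tol=1e-12, max_iter=10000):
    """Stationary distribution of a row-stochastic matrix by power
    iteration, optionally warm-started from a previous distribution."""
    n = len(P)
    pi = list(start) if start else [1.0 / n] * n
    for _ in range(max_iter):
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(new, pi)) < tol:
            return new
        pi = new
    return pi

P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)                 # analytically (5/6, 1/6)

# An intervention changes state 0's outgoing probabilities (a rank-one
# perturbation of P); warm-start the solve from the previous pi.
P2 = [[0.2, 0.8],
      [0.5, 0.5]]
pi2 = stationary(P2, start=pi)     # analytically (5/13, 8/13)
```

    The paper goes further and uses closed-form rank-one update formulas instead of re-iterating, but the warm start already conveys why sequential policy generation is cheap.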

  4. The importance of situation-specific encodings: analysis of a simple connectionist model of letter transposition effects

    NASA Astrophysics Data System (ADS)

    Fang, Shin-Yi; Smith, Garrett; Tabor, Whitney

    2018-04-01

    This paper analyses a three-layer connectionist network that solves a translation-invariance problem, offering a novel explanation for transposed letter effects in word reading. Analysis of the hidden unit encodings provides insight into two central issues in cognitive science: (1) What is the novelty of claims of "modality-specific" encodings? and (2) How can a learning system establish a complex internal structure needed to solve a problem? Although these topics (embodied cognition and learnability) are often treated separately, we find a close relationship between them: modality-specific features help the network discover an abstract encoding by causing it to break the initial symmetries of the hidden units in an effective way. While this neural model is extremely simple compared to the human brain, our results suggest that neural networks need not be black boxes and that carefully examining their encoding behaviours may reveal how they differ from classical ideas about the mind-world relationship.

  5. Sparsity-based super-resolved coherent diffraction imaging of one-dimensional objects.

    PubMed

    Sidorenko, Pavel; Kfir, Ofer; Shechtman, Yoav; Fleischer, Avner; Eldar, Yonina C; Segev, Mordechai; Cohen, Oren

    2015-09-08

    Phase-retrieval problems of one-dimensional (1D) signals are known to suffer from ambiguity that hampers their recovery from measurements of their Fourier magnitude, even when their support (a region that confines the signal) is known. Here we demonstrate sparsity-based coherent diffraction imaging of 1D objects using extreme-ultraviolet radiation produced from high harmonic generation. Using sparsity as prior information removes the ambiguity in many cases and enhances the resolution beyond the physical limit of the microscope. Our approach may be used in a variety of problems, such as diagnostics of defects in microelectronic chips. Importantly, this is the first demonstration of sparsity-based 1D phase retrieval from actual experiments, hence it paves the way for greatly improving the performance of Fourier-based measurement systems where 1D signals are inherent, such as diagnostics of ultrashort laser pulses, deciphering the complex time-dependent response functions (for example, time-dependent permittivity and permeability) from spectral measurements and vice versa.
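
    The 1D ambiguity is easy to exhibit: a signal and its (conjugate) reflection share the same Fourier magnitude, so magnitude-only measurements cannot distinguish them even with known support, which is where a sparsity prior helps:

```python
import numpy as np

# Two different signals with identical Fourier magnitudes: a signal and
# its reflection (the conjugation is trivial here because x is real).
x = np.array([0.0, 1.0, 2.0, 0.5, 0.0, 0.0, 0.0, 0.0])
y = x[::-1].conj()

mag_x = np.abs(np.fft.fft(x))
mag_y = np.abs(np.fft.fft(y))
```

    Reflection only changes the spectral phase, never the magnitude, so both signals are equally consistent with the intensity data.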

  6. The Nature and Characteristics of Youthful Extremism

    ERIC Educational Resources Information Center

    Zubok, Iu. A.; Chuprov, V. I.

    2010-01-01

    Extremism is an acute problem of the present day. Moods of extremism are manifested in all spheres of the life and activities of young people--in education, work, business, political life, and leisure activity. They can be found in both individual and group social self-determination and are influenced by the immediate social environment as well as…

  7. Exact simulation of max-stable processes.

    PubMed

    Dombry, Clément; Engelke, Sebastian; Oesting, Marco

    2016-06-01

    Max-stable processes play an important role as models for spatial extreme events. Their complex structure as the pointwise maximum over an infinite number of random functions makes their simulation difficult. Algorithms based on finite approximations are often inexact and computationally inefficient. We present a new algorithm for exact simulation of a max-stable process at a finite number of locations. It relies on the idea of simulating only the extremal functions, that is, those functions in the construction of a max-stable process that effectively contribute to the pointwise maximum. We further generalize the algorithm by Dieker & Mikosch (2015) for Brown-Resnick processes and use it for exact simulation via the spectral measure. We study the complexity of both algorithms, prove that our new approach via extremal functions is always more efficient, and provide closed-form expressions for their implementation that cover most popular models for max-stable processes and multivariate extreme value distributions. For simulation on dense grids, an adaptive design of the extremal function algorithm is proposed.

  8. Utilization of manual therapy to the lumbar spine in conjunction with traditional conservative care for individuals with bilateral lower extremity complex regional pain syndrome: A case series.

    PubMed

    Walston, Zachary; Hernandez, Luis; Yake, Dale

    2018-06-06

    Conservative therapies for complex regional pain syndrome (CRPS) have traditionally focused on exercise and desensitization techniques targeted at the involved extremity. The primary purpose of this case series is to report on the potential benefit of utilizing manual therapy to the lumbar spine in conjunction with traditional conservative care when treating patients with lower extremity CRPS. Two patients with the diagnosis of lower extremity CRPS were treated with manual therapy to the lumbar spine in conjunction with education, exercise, desensitization, and soft tissue techniques for the extremity. Patient 1 received 13 sessions over 6 weeks, resulting in a 34-point improvement in the Oswestry Disability Index (ODI) and a 35-point improvement in the Lower Extremity Functional Scale (LEFS). Patient 2 received 21 sessions over 12 weeks, resulting in a 28-point improvement in the ODI and a 41-point improvement in the LEFS. Both patients exhibited reductions in pain and clinically meaningful improvements in function. Manual therapy applied to the lumbar spine in these patients, as part of a comprehensive treatment plan, resulted in improved spinal mobility, decreased pain, and a reduction in distal referred symptoms. Although one cannot infer a cause-and-effect relationship from a case series, this report identifies meaningful clinical outcomes potentially associated with manual physical therapy to the lumbar spine for two patients with complex regional pain syndrome type 1.

  9. Multidisciplinary and participatory workshops with stakeholders in a community of extreme poverty in the Peruvian Amazon: Development of priority concerns and potential health, nutrition and education interventions

    PubMed Central

    Casapia, Martin; Joseph, Serene A; Gyorkos, Theresa W

    2007-01-01

    Background Communities of extreme poverty suffer disproportionately from a wide range of adverse outcomes, but are often neglected or underserved by organized services and research attention. In order to target the first Millennium Development Goal of eradicating extreme poverty, thereby reducing health inequalities, participatory research in these communities is needed. Therefore, the purpose of this study was to determine the priority problems and respective potential cost-effective interventions in Belen, a community of extreme poverty in the Peruvian Amazon, using a multidisciplinary and participatory focus. Methods Two multidisciplinary and participatory workshops were conducted with important stakeholders from government, non-government and community organizations, national institutes and academic institutions. In Workshop 1, participants prioritized the main health and health-related problems in the community of Belen. Problem trees were developed to show perceived causes and effects for the top six problems. In Workshop 2, following presentations describing data from recently completed field research in school and household populations of Belen, participants listed potential interventions for the priority problems, including associated barriers, enabling factors, costs and benefits. Results The top ten priority problems in Belen were identified as: 1) infant malnutrition; 2) adolescent pregnancy; 3) diarrhoea; 4) anaemia; 5) parasites; 6) lack of basic sanitation; 7) low level of education; 8) sexually transmitted diseases; 9) domestic violence; and 10) delayed school entry. Causes and effects for the top six problems, proposed interventions, and factors relating to the implementation of interventions were multidisciplinary in nature and included health, nutrition, education, social and environmental issues. Conclusion The two workshops provided valuable insight into the main health and health-related problems facing the community of Belen. 
The participatory focus of the workshops ensured the active involvement of important stakeholders from Belen. Based on the results of the workshops, effective and essential interventions are now being planned which will contribute to reducing health inequalities in the community. PMID:17623093

  10. A cognitive information processing framework for distributed sensor networks

    NASA Astrophysics Data System (ADS)

    Wang, Feiyi; Qi, Hairong

    2004-09-01

    In this paper, we present a cognitive agent framework (CAF) based on swarm intelligence and self-organization principles, and demonstrate it through collaborative processing for target classification in sensor networks. The framework involves integrated designs to provide both cognitive behavior at the organization level, to conquer complexity, and reactive behavior at the individual agent level, to retain simplicity. The design tackles various problems in current information processing systems, including overly complex systems, maintenance difficulties, increasing vulnerability to attack, lack of capability to tolerate faults, and inability to identify and cope with low-frequency patterns. An important point distinguishing the presented work from classical AI research is that the acquired intelligence does not pertain to distinct individuals but to groups. It also deviates from multi-agent systems (MAS) due to the sheer quantity of extremely simple agents we are able to accommodate, to the degree that the loss of some coordination messages and the behavior of faulty/compromised agents will not affect the collective decision made by the group.
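
    The robustness claim — that faulty or compromised agents need not corrupt the group decision — can be illustrated with a plain majority vote over many weak voters (an illustration of the principle only, with arbitrary numbers, not the paper's swarm design):

```python
import random

rng = random.Random(42)
TRUTH = 1                                    # the correct target class

votes = []
for agent in range(200):
    if agent < 40:                           # 20% faulty/compromised: random votes
        votes.append(rng.choice([0, 1]))
    else:                                    # healthy agents: right 80% of the time
        votes.append(TRUTH if rng.random() < 0.8 else 1 - TRUTH)

decision = max(set(votes), key=votes.count)  # collective majority decision
```

    With 160 healthy agents voting correctly 80% of the time, the expected correct-vote count (about 148 of 200) dwarfs the noise the faulty minority can inject.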

  11. Model reduction by trimming for a class of semi-Markov reliability models and the corresponding error bound

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Palumbo, Daniel L.

    1991-01-01

    Semi-Markov processes have proved to be an effective and convenient tool to construct models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a modeling and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement and the error bound easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.

  12. Invariant U2 snRNA nucleotides form a stem loop to recognize the intron early in splicing

    PubMed Central

    Perriman, Rhonda; Ares, Manuel

    2010-01-01

    U2 snRNA-intron branchpoint pairing is a critical step in pre-mRNA recognition by the splicing apparatus, but the mechanism by which these two RNAs engage each other is unknown. Here we identify a new U2 snRNA structure, the branchpoint interaction stem-loop (BSL), that presents the U2 nucleotides that will contact the intron. We provide evidence that the BSL forms prior to interaction with the intron, and is disrupted by the DExD/H protein Prp5p during engagement of the snRNA with the intron. In vitro splicing complex assembly in a BSL-destabilized mutant extract suggests that the BSL is required at a previously unrecognized step between commitment complex and prespliceosome formation. The extreme evolutionary conservation of the BSL suggests it represents an ancient structural solution to the problem of intron branchpoint recognition by dynamic RNA elements that must serve multiple functions at other times during splicing. PMID:20471947

  13. Extreme value modeling for the analysis and prediction of time series of extreme tropospheric ozone levels: a case study.

    PubMed

    Escarela, Gabriel

    2012-06-01

    The occurrence of high concentrations of tropospheric ozone is considered as one of the most important issues of air management programs. The prediction of dangerous ozone levels for the public health and the environment, along with the assessment of air quality control programs aimed at reducing their severity, is of considerable interest to the scientific community and to policy makers. The chemical mechanisms of tropospheric ozone formation are complex, and highly variable meteorological conditions contribute additionally to difficulties in accurate study and prediction of high levels of ozone. Statistical methods offer an effective approach to understand the problem and eventually improve the ability to predict maximum levels of ozone. In this paper an extreme value model is developed to study data sets that consist of periodically collected maxima of tropospheric ozone concentrations and meteorological variables. The methods are applied to daily tropospheric ozone maxima in Guadalajara City, Mexico, for the period January 1997 to December 2006. The model adjusts the daily rate of change in ozone for concurrent impacts of seasonality and present and past meteorological conditions, which include surface temperature, wind speed, wind direction, relative humidity, and ozone. The results indicate that trend, annual effects, and key meteorological variables along with some interactions explain the variation in daily ozone maxima. Prediction performance assessments yield reasonably good results.
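
    A minimal extreme-value building block of such an analysis is fitting a Gumbel distribution (a GEV with zero shape) to a series of maxima. A method-of-moments sketch with synthetic data (the paper's model additionally conditions the parameters on seasonality and meteorology):

```python
import math
import random
import statistics

def fit_gumbel(maxima):
    """Method-of-moments Gumbel fit: scale = s*sqrt(6)/pi and
    location = mean - EulerGamma*scale."""
    scale = statistics.stdev(maxima) * math.sqrt(6) / math.pi
    loc = statistics.fmean(maxima) - 0.5772156649 * scale
    return loc, scale

# Synthetic series of ozone maxima: Gumbel(loc=80, scale=12) via inverse CDF.
rng = random.Random(7)
series = [80.0 - 12.0 * math.log(-math.log(rng.random())) for _ in range(20000)]
loc, scale = fit_gumbel(series)

# Return level exceeded on average once per 100 blocks, implied by the fit.
rl100 = loc - scale * math.log(-math.log(1.0 - 1.0 / 100.0))
```

    Return levels like `rl100` are the quantities of interest for air-quality management: the concentration expected to be exceeded once per given number of periods.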

  14. Distributed neural control of a hexapod walking vehicle

    NASA Technical Reports Server (NTRS)

    Beer, R. D.; Sterling, L. S.; Quinn, R. D.; Chiel, H. J.; Ritzmann, R.

    1989-01-01

    There has been a long-standing interest in the design of controllers for multilegged vehicles. The approach is to apply distributed control to this problem, rather than parallel computation of a centralized algorithm. Researchers describe a distributed neural network controller for hexapod locomotion which is based on the neural control of locomotion in insects. The model considers simplified kinematics with two degrees of freedom per leg, but includes the static stability constraint. Through simulation, it is demonstrated that this controller can generate a continuous range of statically stable gaits at different speeds by varying a single control parameter. In addition, the controller is extremely robust, and can continue to function even after several of its elements have been disabled. Researchers are building a small hexapod robot whose locomotion will be controlled by this network. Researchers intend to extend their model to the dynamic control of legs with more than two degrees of freedom by using data on the control of multisegmented insect legs. Biology also suggests another immediate application of this neural control approach: the escape reflex. Advanced robots are being equipped with tactile sensing and machine vision, so the sensory inputs to the robot controller are vast and complex. Neural networks are ideal for a lower-level safety reflex controller because of their extremely fast response time. The combination of robotics, computer modeling, and neurobiology has been remarkably fruitful, and is likely to lead to deeper insights into the problems of real-time sensorimotor control.
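
    The alternating coordination such a controller produces can be caricatured with two coupled phase oscillators, one per leg tripod, whose coupling stabilizes anti-phase motion; the common frequency plays the role of the single speed parameter (a toy abstraction, not the authors' neural circuit):

```python
import math

def simulate(omega, K=1.0, dt=0.01, steps=4000):
    """Two phase oscillators (one per leg tripod); the coupling term
    vanishes at a phase difference of pi, making anti-phase the stable
    coordination pattern."""
    t1, t2 = 0.1, 0.0                       # start nearly in phase
    for _ in range(steps):
        d1 = omega + K * math.sin(t2 - t1 - math.pi)
        d2 = omega + K * math.sin(t1 - t2 - math.pi)
        t1 += dt * d1
        t2 += dt * d2
    return (t1 - t2) % (2 * math.pi)

diff = simulate(omega=2.0)                  # omega acts as the speed knob
```

    Changing `omega` changes stepping frequency without disturbing the anti-phase relationship, which is the essence of a one-parameter gait continuum.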

  15. Many-Worlds Interpretation of Quantum Theory and Mesoscopic Anthropic Principle

    NASA Astrophysics Data System (ADS)

    Kamenshchik, A. Yu.; Teryaev, O. V.

    2008-10-01

    We suggest combining the Anthropic Principle with the Many-Worlds Interpretation of Quantum Theory. By realizing the multiplicity of worlds, this combination provides an opportunity to explain some important events that are otherwise assumed to be extremely improbable. The Mesoscopic Anthropic Principle suggested here aims to explain the appearance of such events, which are necessary for the emergence of Life and Mind. It is complementary to the Cosmological Anthropic Principle, which explains the fine tuning of fundamental constants. We briefly discuss various possible applications of the Mesoscopic Anthropic Principle, including solar eclipses and the assembly of complex molecules. In addition, we address the problem of Time's Arrow in the framework of the Many-Worlds Interpretation. We suggest a recipe for disentangling quantities defined by fundamental physical laws from those fixed by anthropic selection.

  16. [Coexistence of coeliac disease and inflammatory bowel disease in children].

    PubMed

    Krawiec, Paulina; Pawłowska-Kamieniak, Agnieszka; Pac-Kożuchowska, Elżbieta; Mroczkowska-Juchkiewcz, Agnieszka; Kominek, Katarzyna

    2016-01-01

    Coeliac disease and inflammatory bowel disease are chronic inflammatory conditions of the gastrointestinal tract with complex aetiology, with genetic, environmental, and immunological factors contributing to their pathogenesis. It has been noted that immune-mediated disorders often coexist. There are well-known associations between coeliac disease and type 1 diabetes, and between ulcerative colitis and primary sclerosing cholangitis. However, a growing body of literature suggests an association between coeliac disease and inflammatory bowel disease, particularly ulcerative colitis. This is an extremely rare problem in paediatric gastroenterology. To date, several cases of children with coexisting coeliac disease and inflammatory bowel disease have been reported. Here we present a review of the current literature on the coexistence of coeliac disease and inflammatory bowel disease in children. © 2016 MEDPRESS.

  17. [THE ORGANIZATION OF REHABILITATION CARE OF POPULATION USING INNOVATIVE MEDICAL ORGANIZATIONAL TECHNOLOGIES AND PRINCIPLES OF PUBLIC PRIVATE PARTNERSHIP].

    PubMed

    Totskaia, E G; Sheliakina, O W; Sadovoii, M A; Netchaev, V S

    2015-01-01

    The article considers current problems, at the present stage of health care development, related to the use of innovative approaches to the organization and management of rehabilitation care for the population. Rehabilitation is a most important branch of the medical sector, supporting a complex of services within the closed cycle of rendering medical care to the population and yielding significant social and economic effects. The capital intensity and extreme unprofitability of rehabilitation services make it necessary to search for alternative forms of organization and financing of this type of care, including mechanisms of public-private partnership. Experience is presented of involving the resources of non-public medical organizations to implement public commitments to render high-quality rehabilitation services to the population, using innovative medical and organizational technologies.

  18. pyNS: an open-source framework for 0D haemodynamic modelling.

    PubMed

    Manini, Simone; Antiga, Luca; Botti, Lorenzo; Remuzzi, Andrea

    2015-06-01

    A number of computational approaches have been proposed for the simulation of haemodynamics and vascular wall dynamics in complex vascular networks. Among them, 0D pulse wave propagation methods allow flow and pressure distributions and wall displacements throughout vascular networks to be modelled efficiently at low computational cost. Although several techniques are documented in the literature, the availability of open-source computational tools is still limited. We here present pyNS (python Network Solver), a modular solver framework for 0D problems released under a BSD license as part of the archToolkit ( http://archtk.github.com ). As an application, we describe patient-specific models of the systemic circulation and of the detailed upper extremity for use in the prediction of maturation after surgical creation of vascular access for haemodialysis.
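
    The flavour of a 0D (lumped-parameter) model can be sketched with the classic two-element Windkessel, dP/dt = Q/C − P/(RC), integrated with explicit Euler (pyNS solves full networks of such elements; the parameter values below are arbitrary):

```python
def windkessel(Q, R, C, dt=0.01, steps=20000, P0=0.0):
    """Two-element Windkessel: compliance C charges from inflow Q and
    discharges through peripheral resistance R; with constant inflow
    the pressure relaxes to the steady state P = Q*R."""
    P, history = P0, [P0]
    for _ in range(steps):
        P += dt * (Q / C - P / (R * C))     # explicit Euler step of dP/dt
        history.append(P)
    return history

pressures = windkessel(Q=5.0, R=20.0, C=1.5)   # steady state: Q*R = 100.0
```

    A 0D network model chains many such compartments, with pulsatile inflow, so that pressures and flows throughout the systemic circulation are obtained at negligible cost compared to 3D CFD.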

  19. Determination of molecular spectroscopic parameters and energy-transfer rates by double-resonance spectroscopy

    NASA Technical Reports Server (NTRS)

    Steinfeld, J. I.; Foy, B.; Hetzler, J.; Flannery, C.; Klaassen, J.; Mizugai, Y.; Coy, S.

    1990-01-01

    The spectroscopy of small to medium-size polyatomic molecules can be extremely complex, especially in higher-lying overtone and combination vibrational levels. The high density of levels also complicates the understanding of inelastic collision processes, which is required to model energy transfer and collision broadening of spectral lines. Both of these problems can be addressed by double-resonance spectroscopy, i.e., time-resolved pump-probe measurements using microwave, infrared, near-infrared, and visible-wavelength sources. Information on excited-state spectroscopy, transition moments, inelastic energy transfer rates and propensity rules, and pressure-broadening parameters may be obtained from such experiments. Examples are given for several species of importance in planetary atmospheres, including ozone, silane, ethane, and ammonia.

  20. A Synoptic Weather Typing Approach and Its Application to Assess Climate Change Impacts on Extreme Weather Events at the Local Scale in South-Central Canada

    NASA Astrophysics Data System (ADS)

    Shouquan Cheng, Chad; Li, Qian; Li, Guilong

    2010-05-01

    The synoptic weather typing approach has become popular in evaluating the impacts of climate change on a variety of environmental problems. One reason is its ability to categorize a complex set of meteorological variables as a coherent index, which can facilitate analyses of local climate change impacts. The weather typing method has been successfully applied at Environment Canada in several research projects to analyze climate change impacts on a number of extreme weather events, such as freezing rain, heavy rainfall, high-/low-flow events, air pollution, and human health. These studies comprise three major parts: (1) historical simulation modeling to verify the extreme weather events, (2) statistical downscaling to provide station-scale future hourly/daily climate data, and (3) projections of changes in the frequency and intensity of future extreme weather events in this century. To achieve these goals, in addition to synoptic weather typing, meteorological and hydrological modeling conceptualizations and a number of linear/nonlinear regression techniques were applied. Furthermore, a formal model result verification process was built into each of the three parts of the projects. The verification, based on historical observations of the outcome variables predicted by the models, showed very good agreement. The results from these projects indicate that the frequency and intensity of future extreme weather events are projected to increase significantly under a changing climate in this century. This talk will introduce these research projects and outline the modeling exercise and result verification process. The major findings on future projections from the studies will be summarized in the presentation as well. One major conclusion from the studies is that the procedures used (including synoptic weather typing) are useful for analyzing climate change impacts on future extreme weather events. The projected significant increases in the frequency and intensity of future extreme weather events should be considered when revising engineering infrastructure design standards and developing adaptation strategies and policies.

  1. Exploring regional stakeholder needs and requirements in terms of Extreme Weather Event Attribution

    NASA Astrophysics Data System (ADS)

    Schwab, M.; Meinke, I.; Vanderlinden, J. P.; Touili, N.; Von Storch, H.

    2015-12-01

    Extreme event attribution has received increasing attention in the scientific community. It may also serve decision-making at the regional level, where much of the climate change impact mitigation takes place. Nevertheless, little is known to date about the requirements of regional actors in terms of extreme event attribution. We have therefore analysed these using the example of regional decision-makers engaged in climate change-related activities and/or concerned with storm surge risks at the German Baltic Sea and heat wave risks in the Greater Paris area. In order to explore whether stakeholders find scientific knowledge from extreme event attribution useful and how this information might be relevant to their decision-making, we consulted a diverse set of actors engaged in the assessment, mitigation and communication of storm surge, heat wave, and climate change-related risks. Extreme event attribution knowledge was perceived to be most useful for public and political awareness-raising, but was of little or no relevance to the consulted stakeholders themselves. Contrary to what is sometimes argued in the literature, it was not seen as supporting adaptation planning. The consulted coastal protection, health, and urban adaptation planners needed reliable statements about possible future changes in extreme events rather than causal statements about past events. To enhance salience, a suitable event attribution product should be linked to regional problems, vulnerabilities, and impacts of climate change. Given that the tolerance of uncertainty is rather low, most of the stakeholders also stated that a suitable event attribution product should come from a trusted "honest broker" and should be published later with smaller uncertainties rather than earlier with larger ones. 
Institutional mechanisms, such as regional climate services, which enable and foster communication, translation and mediation across the boundaries between knowledge and action, can help fulfill such requirements. This is of particular importance for extreme event attribution, which is often perceived as science producing complex and abstract information attached to large uncertainties. Such services can act as an interface, creating the necessary mutual understanding through continuous dialogue with both science and stakeholders.

  2. Persisting behavior problems in extremely low birth weight adolescents.

    PubMed

    Taylor, H Gerry; Margevicius, Seunghee; Schluchter, Mark; Andreias, Laura; Hack, Maureen

    2015-04-01

    To describe behavior problems in extremely low birth weight (ELBW, <1000 g) adolescents born 1992 through 1995 based on parent ratings and adolescent self-ratings at age 14 years, and to examine changes in parent ratings from ages 8 to 14. Parent ratings of behavior problems and adolescent self-ratings were obtained for 169 ELBW adolescents (mean birth weight 815 g, gestational age 26 weeks) and 115 normal birth weight (NBW) controls at 14 years. Parent ratings of behavior at age 8 years were also available. Behavior outcomes were assessed using symptom severity scores and rates of scores above DSM-IV symptom cutoffs for clinical disorder. The ELBW group had higher symptom severity scores on parent ratings at age 14 years than NBW controls for inattentive attention-deficit hyperactivity disorder (ADHD), anxiety, and social problems (all p's < .01). Rates of parent ratings meeting DSM-IV symptom criteria for inattentive ADHD were also higher for the ELBW group (12% vs. 1%, p < .01). In contrast, the ELBW group had lower symptom severity scores on self-ratings than controls for several scales. Group differences in parent ratings decreased over time for ADHD, especially among females, but were stable for anxiety and social problems. Extremely low birth weight adolescents continue to have behavior problems similar to those evident at a younger age, but these problems are not evident in behavioral self-ratings. The findings suggest that parent and self-ratings provide contrasting perspectives on behavior problems in ELBW youth and support the need to identify and treat these problems early in childhood.

  3. Extreme-scale Algorithms and Solver Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack

    A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch in such a way that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates, requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement, in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.
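
One family of techniques for resilience beyond checkpoint/restart, mentioned here only as a generic illustration rather than as this project's method, is algorithm-based fault tolerance (ABFT), in which checksums carried through a computation reveal silent errors. A minimal sketch for matrix multiplication:

```python
import numpy as np

def abft_matmul(A, B, tol=1e-8):
    """Multiply A @ B with checksums in the style of Huang-Abraham ABFT.
    Returns the product and whether the checksums validated."""
    # augment A with a checksum row and B with a checksum column
    Ac = np.vstack([A, A.sum(axis=0)])
    Bc = np.hstack([B, B.sum(axis=1, keepdims=True)])
    Cc = Ac @ Bc                       # checksums propagate through the multiply
    C = Cc[:-1, :-1]
    # the last row/column of Cc must equal the column/row sums of C
    row_ok = np.allclose(Cc[-1, :-1], C.sum(axis=0), atol=tol)
    col_ok = np.allclose(Cc[:-1, -1], C.sum(axis=1), atol=tol)
    return C, row_ok and col_ok

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))
C, ok = abft_matmul(A, B)
```

If a silent error corrupts an entry of the product, the affected row and column checksums no longer match, so the multiply can be repeated locally instead of restarting the whole application.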

  4. The burgeoning field of transdisciplinary adaptation research in Quebec (1998-): a climate change-related public health narrative.

    PubMed

    Gosselin, Pierre; Bélanger, Diane; Lapaige, Véronique; Labbé, Yolaine

    2011-01-01

    This paper presents a public health narrative on Quebec's new climatic conditions and human health, and describes the transdisciplinary nature of the climate change adaptation research currently being adopted in Quebec, characterized by the three phases of problem identification, problem investigation, and problem transformation. A transdisciplinary approach is essential for dealing with complex, ill-defined problems concerning human-environment interactions (for example, climate change), as it allows joint research, collective leadership, complex collaborations, and significant exchanges among scientists, decision makers, and knowledge users. Such an approach is widely supported in theory but has proved extremely difficult to implement in practice, and those who attempt it have met with heavy resistance, succeeding when they find the occasional opportunity within institutional or social contexts. In this paper we narrate the ongoing struggle involved in tackling the negative effects of climate change in multi-actor contexts at local and regional levels, a struggle that began in a quiet way in 1998. The paper describes how public health adaptation research is supporting transdisciplinary action and implementation while also preparing for the future, and how this interaction to tackle a life-world problem (adaptation of the Quebec public health sector to climate change) in multi-actor contexts has progressively been established over the last 13 years. The first of the two sections introduces the social context of a Quebec undergoing climate change. Current climatic conditions and expected changes are described, along with the attendant health risks for the Quebec population. The second section addresses the scientific, institutional and normative dimensions of the problem. 
It corresponds to a "public health narrative" presented in three phases: (1) problem identification (1998-2002) beginning in northern Quebec; (2) problem investigation (2002-2006) in which the issues are successively explored, understood, and conceptualized for all of Quebec, and (3) problem transformation (2006-2009), which discusses major interactions among the stakeholders and the presentation of an Action Plan by a central actor, the Quebec government, in alliance with other stakeholders. In conclusion, we underline the importance, in the current context, of providing for a sustained transdisciplinary adaptation to climatic change. This paper should be helpful for (1) public health professionals confronted with establishing a transdisciplinary approach to a real-world problem other than climate change, (2) professionals in other sectors (such as public safety, built environment) confronted with climate change, who wish to implement transdisciplinary adaptive interventions and/or research, and (3) knowledge users (public and private actors; nongovernment organizations; citizens) from elsewhere in multi-contexts/environments/sectors who wish to promote complex collaborations (with us or not), collective leadership, and "transfrontier knowledge-to-action" for implementing climate change-related adaptation measures.

  6. [Injury mechanisms in extreme violence settings].

    PubMed

    Arcaute-Velazquez, Fernando Federico; García-Núñez, Luis Manuel; Noyola-Vilallobos, Héctor Faustino; Espinoza-Mercado, Fernando; Rodríguez-Vega, Carlos Eynar

    2016-01-01

    Extreme violence events are a consequence of current worldwide economic, political and social conditions. The injury patterns found among victims of extreme violence events are very complex, involving several high-energy injury mechanisms. In this article, we present the basic concepts of trauma kinematics that govern the clinical approach to victims of extreme violence events, in the hope that clinicians will broaden their theoretical armamentarium and thereby obtain better outcomes. Copyright © 2016. Published by Masson Doyma México S.A.

  7. Trajectories of psychopathology in extremely low birth weight survivors from early adolescence to adulthood: a 20-year longitudinal study.

    PubMed

    Van Lieshout, Ryan J; Ferro, Mark A; Schmidt, Louis A; Boyle, Michael H; Saigal, Saroj; Morrison, Katherine M; Mathewson, Karen J

    2018-04-18

    Individuals born extremely preterm are exposed to significant perinatal stresses that are associated with an increased risk of psychopathology. However, a paucity of longitudinal studies has prevented the empirical examination of long-term, dynamic effects of perinatal adversity on mental health. Here, internalizing and externalizing problems from adolescence through adulthood were compared in individuals born at extremely low birth weight (ELBW; <1,000 g) and normal birth weight (NBW; >2,500 g). Internalizing and externalizing data were collected over 20 years in three waves, during adolescence, young adulthood, and adulthood. Growth models were used to compare longitudinal trajectories in a geographically based sample of 151 ELBW survivors and 137 NBW control participants born between 1977 and 1982 matched for age, sex, and socioeconomic status at age 8. After adjusting for sex, socioeconomic and immigrant status, and family functioning, ELBW survivors failed to show the normative, age-related decline in internalizing problems over time relative to their NBW peers (β = .21; p < .01). Both groups exhibited small declines in externalizing problems over the same period. Self-esteem (but not physical health, IQ, or maternal mood) partially mediated the association between ELBW status and internalizing problems. Extremely low birth weight survivors experienced a blunting of the expected improvement in depression and anxiety from adolescence to adulthood. These findings suggest that altered physiological regulatory systems supporting emotional and cognitive processing may contribute to the maintenance of internalizing problems in this population. © 2018 Association for Child and Adolescent Mental Health.

  8. Physical exam of the adolescent shoulder: tips for evaluating and diagnosing common shoulder disorders in the adolescent athlete.

    PubMed

    Lazaro, Lionel E; Cordasco, Frank A

    2017-02-01

    In the young athlete, the shoulder is one of the most frequently injured joints during sports activities. The injuries result from either an acute traumatic event or overuse. Shoulder examination can present some challenges, given the multiple joints involved, the difficulty of palpating the underlying structures, and the potential for both intra- and/or extra-articular problems. Many of the shoulder examination tests can be positive for multiple conditions; they usually have high sensitivity but low specificity, and therefore low predictive value. The medical history coupled with a detailed physical exam can usually provide the information necessary to obtain an accurate diagnosis. A proficient shoulder examination and the development of an adequate differential diagnosis are important before considering advanced imaging. The shoulder complex relies upon the integrity of multiple structures for normal function. A detailed history is of paramount importance when evaluating young athletes with shoulder problems. A systematic physical examination is extremely important to guiding an accurate diagnosis. The patient's age and activity level are very important when considering the differential diagnosis. Findings obtained through the history and physical examination should dictate the decision to obtain advanced imaging of the shoulder.

  9. Wavelet-Based Interpolation and Representation of Non-Uniformly Sampled Spacecraft Mission Data

    NASA Technical Reports Server (NTRS)

    Bose, Tamal

    2000-01-01

    A well-documented problem in the analysis of data collected by spacecraft instruments is the need for an accurate, efficient representation of the data set. The data may suffer from several problems, including additive noise, data dropouts, an irregularly spaced sampling grid, and time-delayed sampling. These data irregularities render most traditional signal processing techniques unusable, and thus the data must be interpolated onto an even grid before scientific analysis techniques can be applied. In addition, the extremely large volume of data collected by scientific instrumentation presents many challenging problems in the areas of compression, visualization, and analysis. Therefore, a representation of the data is needed whose structure is conducive to these applications. Wavelet representations of data have already been shown to possess excellent characteristics for compression, data analysis, and imaging. The main goal of this project is to develop a new adaptive filtering algorithm for image restoration and compression. The algorithm should have low computational complexity and a fast convergence rate, making it suitable for real-time applications. It should be able to remove additive noise and reconstruct lost data samples from images.
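
The regridding step described above, interpolating irregularly spaced, dropout-afflicted samples onto an even grid so that standard (e.g. wavelet) tools can be applied, can be sketched with plain linear interpolation. The data here are synthetic, and the project itself targets more sophisticated adaptive and wavelet-based methods; this shows only the basic regridding:

```python
import numpy as np

# irregularly spaced sample times with a dropout gap (synthetic data)
t_irregular = np.sort(np.concatenate([
    np.random.default_rng(1).uniform(0.0, 4.0, 40),
    np.random.default_rng(2).uniform(6.0, 10.0, 40),  # gap in (4, 6) mimics a dropout
]))
x_irregular = np.sin(2 * np.pi * 0.3 * t_irregular)   # toy signal at the sample times

# regrid onto a uniform axis so standard signal processing tools apply
t_uniform = np.linspace(0.0, 10.0, 256)
x_uniform = np.interp(t_uniform, t_irregular, x_irregular)
```

Linear interpolation fills the dropout with a straight segment; a wavelet-based scheme of the kind proposed in the project would instead exploit the signal's multiscale structure to reconstruct the missing samples.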

  10. Simulating variable source problems via post processing of individual particle tallies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.

    2000-10-20

    Monte Carlo is an extremely powerful method of simulating complex, three-dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors, which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable, and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
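
The post-processing idea, recording each particle's source parameters alongside its tally score and later re-evaluating the estimator under a new source distribution via the ratio of new to old source weights, can be sketched as follows. This is a toy illustration with an invented detector response and file format, not the authors' code:

```python
import random

def simulated_tally_file(n, seed=42):
    """Mock 'recorded file': one (source_energy, tally_score) pair per particle.
    Energies drawn uniformly on [0, 10) MeV; the score is a toy response."""
    rng = random.Random(seed)
    records = []
    for _ in range(n):
        E = rng.uniform(0.0, 10.0)
        score = 1.0 / (1.0 + E)        # invented detector response
        records.append((E, score))
    return records

def reweighted_estimate(records, new_pdf, old_pdf=lambda E: 0.1):
    """Re-evaluate the tally under a new source spectrum without re-running
    transport: weight each recorded score by new_pdf(E) / old_pdf(E)."""
    total = sum(score * new_pdf(E) / old_pdf(E) for E, score in records)
    return total / len(records)

records = simulated_tally_file(100000)
# same flat spectrum the run used, and an alternative soft spectrum (< 5 MeV only)
flat = reweighted_estimate(records, new_pdf=lambda E: 0.1)
soft = reweighted_estimate(records, new_pdf=lambda E: 0.2 if E < 5.0 else 0.0)
```

A single transport run with a broad source thus yields estimates for arbitrary alternative spectra in seconds, which is exactly the advantage the abstract describes.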

  11. Towards a Framework for Evolvable Network Design

    NASA Astrophysics Data System (ADS)

    Hassan, Hoda; Eltarras, Ramy; Eltoweissy, Mohamed

    The layered Internet architecture that had long guided network design and protocol engineering was an “interconnection architecture” defining a framework for interconnecting networks rather than a model for generic network structuring and engineering. We claim that the approach of abstracting the network in terms of an internetwork hinders a thorough understanding of the network's salient characteristics and emergent behavior, impeding the design evolution required to address extreme scale, heterogeneity, and complexity. This paper reports on our work in progress that aims to: 1) investigate the problem space in terms of the factors and decisions that influenced the design and development of computer networks; 2) sketch the core principles for designing complex computer networks; and 3) propose a model and related framework for building evolvable, adaptable and self-organizing networks. We adopt a bottom-up strategy, primarily focusing on the building unit of the network model, which we call the “network cell”. The model is inspired by natural complex systems. A network cell is intrinsically capable of specialization, adaptation and evolution. Subsequently, we propose CellNet, a framework for evolvable network design. We outline scenarios for using the CellNet framework to enhance the legacy Internet protocol stack.

  12. Grid Convergence of High Order Methods for Multiscale Complex Unsteady Viscous Compressible Flows

    NASA Technical Reports Server (NTRS)

    Sjoegreen, B.; Yee, H. C.

    2001-01-01

    Grid convergence of several high order methods for the computation of rapidly developing complex unsteady viscous compressible flows with a wide range of physical scales is studied. The recently developed adaptive numerical dissipation control high order methods, referred to as the ACM and wavelet filter schemes, are compared with a fifth-order weighted ENO (WENO) scheme. The two 2-D compressible full Navier-Stokes models considered have no known analytical solutions or experimental data. Fine grid solutions from a standard second-order TVD scheme and a MUSCL scheme with limiters are used as reference solutions. The first model is a 2-D viscous analogue of a shock tube problem which involves complex shock/shear/boundary-layer interactions. The second model is a supersonic reactive flow concerning fuel breakup. The fuel mixing involves circular hydrogen bubbles in air interacting with a planar moving shock wave. Both models contain fine scale structures and are stiff in the sense that, even though the flow unsteadiness develops rapidly, extreme grid refinement and time step restrictions are needed to resolve all the flow scales as well as the chemical reaction scales.
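
Grid-convergence studies of this kind are commonly quantified by the observed order of accuracy computed from solutions on three systematically refined grids. A generic sketch, not tied to the ACM, wavelet, or WENO schemes of the paper, with a synthetic second-order error model:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p from solutions on three grids with a
    constant refinement ratio r: p = ln((f_c - f_m)/(f_m - f_f)) / ln(r)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# synthetic convergence data: f(h) = f_exact + C * h**2, so p should be 2
f_exact, C = 1.0, 0.5
fs = [f_exact + C * h**2 for h in (0.4, 0.2, 0.1)]   # coarse, medium, fine
p = observed_order(*fs)
```

On real unsteady flows the same formula is applied to a scalar functional of the solution (e.g. a shock position or peak value), and a p near the scheme's formal order indicates the grids are in the asymptotic range.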

  13. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    PubMed Central

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  14. Classification of brain MRI with big data and deep 3D convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Wegmayr, Viktor; Aitharaju, Sai; Buhmann, Joachim

    2018-02-01

    Our ever-aging society faces the growing problem of neurodegenerative diseases, in particular dementia. Magnetic resonance imaging provides a unique tool for non-invasive investigation of these brain diseases. However, it is extremely difficult for neurologists to identify complex disease patterns from large amounts of three-dimensional images. In contrast, machine learning excels at automatic pattern recognition from large amounts of data. In particular, deep learning has achieved impressive results in image classification. Unfortunately, its application to medical image classification remains difficult. We consider two reasons for this difficulty: first, volumetric medical image data is considerably scarcer than natural images; second, the complexity of 3D medical images is much higher than that of common 2D images. To address the problem of small data set size, we assemble the largest dataset ever used for training a deep 3D convolutional neural network to classify brain images as healthy (HC), mild cognitive impairment (MCI) or Alzheimer's disease (AD). We use more than 20,000 images from subjects of these three classes, which is almost 9x the size of the previously largest data set. The problem of high dimensionality is addressed by using a deep 3D convolutional neural network, which is state-of-the-art in large-scale image classification. We exploit its ability to process the images directly, with only standard preprocessing and without the need for elaborate feature engineering. Compared to other work, our workflow is considerably simpler, which increases clinical applicability. Accuracy is measured on the ADNI+AIBL data sets and the independent CADDementia benchmark.
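
The basic operation underlying such a network, a volumetric (3D) convolution slid over the image volume, can be sketched framework-free in NumPy. This is illustrative only; a real model of the kind described would stack many such layers, with learned multi-channel kernels, in a deep learning library:

```python
import numpy as np

def conv3d_valid(volume, kernel):
    """Naive single-channel 3D convolution ('valid' mode: no padding, stride 1)."""
    D, H, W = volume.shape
    d, h, w = kernel.shape
    out = np.zeros((D - d + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                # elementwise product of the kernel with one sub-volume
                out[i, j, k] = np.sum(volume[i:i+d, j:j+h, k:k+w] * kernel)
    return out

vol = np.arange(4 * 4 * 4, dtype=float).reshape(4, 4, 4)   # toy 4x4x4 "scan"
mean_kernel = np.full((3, 3, 3), 1.0 / 27)                  # local averaging filter
out = conv3d_valid(vol, mean_kernel)
```

Each output voxel summarizes a local 3D neighborhood, which is why such layers can pick up volumetric disease patterns that 2D slice-wise processing misses.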

  15. Eruptive stratigraphy of the Tatara-San Pedro complex, 36°S, southern volcanic zone, Chilean Andes: reconstruction method and implications for magma evolution at long-lived arc volcanic centers

    USGS Publications Warehouse

    Dungan, M.A.; Wulff, A.; Thompson, R.

    2001-01-01

    The Quaternary Tatara-San Pedro volcanic complex (36°S, Chilean Andes) comprises eight or more unconformity-bound volcanic sequences, representing variably preserved erosional remnants of volcanic centers generated during 930 ky of activity. The internal eruptive histories of several dominantly mafic to intermediate sequences have been reconstructed on the basis of correlations of whole-rock major and trace element chemistry of flows between multiple sampled sections, but with critical contributions from photogrammetric, geochronologic, and paleomagnetic data. Many groups of flows representing discrete eruptive events define internal variation trends that reflect extrusion of heterogeneous or rapidly evolving magma batches from conduit-reservoir systems in which open-system processes typically played a large role. Long-term progressive evolution trends are extremely rare, and the magma compositions of successive eruptive events rarely lie on precisely the same differentiation trend, even where they have evolved from similar parent magmas by similar processes. These observations are not consistent with magma differentiation in large long-lived reservoirs, but they may be accommodated by diverse interactions between newly arrived magma inputs and multiple resident pockets of evolved magma and/or crystal mush residing in conduit-dominated subvolcanic reservoirs. Without constraints provided by the reconstructed stratigraphic relations, the framework for petrologic modeling would be far different. A well-established eruptive stratigraphy may provide independent constraints on the petrologic processes involved in magma evolution, simply on the basis of the specific order in which diverse, broadly cogenetic magmas have been erupted. 
The Tatara-San Pedro complex includes lavas ranging from primitive basalt to high-SiO2 rhyolite, and although the dominant erupted magma type was basaltic andesite (52-55 wt% SiO2), each sequence is characterized by unique proportions of mafic, intermediate, and silicic eruptive products. Intermediate lava compositions also record different evolution paths, both within and between sequences. No systematic long-term pattern is evident from comparisons at the level of sequences. The considerable diversity of mafic and evolved magmas of the Tatara-San Pedro complex bears on interpretations of regional geochemical trends. The variable role of open-system processes in shaping the compositions of evolved Tatara-San Pedro complex magmas, and even some basaltic magmas, leads to the conclusion that addressing problems such as arc magma genesis and elemental fluxes through subduction zones on the basis of averaged or regressed reconnaissance geochemical datasets is a tenuous exercise. Such compositional indices are highly instructive for identifying broad regional trends and first-order problems, but they should be used with extreme caution in attempts to quantify processes and magma sources, including crustal components, implicated in these trends.

  16. Fast solver for large scale eddy current non-destructive evaluation problems

    NASA Astrophysics Data System (ADS)

    Lei, Naiguang

    Eddy current testing plays a very important role in the non-destructive evaluation of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material properties or defects in the test specimen, the induced eddy current paths are perturbed, and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Given the complexity of test specimens and inspection environments, theoretical simulation models are extremely valuable for studying the basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validating automated defect detection systems, since they generate defect signatures that would be expensive to replicate experimentally. In general, modelling methods can be classified into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, these are generally impossible to obtain, largely due to complex sample and defect geometries, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, because large-scale problems are enormously time-consuming, accelerations and fast solvers are needed to make numerical models practical. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. The accuracy of this model is validated via comparison with experimental measurements of steam generator tube wall defects. These simulations, which generate two-dimensional raster scan data, typically take one to two days on a dedicated eight-core PC. A novel direct integral solver for eddy current problems and a GPU-based implementation are also investigated in this research to reduce the computational time.

  17. Learning Human Aspects of Collaborative Software Development

    ERIC Educational Resources Information Center

    Hadar, Irit; Sherman, Sofia; Hazzan, Orit

    2008-01-01

    Collaboration has become increasingly widespread in the software industry as systems have become larger and more complex, adding human complexity to the technological complexity already involved in developing software systems. To deal with this complexity, human-centric software development methods, such as Extreme Programming and other agile…

  18. Nature-Inspired Cognitive Evolution to Play MS. Pac-Man

    NASA Astrophysics Data System (ADS)

    Tan, Tse Guan; Teo, Jason; Anthony, Patricia

    Recent developments in nature-inspired computation have heightened the need for research into its three main application areas: scientific, engineering and industrial. Some of these approaches have been reported to solve dynamic problems and to be very useful for improving the performance of various complex systems. So far, however, there has been little discussion about how effectively these models apply to computer and video games in particular. The focus of this research is to explore the hybridization of nature-inspired computation methods for optimizing neural network-based cognition in video games, in this case the combination of a neural network with an evolutionary algorithm. In essence, a neural network is an attempt to mimic the extremely complex human brain system by building an artificial brain that can learn intelligently on its own. An evolutionary algorithm, on the other hand, simulates the biological evolutionary processes that evolve potential solutions to a problem or task by applying genetic operators such as crossover, mutation and selection. This paper investigates the ability of Evolution Strategies (ES) to evolve a feed-forward artificial neural network's internal parameters (i.e. weight and bias values) to automatically generate Ms. Pac-Man controllers. The main objective of the game is to clear a maze of dots while avoiding the ghosts and to achieve the highest possible score. The experimental results show that an ES-based system can be successfully applied to automatically generate artificial intelligence for a complex, dynamic and highly stochastic video game environment.
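    The ES-plus-neural-network idea can be sketched in miniature. The following is a hedged illustration, not the authors' system: it evolves the flat weight vector of a tiny feed-forward network with a (mu+lambda) Evolution Strategy on a toy XOR task standing in for the game-playing objective; all sizes, the fixed mutation step, and the fitness function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in task: evolve the weights of a tiny feed-forward net to fit XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 4
N_W = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # weights + biases, both layers

def forward(w, X):
    """Feed-forward pass; w is a flat genome of weights and biases."""
    i = 0
    W1 = w[i:i + 2 * N_HIDDEN].reshape(2, N_HIDDEN); i += 2 * N_HIDDEN
    b1 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    W2 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)  # higher is better

# (mu + lambda) Evolution Strategy on the flat weight vector.
MU, LAM, SIGMA = 10, 40, 0.3
pop = rng.standard_normal((MU, N_W))
for gen in range(200):
    parents = pop[rng.integers(0, MU, LAM)]
    children = parents + SIGMA * rng.standard_normal((LAM, N_W))  # mutation
    union = np.vstack([pop, children])                            # elitism
    scores = np.array([fitness(w) for w in union])
    pop = union[np.argsort(scores)[-MU:]]                         # truncation selection

best = pop[-1]
print("best MSE:", -fitness(best))
```

    In the paper's setting, the fitness evaluation would instead be the game score achieved by the controller over one or more Ms. Pac-Man runs.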

  19. Correction of complex foot deformities using the Ilizarov external fixator.

    PubMed

    Kocaoğlu, Mehmet; Eralp, Levent; Atalar, Ata Can; Bilen, F Erkal

    2002-01-01

    There are many drawbacks to conventional approaches to the treatment of complex foot deformities, such as an increased risk of neurovascular injury, soft-tissue injury, and shortening of the foot. An alternative approach that can eliminate these problems is the Ilizarov method. In the current study, a total of 23 deformed feet in 22 patients were treated using the Ilizarov method. The etiologic factors were burn contracture, poliomyelitis, neglected and relapsed clubfoot, trauma, gunshot injury, meningitis, and leg-length discrepancy (LLD). The average age of the patients was 18.2 (5-50) years. The mean duration of fixator application was 5.1 (2-14) months. We performed corrections without an osteotomy in nine feet and with an osteotomy in 14 feet. Additional bony corrective procedures, performed in five separate extremities, included three tibial and one femoral osteotomies for lengthening and deformity correction, and one tibiotalar arthrodesis. At the time of fixator removal, a plantigrade foot was achieved in 21 of the 23 feet by pressure mat analysis. Compared to preoperative status, gait was subjectively improved in all patients. Follow-up time from surgery averaged 25 months (13-38). Pin-tract problems were observed in all cases. Other complications were toe contractures in two feet, metatarsophalangeal subluxation from flexor tendon contractures in one foot, incomplete osteotomy in one foot, residual deformity in two feet, and recurrence of deformity in one foot. Our results indicate that the Ilizarov method is an effective alternative means of correcting complex foot deformities, especially in feet that have previously undergone surgery.

  20. Study of Environmental Data Complexity using Extreme Learning Machine

    NASA Astrophysics Data System (ADS)

    Leuenberger, Michael; Kanevski, Mikhail

    2017-04-01

    The main goals of environmental data science using machine learning algorithms revolve, in a broad sense, around the calibration, the prediction and the visualization of hidden relationships between input and output variables. In order to optimize the models and to understand the phenomenon under study, the characterization of complexity (at different levels) should be taken into account. Therefore, identifying linear or non-linear behavior between input and output variables adds valuable information to our knowledge of the phenomenon's complexity. The present research highlights and investigates the different issues that can occur when identifying the complexity (linear/non-linear) of environmental data using machine learning algorithms. In particular, the main attention is paid to the description of a self-consistent methodology for the use of Extreme Learning Machines (ELM, Huang et al., 2006), which have recently gained great popularity. By applying two ELM models (with linear and non-linear activation functions) and comparing their efficiency, the degree of linearity can be quantified. The considered approach is accompanied by simulated and real high-dimensional, multivariate data case studies. In conclusion, the current challenges and future developments in complexity quantification using environmental data mining are discussed. References - Huang, G.-B., Zhu, Q.-Y., Siew, C.-K., 2006. Extreme learning machine: theory and applications. Neurocomputing 70 (1-3), 489-501. - Kanevski, M., Pozdnoukhov, A., Timonin, V., 2009. Machine Learning for Spatial Environmental Data. EPFL Press, Lausanne, Switzerland, 392 pp. - Leuenberger, M., Kanevski, M., 2015. Extreme Learning Machines for spatial environmental data. Computers and Geosciences 85, 64-73.

  1. A dependence modelling study of extreme rainfall in Madeira Island

    NASA Astrophysics Data System (ADS)

    Gouveia-Reis, Délia; Guerreiro Lopes, Luiz; Mendonça, Sandra

    2016-08-01

    The dependence between variables plays a central role in multivariate extremes. In this paper, spatial dependence in Madeira Island's rainfall data is addressed within an extreme value copula approach through an analysis of annual maximum data. The impact of altitude, slope orientation, distance between rain gauge stations, and distance from the stations to the sea are investigated for two different periods of time. The results obtained highlight the influence of the island's complex topography on the spatial distribution of extreme rainfall.

  2. Study of Pure Proteins, Nucleic Acids and Their Complexes from Halobacteria of the Dead Sea: RNA Polymerase-DNA Interaction.

    DTIC Science & Technology

    1987-09-21

    objectives of our program are to isolate and characterize a fully active DNA-dependent RNA polymerase from the extremely halophilic archaebacteria of the genus...operons in H. marismortui. The Halobacteriaceae are extreme halophiles. They require 3.5 M NaCl for optimal growth and no growth is observed below 2...was difficult to perform due to the extreme genetic instability in this strain (6). In contrast, the genome of the extreme halophilic and prototrophic

  3. Web Based Information System for Job Training Activities Using Personal Extreme Programming (PXP)

    NASA Astrophysics Data System (ADS)

    Asri, S. A.; Sunaya, I. G. A. M.; Rudiastari, E.; Setiawan, W.

    2018-01-01

    Job training is one of the subjects in a university or polytechnic curriculum that involves many users and reporting activities. Time and distance became problems for users in reporting and carrying out obligatory tasks during job training, due to the location where the job training took place. This research developed a web-based job training information system to overcome these problems. The system was developed using Personal Extreme Programming (PXP), an agile method that combines Extreme Programming (XP) and the Personal Software Process (PSP). In testing of the developed information system, 24% of users strongly agreed, 74% agreed, 1% disagreed and 0% strongly disagreed that the system's functionality met their needs.

  4. Identification of cloud fields by the nonparametric algorithm of pattern recognition from normalized video data recorded with the AVHRR instrument

    NASA Astrophysics Data System (ADS)

    Protasov, Konstantin T.; Pushkareva, Tatyana Y.; Artamonov, Evgeny S.

    2002-02-01

    The problem of cloud field recognition from NOAA satellite data is urgent not only for meteorological problems but also for resource-ecological monitoring of the Earth's underlying surface, associated with the detection of thunderstorm clouds, estimation of the liquid water content of clouds and the moisture of the soil, the degree of fire hazard, etc. To solve these problems, we used AVHRR/NOAA video data that regularly imaged the situation in the territory. The complexity and extremely nonstationary character of the problems to be solved call for the use of information from all spectral channels, the mathematical apparatus of testing statistical hypotheses, and methods of pattern recognition and identification of the informative parameters. For a class of detection and pattern recognition problems, the average risk functional is a natural criterion for the quality and the information content of the synthesized decision rules. In this case, to solve efficiently the problem of identifying cloud field types, the informative parameters must be determined by minimization of this functional. Since the conditional probability density functions, representing mathematical models of stochastic patterns, are unknown, the problem of nonparametric reconstruction of distributions from the learning samples arises. To this end, we used nonparametric estimates of distributions with the modified Epanechnikov kernel. The unknown parameters of these distributions were determined by minimization of the risk functional, which for the learning sample was substituted by the empirical risk. After the conditional probability density functions had been reconstructed for the examined hypotheses, the cloudiness type was identified using the Bayes decision rule.
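    The nonparametric step can be sketched as a Parzen-type density estimate with the (unmodified) Epanechnikov kernel for each class, followed by the Bayes decision rule. The one-dimensional feature, the Gaussian class samples, and the bandwidth below are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)

def epanechnikov_kde(x, samples, h):
    """Parzen density estimate with the Epanechnikov kernel K(u) = 0.75(1 - u^2), |u| <= 1."""
    u = (x - samples[:, None]) / h
    k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
    return k.mean(axis=0) / h

# Two synthetic "cloud type" classes described by a 1-D feature (e.g. brightness).
a = rng.normal(0.0, 1.0, 300)   # class A learning sample
b = rng.normal(2.5, 1.0, 300)   # class B learning sample

grid = np.array([0.0, 1.25, 2.5])
pa = epanechnikov_kde(grid, a, h=0.5)
pb = epanechnikov_kde(grid, b, h=0.5)

# Bayes decision rule with equal priors: choose the class of higher density.
decisions = np.where(pa >= pb, "A", "B")
print(list(decisions))
```

    In the paper, the kernel bandwidth plays the role of the unknown distribution parameter tuned by minimizing the empirical risk over the learning sample.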

  5. On extreme points of the diffusion polytope

    DOE PAGES

    Hay, M. J.; Schiff, J.; Fisch, N. J.

    2017-01-04

    Here, we consider a class of diffusion problems defined on simple graphs in which the populations at any two vertices may be averaged if they are connected by an edge. The diffusion polytope is the convex hull of the set of population vectors attainable using finite sequences of these operations. A number of physical problems have linear programming solutions taking the diffusion polytope as the feasible region, e.g. the free energy that can be removed from plasma using waves, so there is a need to describe and enumerate its extreme points. We also review known results for the case of the complete graph Kn, and study a variety of problems for the path graph Pn and the cyclic graph Cn. Finally, we describe the different kinds of extreme points that arise, and identify the diffusion polytope in a number of simple cases. In the case of increasing initial populations on Pn the diffusion polytope is topologically an n-dimensional hypercube.
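    The basic averaging operation is simple to state in code. Below is a minimal sketch on the path graph P3 (vertices 0-1-2, edges (0,1) and (1,2)) with exact rational arithmetic; the initial populations and the operation sequence are arbitrary illustrations. Each operation replaces the two endpoint populations by their mean, so the total population is conserved:

```python
from fractions import Fraction

def average(pop, i, j):
    """Average the populations at vertices i and j (assumed adjacent)."""
    m = (pop[i] + pop[j]) / 2
    out = list(pop)
    out[i] = out[j] = m
    return tuple(out)

start = tuple(Fraction(v) for v in (1, 2, 4))   # increasing initial populations on P3
seq = [(0, 1), (1, 2), (0, 1)]                  # a finite sequence of edge averagings

pop = start
for i, j in seq:
    pop = average(pop, i, j)
print(pop, "total =", sum(pop))
```

    Every such operation is a doubly stochastic averaging (a T-transform), so each attainable vector is majorized by the initial one; the diffusion polytope collects the convex hull of all vectors reachable this way.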

  6. Studying Weather and Climate Extremes in a Non-stationary Framework

    NASA Astrophysics Data System (ADS)

    Wu, Z.

    2010-12-01

    The study of weather and climate extremes often uses the theory of extreme values. Such a detection method has a major problem: to obtain the probability distribution of extremes, one has to implicitly assume that the Earth's climate is stationary over a long period within which the climatology is defined. While such detection makes some sense in a purely statistical view of stationary processes, it can lead to misleading statistical properties of weather and climate extremes caused by long-term climate variability and change, and may also cause enormous difficulty in attributing and predicting these extremes. To alleviate this problem, here we report a novel framework for studying weather and climate extremes in a non-stationary setting. In this new framework, weather and climate extremes are defined as timescale-dependent quantities derived from anomalies with respect to non-stationary climatologies of different timescales. The non-stationary and nonlinear nature of the climate system is thereby taken into account, and the attribution and prediction of weather and climate extremes can then be separated into 1) the change of the statistical properties of the weather and climate extremes themselves and 2) the background climate variability and change. The new framework will use the ensemble empirical mode decomposition (EEMD) method, a recent major improvement of the Hilbert-Huang Transform for time-frequency analysis. Using this tool, we will adaptively decompose various weather and climate data from observations and climate models into components of the various natural timescales contained in the data. With such decompositions, the non-stationary statistical properties (both spatial and temporal) of weather and climate anomalies and of their corresponding climatologies will be analyzed and documented.
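    A full EEMD implementation is too long to sketch here; as a stand-in for the adaptively extracted, slowly varying climatology, the toy below uses a simple centered moving average (an assumption, not the EEMD method itself) to illustrate the key idea of defining extremes relative to a non-stationary climatology rather than a fixed one:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "temperature" series: slow trend + seasonal cycle + noise.
n = 365 * 10
t = np.arange(n)
series = 0.002 * t + 3 * np.sin(2 * np.pi * t / 365) + rng.standard_normal(n)

def rolling_climatology(x, window):
    """Slowly varying climatology via a centered moving average (EEMD stand-in)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

clim = rolling_climatology(series, window=731)   # ~2-year timescale
anom = series - clim                             # timescale-dependent anomaly

# Extremes defined relative to the non-stationary climatology:
thresh = np.quantile(anom, 0.99)
extreme_days = np.flatnonzero(anom > thresh)
print(len(extreme_days), "days exceed the 99th-percentile anomaly")
```

    With EEMD, the climatology would instead be the sum of the slowest intrinsic mode functions, and anomalies could be formed at several timescales at once.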

  7. TLEM 2.0 - a comprehensive musculoskeletal geometry dataset for subject-specific modeling of lower extremity.

    PubMed

    Carbone, V; Fluit, R; Pellikaan, P; van der Krogt, M M; Janssen, D; Damsgaard, M; Vigneron, L; Feilkas, T; Koopman, H F J M; Verdonschot, N

    2015-03-18

    When analyzing complex biomechanical problems such as predicting the effects of orthopedic surgery, subject-specific musculoskeletal models are essential to achieve reliable predictions. The aim of this paper is to present the Twente Lower Extremity Model 2.0, a new comprehensive dataset of the musculoskeletal geometry of the lower extremity, based on medical imaging data and dissection of the right lower extremity of a fresh male cadaver. Bone, muscle and subcutaneous fat (including skin) volumes were segmented from computed tomography and magnetic resonance imaging scans. Inertial parameters were estimated from the image-based segmented volumes. A complete cadaver dissection was performed, in which bony landmarks, attachment sites and lines of action of 55 muscle actuators and 12 ligaments, bony wrapping surfaces, and joint geometry were measured. The obtained musculoskeletal geometry dataset was finally implemented in the AnyBody Modeling System (AnyBody Technology A/S, Aalborg, Denmark), resulting in a model consisting of 12 segments, 11 joints and 21 degrees of freedom, and including 166 muscle-tendon elements for each leg. The new TLEM 2.0 dataset was purposely built to be easily combined with novel image-based scaling techniques, such as bone surface morphing, muscle volume registration and muscle-tendon path identification, in order to obtain subject-specific musculoskeletal models in a quick and accurate way. The complete dataset, including CT and MRI scans and segmented volumes and surfaces, is made available at http://www.utwente.nl/ctw/bw/research/projects/TLEMsafe for the biomechanical community, in order to accelerate the development and adoption of subject-specific models on a large scale. TLEM 2.0 is freely shared for non-commercial use only, under acceptance of the TLEMsafe Research License Agreement. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Analytic Theory and Control of the Motion of Spinning Rigid Bodies

    NASA Technical Reports Server (NTRS)

    Tsiotras, Panagiotis

    1993-01-01

    Numerical simulations are often resorted to in order to understand the attitude response and control characteristics of a rigid body. However, this approach to performing sensitivity and/or error analyses may be prohibitively expensive and time-consuming, especially when a large number of problem parameters are involved. Thus, there is an important role for analytical models in obtaining an understanding of the complex dynamical behavior. In this dissertation, new analytic solutions are derived for the complete attitude motion of spinning rigid bodies, under minimal assumptions. Hence, we obtain the most general solutions reported in the literature so far. Specifically, large external torques and large asymmetries are included in the problem statement. Moreover, problems involving large angular excursions are treated in detail. A new tractable formulation of the kinematics is introduced which proves to be extremely helpful in the search for analytic solutions of the attitude history in such problems. The main utility of the new formulation becomes apparent, however, when searching for feedback control laws for stabilization and/or reorientation of spinning spacecraft. This is an inherently nonlinear problem, where standard linear control techniques fail. We derive a class of control laws for spin axis stabilization of symmetric spacecraft using only two pairs of gas jet actuators. Practically, this could correspond to a spacecraft operating in failure mode, for example. Theoretically, it is also an important control problem which, because of its difficulty, has received little, if any, attention in the literature. The proposed control laws are especially simple and elegant. A feedback control law that achieves arbitrary reorientation of the spacecraft is also derived, using ideas from invariant manifold theory. The significance of this research is twofold.
First, it provides a deeper understanding of the fundamental behavior of rigid bodies subject to body-fixed torques. Assessment of the analytic solutions reveals that they are very accurate; for symmetric bodies the solutions of Euler's equations of motion are, in fact, exact. Second, the results of this research have a fundamental impact on practical scientific and mechanical applications in terms of the analysis and control of all finite-sized rigid bodies ranging from nanomachines to very large bodies, both man made and natural. After all, Euler's equations of motion apply to all physical bodies, barring only the extreme limits of quantum mechanics and relativity.

  9. Changes in extreme events and the potential impacts on human health.

    PubMed

    Bell, Jesse E; Brown, Claudia Langford; Conlon, Kathryn; Herring, Stephanie; Kunkel, Kenneth E; Lawrimore, Jay; Luber, George; Schreck, Carl; Smith, Adam; Uejio, Christopher

    2018-04-01

    Extreme weather and climate-related events affect human health by causing death, injury, and illness, as well as having large socioeconomic impacts. Climate change has caused changes in extreme event frequency, intensity, and geographic distribution, and will continue to be a driver for change in the future. Some of these events include heat waves, droughts, wildfires, dust storms, flooding rains, coastal flooding, storm surges, and hurricanes. The pathways connecting extreme events to health outcomes and economic losses can be diverse and complex. The difficulty in predicting these relationships comes from the local societal and environmental factors that affect disease burden. More information is needed about the impacts of climate change on public health and economies to effectively plan for and adapt to climate change. This paper describes some of the ways extreme events are changing and provides examples of the potential impacts on human health and infrastructure. It also identifies key research gaps to be addressed to improve the resilience of public health to extreme events in the future.

  10. [Introduction of a clinical protocol for extravasation at the National Institute of Oncology, Budapest, Hungary].

    PubMed

    Bartal, Alexandra; Mátrai, Zoltán; Rosta, András; Szûcs, Attila

    2011-03-01

    Extravasation of cytostatics occurs when an infusion containing a cytotoxic drug leaks into the surrounding perivascular and subcutaneous tissues. The incidence of cytostatic extravasation is 0.1-6% according to the literature. Depending on the severity of complications, pain, loss of function in the extremities, or in extreme cases tissue necrosis necessitating amputation may develop, with consequences such as delay or interruption of chemotherapy. The extent of complications is greatly influenced by the type of medication administered, the general condition of the patient, and the professional preparedness of the staff providing the oncological health service. The protocol recently implemented in the National Institute of Oncology is a short, compact guidance for physicians and nurses providing oncological care, so that by quick and adequate management of extravasation cases, severe complications can be prevented. More complex practical guidelines including algorithms could be created as a result of a wider collaboration, with the help of which oncological health professionals could easily cope with this rare problem. The authors describe in their review the implementation of the use of dry warm and cold packs, dimethylsulfoxide and hyaluronidase, and their function within the algorithm of extravasation treatment.

  11. [Driving under the influence of benzodiazepines and antidepressants: prescription and abuse].

    PubMed

    Coutinho, Daniel; Vieira, Duarte Nuno; Teixeira, Helena M

    2011-01-01

    Benzodiazepines are drugs usually used in anxiety disorders, dyssomnias, convulsions, muscle disorders, alcohol and other drug detoxification, as well as in preoperative sedation/amnesia. Antidepressants, in turn, are mainly indicated in depression and as co-therapeutic drugs in other psychiatric disorders. The use of benzodiazepines and antidepressants is associated with some health and public safety problems. Decreased attention, concentration, reflexes, visual capacity, motor coordination and reasoning, together with increased reaction time and lack of awareness of driving impairment among these drug users, contribute to the increased risk to traffic safety linked with these drugs. This risk may further increase with non-compliance with medical prescription, drug abuse or concomitant use of alcohol. The relationship between the use of psychoactive drugs and road traffic safety is, however, an extremely complex subject, and clarifying the role of benzodiazepine and antidepressant effects on driving skills is of primary importance. The prevention of driving under the influence of these drugs depends on awareness, among doctors, of the risks associated with their use. Thus, conscientious medical prescription, as well as providing clear information to patients, is extremely important.

  12. The application of fuzzy Delphi and fuzzy inference system in supplier ranking and selection

    NASA Astrophysics Data System (ADS)

    Tahriri, Farzad; Mousavi, Maryam; Hozhabri Haghighi, Siamak; Zawiah Md Dawal, Siti

    2014-06-01

    In today's highly competitive market, an effective supplier selection process is vital to the success of any manufacturing system. Selecting the appropriate supplier is always a difficult task because suppliers possess varied strengths and weaknesses that necessitate careful evaluation prior to ranking. This is a complex process with many subjective and objective factors to consider before the benefits of supplier selection are achieved. This paper identifies six extremely critical criteria and thirteen sub-criteria based on the literature. A new methodology employing those criteria and sub-criteria is proposed for the assessment and ranking of a given set of suppliers. To handle the subjectivity of the decision maker's assessment, an integration of fuzzy Delphi with a fuzzy inference system has been applied, and a new ranking method is proposed for the supplier selection problem. This supplier selection model enables decision makers to rank the suppliers into three classes: "extremely preferred", "moderately preferred", and "weakly preferred". In addition, within each class, suppliers are ordered from highest final score to lowest. Finally, the methodology is verified and validated through an example on a numerical test bed.
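    The three-class output can be illustrated with a minimal sketch: map a crisp final supplier score in [0, 1] onto the three preference classes via triangular fuzzy membership functions. The membership functions and scores below are invented for illustration and are not the paper's calibrated fuzzy system:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative class definitions over a normalized final score in [0, 1].
CLASSES = {
    "weakly preferred":     (-0.01, 0.0, 0.5),
    "moderately preferred": (0.0,   0.5, 1.0),
    "extremely preferred":  (0.5,   1.0, 1.01),
}

def classify(score):
    """Return the class of maximum membership, plus all memberships."""
    memberships = {name: tri(score, *abc) for name, abc in CLASSES.items()}
    return max(memberships, key=memberships.get), memberships

suppliers = {"S1": 0.82, "S2": 0.46, "S3": 0.15}     # hypothetical final scores
for s in sorted(suppliers, key=suppliers.get, reverse=True):
    label, _ = classify(suppliers[s])
    print(s, suppliers[s], "->", label)
```

    In the paper, the crisp score itself would come out of the fuzzy inference system fed by the Delphi-weighted criteria; here it is simply assumed.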

  13. Detection strategies for extreme mass ratio inspirals

    NASA Astrophysics Data System (ADS)

    Cornish, Neil J.

    2011-05-01

    The capture of compact stellar remnants by galactic black holes provides a unique laboratory for exploring the near-horizon geometry of the Kerr spacetime, or possible departures from general relativity if the central cores prove not to be black holes. The gravitational radiation produced by these extreme mass ratio inspirals (EMRIs) encodes a detailed map of the black hole geometry, and the detection and characterization of these signals is a major scientific goal for the LISA mission. The waveforms produced are very complex, and the signals need to be coherently tracked for tens of thousands of cycles to produce a detection, making EMRI signals one of the most challenging data analysis problems in all of gravitational wave astronomy. Estimates for the number of templates required to perform an exhaustive grid-based matched-filter search for these signals are astronomically large, and far out of reach of current computational resources. Here I describe an alternative approach that employs a hybrid between genetic algorithms and Markov chain Monte Carlo techniques, along with several time-saving techniques for computing the likelihood function. This approach has proven effective at the blind extraction of relatively weak EMRI signals from simulated LISA data sets.
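    The hybrid genetic-algorithm/MCMC machinery is beyond a short sketch, but the underlying matched-filter principle can be shown on a toy problem: recovering the frequency of a weak sinusoid in white noise with a brute-force template grid, which is precisely the approach that becomes astronomically expensive when, as for EMRIs, the parameter space and cycle count are large. All signal parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: a weak sinusoid buried in unit-variance white noise.
n = 4096
t = np.linspace(0.0, 1.0, n, endpoint=False)
f_true = 97.0
data = 0.3 * np.sin(2 * np.pi * f_true * t) + rng.standard_normal(n)

def snr(f):
    """Matched-filter statistic: overlap of the data with a unit-norm template."""
    tmpl = np.sin(2 * np.pi * f * t)
    tmpl /= np.linalg.norm(tmpl)
    return abs(float(data @ tmpl))

# Brute-force template bank over the frequency prior range.
freqs = np.arange(80.0, 120.0, 0.25)
scores = np.array([snr(f) for f in freqs])
best = freqs[np.argmax(scores)]
print("best template frequency:", best)
```

    With one parameter the grid has a few hundred templates; for EMRI waveforms, with many parameters and tens of thousands of cycles to track coherently, the template count explodes, which is what motivates the stochastic genetic-algorithm/MCMC search described above.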

  14. Extreme learning machine: a new alternative for measuring heat collection rate and heat loss coefficient of water-in-glass evacuated tube solar water heaters.

    PubMed

    Liu, Zhijian; Li, Hao; Tang, Xindong; Zhang, Xinyu; Lin, Fan; Cheng, Kewei

    2016-01-01

    Heat collection rate and heat loss coefficient are crucial indicators for the evaluation of in-service water-in-glass evacuated tube solar water heaters. However, their direct determination requires complex detection devices and a series of standard experiments, at considerable cost in time and manpower. To address this problem, we previously used artificial neural networks and support vector machines to develop precise knowledge-based models for predicting the heat collection rates and heat loss coefficients of water-in-glass evacuated tube solar water heaters, setting the properties measured by portable test instruments as the independent variables. Robust software for this determination was also developed. However, in our previous results, the prediction accuracy for heat loss coefficients still lagged behind that for heat collection rates. Also, in practical applications, even a small reduction in root mean square error (RMSE) can sometimes significantly improve evaluation and business processes. As a further study, in this short report, we show that a novel and fast machine learning algorithm, the extreme learning machine, can generate better predictions for the heat loss coefficient, reducing the average RMSE to 0.67 in testing.
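    A basic extreme learning machine regressor is only a few lines, which is much of its speed appeal: training reduces to a single least-squares solve. The sketch below uses hypothetical synthetic data in place of the real tank measurements, with a train/test split to report a test RMSE:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical synthetic stand-in for the measured heater properties:
# 4 input features -> one target (a "heat loss coefficient"-like quantity).
X = rng.uniform(0.0, 1.0, (400, 4))
y = 2.0 * X[:, 0] + np.sin(3 * X[:, 1]) * X[:, 2] + 0.05 * rng.standard_normal(400)

X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]

# ELM: fixed random hidden layer, trained output weights only.
n_hidden = 60
W = rng.standard_normal((4, n_hidden))
b = rng.standard_normal(n_hidden)
hidden = lambda Z: np.tanh(Z @ W + b)

# Training is a single least-squares solve: this is the speed of ELM.
beta, *_ = np.linalg.lstsq(hidden(X_tr), y_tr, rcond=None)
rmse = float(np.sqrt(np.mean((hidden(X_te) @ beta - y_te) ** 2)))
print(f"test RMSE: {rmse:.3f}")
```

    Because only the output weights are fitted, there is no iterative backpropagation, which is why ELM training is typically orders of magnitude faster than training a conventional neural network of the same size.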

  15. New perspective on single-radiator multiple-port antennas for adaptive beamforming applications

    PubMed Central

    Choo, Hosung

    2017-01-01

    One of the most challenging problems in recent antenna engineering is to achieve highly reliable beamforming capabilities in the extremely restricted space of small handheld devices. In this paper, we introduce a new perspective on the single-radiator multiple-port (SRMP) antenna to alter the traditional approach of multiple-antenna arrays, improving beamforming performance with reduced aperture size. The major contribution of this paper is to demonstrate the beamforming capability of the SRMP antenna for use as an extremely miniaturized front-end component in more sophisticated beamforming applications. To examine the beamforming capability, the radiation properties and the array factor of the SRMP antenna are theoretically formulated for electromagnetic characterization and are used as complex weights to form adaptive array patterns. Then, its fundamental performance limits are rigorously explored through enumerative studies varying the dielectric constant of the substrate, and field tests are conducted using beamforming hardware to confirm feasibility. The results demonstrate that the new perspective of the SRMP antenna allows for improved beamforming performance while maintaining consistently smaller aperture sizes compared to traditional multiple-antenna arrays. PMID:29023493

  16. Ethical research as the target of animal extremism: an international problem.

    PubMed

    Conn, P Michael; Rantin, F T

    2010-02-01

    Animal extremism has been increasing worldwide; frequently researchers are the targets of actions by groups with extreme animal rights agendas. Sometimes this targeting is violent and may involve assaults on family members or destruction of property. In this article, we summarize recent events and suggest steps that researchers can take to educate the public on the value of animal research both for people and animals.

  17. Prevalence and psychosocial risk factors of upper extremity musculoskeletal pain in industries of Taiwan: a nationwide study.

    PubMed

    Lee, Hsin-Yi; Yeh, Wen-Yu; Chen, Chun-Wan; Wang, Jung-Der

    2005-07-01

    The prevalence of upper extremity disorders and their associations with psychosocial factors in the workplace have received increasing attention recently. A national survey of cross-sectional design was performed to determine the prevalence rates of upper extremity disorders across different industries. Trained interviewers administered questionnaires to 17,669 workers, and data on musculoskeletal complaints were obtained along with information on risk factors. Overall, the 1-year prevalences of neck (14.8%), shoulder (16.6%), and hand (12.4%) disorders were higher than those of the upper back (7.1%) and elbow (8.3%) among those who sought medical treatment for the complaint. Workers in construction and agriculture-related industries showed a higher prevalence of upper extremity disorders. After multiple logistic regression adjusted for age, education, and employment duration, we found that job content, physical working conditions, harmonious interpersonal relationships at the workplace, and organizational problems were significant determinants of upper extremity disorders in manufacturing and service industries. Male workers in manufacturing industries showed more concern about physical working conditions, while female workers in public administration emphasized problems of job content and interpersonal relationships. We concluded that these factors were major job stressors contributing to musculoskeletal pain of the upper extremity.

  18. Three-dimensional Navier-Stokes simulations of turbine rotor-stator interaction

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    1988-01-01

    Fluid flows within turbomachinery tend to be extremely complex in nature. Understanding such flows is crucial to improving current turbomachinery designs, and the computational approach can be used to great advantage toward that end. A finite difference, unsteady, thin-layer Navier-Stokes approach to calculating the flow within an axial turbine stage is presented. The relative motion between the stator and rotor airfoils is made possible with the use of patched grids that move relative to each other. The calculation includes endwall and tip leakage effects. An introduction to the rotor-stator problem and sample results in the form of time-averaged surface pressures are presented. The numerical data are compared with experimental data, and the agreement between the two is found to be good.

  19. SHARP ENTRYWISE PERTURBATION BOUNDS FOR MARKOV CHAINS.

    PubMed

    Thiede, Erik; VAN Koten, Brian; Weare, Jonathan

    For many Markov chains of practical interest, the invariant distribution is extremely sensitive to perturbations of some entries of the transition matrix but insensitive to others; we give an example of such a chain, motivated by a problem in computational statistical physics. We derive perturbation bounds on the relative error of the invariant distribution that reveal these variations in sensitivity. Our bounds are sharp; we impose no structural assumptions on the transition matrix or on the perturbation, and computing the bounds has the same complexity as computing the invariant distribution or computing other bounds in the literature. Moreover, our bounds have a simple interpretation in terms of hitting times, which can be used to draw intuitive but rigorous conclusions about the sensitivity of a chain to various types of perturbations.
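
    The sensitivity phenomenon is easy to reproduce numerically. In the sketch below (an illustrative nearly uncoupled chain of our own construction, not the computational-statistical-physics example from the paper), an absolute perturbation of only 1e-3 to a single "bridge" entry produces an O(1) relative change in the invariant distribution:

```python
import numpy as np

def invariant(P):
    """Invariant distribution of a row-stochastic matrix P."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])  # eigenvector for eigenvalue 1
    return v / v.sum()

# A nearly uncoupled 4-state chain: two 2-state blocks joined by a weak bridge.
eps = 1e-3
P = np.array([
    [0.5 - eps, 0.5,       eps,       0.0],
    [0.5,       0.5,       0.0,       0.0],
    [eps,       0.0,       0.5 - eps, 0.5],
    [0.0,       0.0,       0.5,       0.5],
])
pi = invariant(P)

# Perturb the tiny bridge entry by just eps = 1e-3 ...
Q = P.copy()
Q[0, 2] += eps     # double the escape rate out of the first block
Q[0, 0] -= eps     # keep the row stochastic
pi_q = invariant(Q)

# ... and the invariant distribution shifts by an O(1) relative amount.
rel_err = np.abs(pi_q - pi) / pi
```

    A same-sized perturbation to one of the 0.5 entries would barely move the invariant distribution, which is exactly the entrywise variation in sensitivity the bounds are designed to capture.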

  20. Forensic analysis of dyed textile fibers.

    PubMed

    Goodpaster, John V; Liszewski, Elisa A

    2009-08-01

    Textile fibers are a key form of trace evidence, and the ability to reliably associate or discriminate them is crucial for forensic scientists worldwide. While microscopic and instrumental analysis can be used to determine the composition of the fiber itself, additional specificity is gained by examining fiber color. This is particularly important when the bulk composition of the fiber is relatively uninformative, as it is with cotton, wool, or other natural fibers. Such analyses pose several problems, including extremely small sample sizes, the desire for nondestructive techniques, and the vast complexity of modern dye compositions. This review will focus on more recent methods for comparing fiber color by using chromatography, spectroscopy, and mass spectrometry. The increasing use of multivariate statistics and other data analysis techniques for the differentiation of spectra from dyed fibers will also be discussed.

  1. Interactive intelligent remote operations: application to space robotics

    NASA Astrophysics Data System (ADS)

    Dupuis, Erick; Gillett, G. R.; Boulanger, Pierre; Edwards, Eric; Lipsett, Michael G.

    1999-11-01

    A set of tools addressing the problems specific to the control and monitoring of remote robotic systems over extreme distances has been developed. The tools include the capability to model and visualize the remote environment, to generate and edit complex task scripts, to execute the scripts in supervisory control mode, and to monitor and diagnose equipment from multiple remote locations. Two prototype systems were implemented for demonstration. The first demonstration, using a prototype joint design called Dexter, shows the applicability of the approach to space robotic operation in low Earth orbit. The second demonstration uses a remotely controlled excavator in an operational open-pit tar sand mine, showing that the tools developed can be used for planetary exploration operations as well as for terrestrial mining applications.

  2. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    NASA Astrophysics Data System (ADS)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based 'Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/MDP/mask/silicon lithography flow. The important potential sources of variation we focus on here originate in VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood of costly long-loop iterations between OPC, MDP, and wafer fabrication flows. We moreover describe how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification to the OPC, choice of mask technology, or judicious design of VSB shots and dose assignment.

  3. The prevalence of foot problems in older women: a cause for concern.

    PubMed

    Dawson, Jill; Thorogood, Margaret; Marks, Sally-Anne; Juszczak, Ed; Dodd, Chris; Lavis, Grahame; Fitzpatrick, Ray

    2002-06-01

    Painful feet are an extremely common problem amongst older women. Such problems increase the risk of falls and hamper mobility. The aetiology of painful and deformed feet is poorly understood. Data were obtained during a pilot case-control study about past high heel usage in women, in relation to osteoarthritis of the knee. A total of 127 women aged 50-70 were interviewed (31 cases, 96 controls); case-control sets were matched for age. The following information was obtained about footwear: (1) age when first wore shoes with heels 1, 2 and 3 inches high; (2) height of heels worn for work; (3) maximum height of heels worn regularly for work, going out socially and for dancing, in 10-year age bands. Information about work-related activities and lifetime occupational history was gathered using a Life-Grid. The interview included a foot inspection. Foot problems, particularly foot arthritis, affected considerably more cases than controls (45 per cent versus 16 per cent, p = 0.001) and were considered a confounder. Cases were therefore excluded from subsequent analyses. Amongst controls, the prevalence of any foot problems was very high (83 per cent). All women had regularly worn 1 inch heels and few (8 per cent) had never worn 2 inch heels. Foot problems were significantly associated with a history of wearing relatively lower heels. Few work activities were related to foot problems; regular lifting was associated with foot pain (p = 0.03). Most women in this age-group have been exposed to high-heeled shoes over many years, making aetiological research difficult in this area. Foot pain and deformities are widespread. The relationship between footwear, occupational activities and foot problems is a complex one that deserves considerably more research.

  4. Choice of optimal working fluid for binary power plants at extremely low temperature brine

    NASA Astrophysics Data System (ADS)

    Tomarov, G. V.; Shipkov, A. A.; Sorokina, E. V.

    2016-12-01

    Problems of geothermal energy development based on binary power plants utilizing low-potential geothermal resources are considered. It is shown that one possible way of increasing the efficiency of heat utilization from geothermal brine over a wide temperature range is the use of multistage power systems with series-connected binary power plants based on incremental primary energy conversion. Some practically significant results of design-analytical investigations are presented concerning the physicochemical properties of various organic substances and their influence on the main parameters of the flowsheet and on the technical and operational characteristics of the heat-mechanical and heat-exchange equipment of a binary power plant operating on extremely low-temperature geothermal brine (70°C). The calculated geothermal brine specific flow rate, net capacity, and other operating characteristics of 2.5 MW binary power plants using various organic substances are of practical interest. It is shown that the choice of working fluid significantly influences the parameters of the flowsheet and the operational characteristics of the binary power plant, and that selecting a working fluid amounts to a search for compromise among the efficiency, safety, and ecology criteria of the plant. For investigations into working fluid selection, it is proposed to plot multiaxis complex diagrams of the relative parameters and characteristics of binary power plants. Examples of plotting and analyzing these diagrams to choose the working fluid, with the efficiency of geothermal brine use taken as the main priority, are given.

  5. A Firefly-Inspired Method for Protein Structure Prediction in Lattice Models

    PubMed Central

    Maher, Brian; Albrecht, Andreas A.; Loomes, Martin; Yang, Xin-She; Steinhöfel, Kathleen

    2014-01-01

    We introduce a Firefly-inspired algorithmic approach for protein structure prediction over two different lattice models in three-dimensional space. In particular, we consider three-dimensional cubic and three-dimensional face-centred-cubic (FCC) lattices. The underlying energy models are the Hydrophobic-Polar (H-P) model, the Miyazawa–Jernigan (M-J) model and a related matrix model. The implementation of our approach is tested on ten H-P benchmark problems of length 48 and ten M-J benchmark problems of lengths ranging from 48 to 61. The key complexity parameter we investigate is the total number of objective function evaluations required to achieve the optimum energy values for the H-P model, or competitive results in comparison to published values for the M-J model. For H-P instances and cubic lattices, where data for comparison are available, we obtain an average speed-up over eight instances of 2.1, leaving out two extreme values (otherwise, 8.8). For six M-J instances, data for comparison are available for cubic lattices and runs with a population size of 100, where, a priori, the minimum free energy is a termination criterion. The average speed-up over four instances is 1.2 (leaving out two extreme values, otherwise 1.1), which is achieved with a population size of only eight. The present study is a test case with initial results for ad hoc parameter settings, with the aim of justifying future research on larger instances within lattice model settings, eventually leading to the ultimate goal of implementations for off-lattice models. PMID:24970205

  6. A firefly-inspired method for protein structure prediction in lattice models.

    PubMed

    Maher, Brian; Albrecht, Andreas A; Loomes, Martin; Yang, Xin-She; Steinhöfel, Kathleen

    2014-01-07

    We introduce a Firefly-inspired algorithmic approach for protein structure prediction over two different lattice models in three-dimensional space. In particular, we consider three-dimensional cubic and three-dimensional face-centred-cubic (FCC) lattices. The underlying energy models are the Hydrophobic-Polar (H-P) model, the Miyazawa-Jernigan (M-J) model and a related matrix model. The implementation of our approach is tested on ten H-P benchmark problems of length 48 and ten M-J benchmark problems of lengths ranging from 48 to 61. The key complexity parameter we investigate is the total number of objective function evaluations required to achieve the optimum energy values for the H-P model, or competitive results in comparison to published values for the M-J model. For H-P instances and cubic lattices, where data for comparison are available, we obtain an average speed-up over eight instances of 2.1, leaving out two extreme values (otherwise, 8.8). For six M-J instances, data for comparison are available for cubic lattices and runs with a population size of 100, where, a priori, the minimum free energy is a termination criterion. The average speed-up over four instances is 1.2 (leaving out two extreme values, otherwise 1.1), which is achieved with a population size of only eight. The present study is a test case with initial results for ad hoc parameter settings, with the aim of justifying future research on larger instances within lattice model settings, eventually leading to the ultimate goal of implementations for off-lattice models.
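
    For concreteness, the H-P objective that such a search minimizes scores -1 for every pair of non-consecutive hydrophobic residues occupying adjacent lattice sites. A minimal sketch of that energy evaluation on the cubic lattice (a toy 4-residue conformation, not one of the benchmark instances):

```python
def hp_energy(sequence, coords):
    """H-P model energy on the cubic lattice: -1 per non-bonded H-H contact.

    sequence: string of 'H'/'P'; coords: integer (x, y, z) positions of a
    self-avoiding walk, one per residue, consecutive residues adjacent.
    """
    assert len(set(coords)) == len(coords), "conformation must be self-avoiding"
    energy = 0
    for i in range(len(coords)):
        for j in range(i + 2, len(coords)):   # skip chain-bonded neighbours
            if sequence[i] == sequence[j] == 'H':
                # Manhattan distance 1 means the residues are lattice neighbours.
                dist = sum(abs(a - b) for a, b in zip(coords[i], coords[j]))
                if dist == 1:
                    energy -= 1
    return energy

# A 4-residue U-shaped fold brings the two terminal H's into contact.
seq = "HPPH"
coords = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
e = hp_energy(seq, coords)   # → -1
```

    The Firefly search then counts how many such objective-function evaluations are needed to reach the known minimum, which is the speed-up metric reported in the abstract.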

  7. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    NASA Astrophysics Data System (ADS)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression, and the copula approach for the detailed analysis of various climate extreme events. In particular, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
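
    The package itself is written in R, but the flavor of "statistics of extremes" it exposes can be sketched in a few lines. The example below is purely illustrative (simulated block maxima and a simple method-of-moments Gumbel fit, not the web-GIS code): fit a distribution to annual maxima, then read off a return level.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated annual maxima: the max of 365 exponential draws per year,
# whose limiting extreme-value law is Gumbel.
annual_max = rng.exponential(scale=10.0, size=(50, 365)).max(axis=1)

# Method-of-moments Gumbel fit (a common first pass in extreme-value work).
euler_gamma = 0.5772156649
beta = annual_max.std() * np.sqrt(6) / np.pi          # scale
mu = annual_max.mean() - euler_gamma * beta           # location

def return_level(T):
    """Level exceeded on average once every T blocks (years)."""
    return mu - beta * np.log(-np.log(1 - 1 / T))

rl100 = return_level(100)   # the "100-year event" magnitude
```

    Time-dependent variants of this analysis let the location and scale drift with covariates such as the year, which is what makes trends in extremes detectable.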

  8. Investigations into Gravitational Wave Emission from Compact Body Inspiral into Massive Black Holes

    NASA Technical Reports Server (NTRS)

    Hughes, Scott A.

    2005-01-01

    In contrast to year 1 (when much of the activity associated with this grant focused upon developing our group at MIT), year 2 was a period of very focused attention on research problems. We made significant progress developing relativistic waveforms for the extreme mass ratio inspiral problem; we have pushed forward a formalism our group developed for mapping the spacetimes of massive compact objects; and, in collaboration with the Caltech group, we began to develop a framework for addressing issues in LISA data analysis for extreme mass ratio systems.

  9. Extreme marginalization: addiction and other mental health disorders, stigma, and imprisonment

    PubMed Central

    Kreek, Mary Jeanne

    2013-01-01

    Major well-defined medical problems that are, in part, the unfortunate outcome of a negative social environment may include specific addictive diseases and other mental health disorders, in particular the affective disorders of anxiety, depression, social phobia, and post-traumatic stress syndrome. This overview touches on the topic of extreme marginalization associated with addiction and other mental health disorders, along with arrest, imprisonment, and parole. All of these are characterized by lasting stigma that hauntingly continues to impact upon each person suffering from any of these problems. PMID:21884162

  10. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  11. Complex Problem Solving: What It Is and What It Is Not

    PubMed Central

    Dörner, Dietrich; Funke, Joachim

    2017-01-01

    Computer-simulated scenarios have been part of psychological research on problem solving for more than 40 years. The shift in emphasis from simple toy problems to complex, more real-life oriented problems has been accompanied by discussions about the best ways to assess the process of solving complex problems. Psychometric issues such as reliable assessments and addressing correlations with other instruments have been in the foreground of these discussions and have left the content validity of complex problem solving in the background. In this paper, we return the focus to content issues and address the important features that define complex problems. PMID:28744242

  12. Impact of Behavioral Inhibition and Parenting Style on Internalizing and Externalizing Problems from Early Childhood through Adolescence

    ERIC Educational Resources Information Center

    Williams, Lela Rankin; Degnan, Kathryn A.; Perez-Edgar, Koraly E.; Henderson, Heather A.; Rubin, Kenneth H.; Pine, Daniel S.; Steinberg, Laurence; Fox, Nathan A.

    2009-01-01

    Behavioral inhibition (BI) is characterized by a pattern of extreme social reticence, risk for internalizing behavior problems, and possible protection against externalizing behavior problems. Parenting style may also contribute to these associations between BI and behavior problems (BP). A sample of 113 children was assessed for BI in the…

  13. Why do people buy dogs with potential welfare problems related to extreme conformation and inherited disease? A representative study of Danish owners of four small dog breeds

    PubMed Central

    Kondrup, S. V.; Bennett, P. C.; Forkman, B.; Meyer, I.; Proschowsky, H. F.; Serpell, J. A.; Lund, T. B.

    2017-01-01

    A number of dog breeds suffer from welfare problems due to extreme phenotypes and high levels of inherited diseases but the popularity of such breeds is not declining. Using a survey of owners of two popular breeds with extreme physical features (French Bulldog and Chihuahua), one with a high load of inherited diseases not directly related to conformation (Cavalier King Charles Spaniel), and one representing the same size range but without extreme conformation and with the same level of disease as the overall dog population (Cairn Terrier), we investigated this seeming paradox. We examined planning and motivational factors behind acquisition of the dogs, and whether levels of experienced health and behavior problems were associated with the quality of the owner-dog relationship and the intention to re-procure a dog of the same breed. Owners of each of the four breeds (750/breed) were randomly drawn from a nationwide Danish dog registry and invited to participate. Of these, 911 responded, giving a final sample of 846. There were clear differences between owners of the four breeds with respect to degree of planning prior to purchase, with owners of Chihuahuas exhibiting less. Motivations behind choice of dog were also different. Health and other breed attributes were more important to owners of Cairn Terriers, whereas the dog’s personality was reported to be more important for owners of French Bulldogs and Cavalier King Charles Spaniels but less important for Chihuahua owners. Higher levels of health and behavior problems were positively associated with a closer owner-dog relationship for owners of Cavalier King Charles Spaniels and Chihuahuas but, for owners of French Bulldogs, high levels of problems were negatively associated with an intention to procure the same breed again. In light of these findings, it appears less paradoxical that people continue to buy dogs with welfare problems. PMID:28234931

  14. Why do people buy dogs with potential welfare problems related to extreme conformation and inherited disease? A representative study of Danish owners of four small dog breeds.

    PubMed

    Sandøe, P; Kondrup, S V; Bennett, P C; Forkman, B; Meyer, I; Proschowsky, H F; Serpell, J A; Lund, T B

    2017-01-01

    A number of dog breeds suffer from welfare problems due to extreme phenotypes and high levels of inherited diseases but the popularity of such breeds is not declining. Using a survey of owners of two popular breeds with extreme physical features (French Bulldog and Chihuahua), one with a high load of inherited diseases not directly related to conformation (Cavalier King Charles Spaniel), and one representing the same size range but without extreme conformation and with the same level of disease as the overall dog population (Cairn Terrier), we investigated this seeming paradox. We examined planning and motivational factors behind acquisition of the dogs, and whether levels of experienced health and behavior problems were associated with the quality of the owner-dog relationship and the intention to re-procure a dog of the same breed. Owners of each of the four breeds (750/breed) were randomly drawn from a nationwide Danish dog registry and invited to participate. Of these, 911 responded, giving a final sample of 846. There were clear differences between owners of the four breeds with respect to degree of planning prior to purchase, with owners of Chihuahuas exhibiting less. Motivations behind choice of dog were also different. Health and other breed attributes were more important to owners of Cairn Terriers, whereas the dog's personality was reported to be more important for owners of French Bulldogs and Cavalier King Charles Spaniels but less important for Chihuahua owners. Higher levels of health and behavior problems were positively associated with a closer owner-dog relationship for owners of Cavalier King Charles Spaniels and Chihuahuas but, for owners of French Bulldogs, high levels of problems were negatively associated with an intention to procure the same breed again. In light of these findings, it appears less paradoxical that people continue to buy dogs with welfare problems.

  15. Food in health security in North East Asia.

    PubMed

    Moon, Hyun-Kyung

    2009-01-01

    Food and health security in North East Asia, including South Korea, North Korea, China and Japan, was compared. Because this region contains countries with many complex problems, it is worthwhile to study the current situation. With about 24% of the world's population, all North East Asian countries supply between 2400 and 3000 kcal of energy. Regarding health status, two extreme problems exist: malnutrition in North Korea and China, and chronic degenerative disease in Japan, South Korea and China. Because the quality, quantity and safety of the food supply have to be secured for health security, some topics are selected and discussed. 1) World food prices can affect food security for countries with a low food self-sufficiency rate, such as Japan and Korea, especially for the urban poor. 2) Population aging can increase the number of aged people without food security. An aged population with less income and no support from their offspring, because of disappearing traditional values, may have food insecurity. 3) Population growth and economic growth in this region may worsen food problems. Since a quarter of the world's population resides in this region, populations will continue to increase, and with economic growth people will consume more animal products. 4) Climate change generates food production problems. As industry progresses, there will be less land for food and more pollutants in the environment. 5) Political instability will cause food insecurity, and conflict will cause problems with regard to food aid.

  16. A Risk-Constrained Multi-Stage Decision Making Approach to the Architectural Analysis of Mars Missions

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki; Pavone, Marco; Balaram, J. (Bob)

    2012-01-01

    This paper presents a novel risk-constrained multi-stage decision-making approach to the architectural analysis of planetary rover missions. In particular, focusing on a 2018 Mars rover concept, which was considered as part of a potential Mars Sample Return campaign, we model the entry, descent, and landing (EDL) phase and the rover traverse phase as four sequential decision-making stages. The problem is to find a sequence of divert and driving maneuvers such that the rover drive distance is minimized and the probability of a mission failure (e.g., due to a failed landing) is below a user-specified bound. By solving this problem for several different values of the model parameters (e.g., divert authority), this approach enables rigorous, accurate and systematic trade-offs for the EDL system vs. the mobility system and, more generally, cross-domain trade-offs for the different phases of a space mission. The overall optimization problem can be seen as a chance-constrained dynamic programming problem, with the additional complexity that 1) in some stages the disturbances do not have any probabilistic characterization, and 2) the state space is extremely large (i.e., hundreds of millions of states for trade-offs with high-resolution Martian maps). To this end, we solve the problem by performing an unconventional combination of average and minimax cost analysis and by leveraging highly efficient computational tools from the image processing community. Preliminary trade-off results are presented.
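
    A drastically simplified sketch of the chance-constrained trade-off (every probability, distance, and option name below is made up for illustration; the real problem has hundreds of millions of states): enumerate EDL/traverse combinations, keep those whose total failure probability respects the bound, and minimize drive distance.

```python
from itertools import product

# Hypothetical options: (failure probability, resulting drive distance in km).
edl_options = {"no_divert": (0.020, 12.0),
               "small_divert": (0.030, 6.0),
               "large_divert": (0.045, 2.0)}
traverse_options = {"direct": (0.010, 0.0),
                    "cautious": (0.002, 3.0)}

def best_plan(p_fail_max):
    """Minimum-drive plan whose total failure probability stays below the bound."""
    best = None
    for (e, (pe, de)), (t, (pt, dt)) in product(edl_options.items(),
                                                traverse_options.items()):
        p_fail = 1 - (1 - pe) * (1 - pt)   # assume independent failure modes
        drive = de + dt
        if p_fail <= p_fail_max and (best is None or drive < best[0]):
            best = (drive, e, t, p_fail)
    return best

plan_tight = best_plan(0.035)   # strict risk bound forces a conservative plan
plan_loose = best_plan(0.060)   # relaxed bound admits the aggressive divert
```

    Tightening the risk bound changes which plan is optimal, which is exactly the EDL-versus-mobility trade-off the paper quantifies at scale with dynamic programming rather than brute-force enumeration.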

  17. Do causes of stress differ in their association with problem drinking by sex in Korean adolescents?

    PubMed

    Choi, Jae-Woo; Park, Eun-Cheol; Kim, Jae-Hyun; Park, So-Hee

    2017-01-01

    Previous studies have focused mainly on whether stress causes current or excessive drinking. However, few studies have examined the relationship between stress and problem drinking in adolescents. The objective of this study was to examine the stress levels and causes of stress related to problem drinking behavior according to sex among Korean youth. Data for this study were pooled from cross-sectional data collected annually from 2007 through 2012 by the Korea Youth Risk Behavior Web-based Survey. A representative sample of 442,113 students from 800 randomly selected middle and high schools in Korea was included. Multiple logistic regression models were used in the analysis. Both male and female students with extremely high stress were more likely to engage in problem drinking than were students with no stress (odds ratios [OR], 1.73 in males and 1.41 in females). The major causes of stress in male students that were associated with problem drinking were conflict with a teacher, trouble with parents, and peer relationships (ORs, 2.47, 1.72, and 1.71, respectively), whereas there was no statistically significant association between causes of stress and problem drinking among female students. Considering stress level, male students with extremely high stress were associated with problem drinking regardless of the cause of stress, while female students who felt extremely high levels of stress were more likely to engage in problem drinking due to stress from a conflict with parents, peer relationships, appearance, and financial difficulty (ORs, 1.53, 1.53, 1.46, and 1.47, respectively). Adolescents who engage in problem drinking may be affected by different causes of stress according to sex. Thus, appropriate approaches that reflect sex differences will be helpful in alleviating problem drinking in adolescents, and educational authorities need to arrange more effective drinking education programs, given the positive associations between drinking education and problem drinking. Copyright © 2016. Published by Elsevier Ltd.
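
    The odds ratios quoted above come from logistic regression; the underlying quantity and its confidence interval can be illustrated from a 2x2 table (the counts below are hypothetical, not the survey's data):

```python
import math

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio with a 95% confidence interval (Woolf's log-OR method)."""
    or_ = (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)
    # Standard error of log(OR) is the root of the sum of reciprocal counts.
    se_log = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                       + 1 / unexposed_cases + 1 / unexposed_noncases)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical counts: problem drinking among high-stress vs no-stress students.
or_, ci = odds_ratio(120, 880, 75, 925)   # → OR about 1.68
```

    In the paper the ORs are additionally adjusted for covariates via multiple logistic regression, where each adjusted OR is the exponential of the corresponding regression coefficient.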

  18. Reinforcement Learning in a Nonstationary Environment: The El Farol Problem

    NASA Technical Reports Server (NTRS)

    Bell, Ann Maria

    1999-01-01

    This paper examines the performance of simple learning rules in a complex adaptive system based on a coordination problem modeled on the El Farol problem. The key features of the El Farol problem are that it typically involves a medium number of agents and that agents' pay-off functions have a discontinuous response to increased congestion. First we consider a single adaptive agent facing a stationary environment. We demonstrate that the simple learning rules proposed by Roth and Erev can be extremely sensitive to small changes in the initial conditions and that events early in a simulation can affect the performance of the rule over a relatively long time horizon. In contrast, a reinforcement learning rule based on standard practice in the computer science literature converges rapidly and robustly. The situation is reversed when multiple adaptive agents interact: the RE algorithms often converge rapidly to a stable average aggregate attendance despite the slow and erratic behavior of individual learners, while the CS-based learners frequently over-attend in the early and intermediate terms. The symmetric mixed-strategy equilibrium is unstable: all three learning rules ultimately tend towards pure strategies or stabilize in the medium term at non-equilibrium probabilities of attendance. The brittleness of the algorithms in different contexts emphasizes the importance of thorough and thoughtful examination of simulation-based results.
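
    A minimal sketch of Roth-Erev-style reinforcement in an El Farol setting (the payoffs, capacity, and agent count below are illustrative; the paper's exact specification may differ): each agent chooses "go" or "stay" with probability proportional to accumulated propensities and reinforces the chosen action with its realized payoff.

```python
import random

random.seed(0)

N, CAPACITY, ROUNDS = 100, 60, 2000

# One propensity pair per agent: index 0 = "go", index 1 = "stay".
propensity = [[1.0, 1.0] for _ in range(N)]

attendance_late = []
for t in range(ROUNDS):
    choices = []
    for p in propensity:
        total = p[0] + p[1]
        choices.append(0 if random.random() < p[0] / total else 1)
    attendance = choices.count(0)
    for p, c in zip(propensity, choices):
        if c == 0:   # went to the bar: pays off only if uncongested
            payoff = 1.0 if attendance <= CAPACITY else 0.0
        else:        # stayed home: modest certain payoff
            payoff = 0.5
        p[c] += payoff          # Roth-Erev update: reinforce the chosen action
    if t >= ROUNDS - 200:
        attendance_late.append(attendance)

mean_attendance = sum(attendance_late) / len(attendance_late)
```

    The congestion payoff is self-correcting at the aggregate level: under-attendance rewards going and over-attendance rewards staying, so average attendance tends to hover near capacity even while individual agents drift toward pure strategies.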

  19. Net Force of an Ideal Conductor on an Element of a Line of Charge Moving With Extreme Relativistic Speed

    ERIC Educational Resources Information Center

    Cawley, Robert

    1978-01-01

    Considers the problem of determining the force on an element of a finite length line of charge moving horizontally with extreme relativistic speed through an evacuated space above an infinite plane ideal conducting surface. (SL)

  20. Fighting Illiteracy in the Arab World

    ERIC Educational Resources Information Center

    Hammud, Muwafaq Abu; Jarrar, Amani G.

    2017-01-01

    Fighting illiteracy in the Arab world is becoming an urgent necessity, particularly in the face of problems of poverty, ignorance, and extremism, which impede the required economic, social, political, and cultural development processes. Extremism, violence, and terrorism in the Arab world can only be eliminated by spreading knowledge and fighting illiteracy. The study…

  1. Complex-Spectrum Magnetic Environment enhances and/or modifies Bioeffects of Hypokinetic Stress Condition: an Animal Study

    NASA Astrophysics Data System (ADS)

    Temuriantz, N. A.; Martinyuk, V. S.; Ptitsyna, N. G.; Villoresi, G.; Iucci, N.; Tyasto, M. I.; Dorman, L. I.

    During recent decades it has been shown by many authors that ultra-low and extremely low frequency electric and magnetic fields (ULF, 0-10 Hz; ELF, 10-1000 Hz) may produce biological effects and consequently may be a possible source of health problems. Spaceflight electric and magnetic environments are characterized by a complex combination of static and time-varying components in the ULF-ELF range and by high variability. The objective of this study was to investigate the possible influence of such magnetic fields on rats, in order to understand the pathway regarding the functional state of the cardiovascular system. A magnetic field (MF) pattern with variable complex spectra in the 0-150 Hz frequency range was simulated using 3-axial Helmholtz coils and special computer-based equipment. The effect of the real-world MF exposure on rats was also tested in combination with a hypokinetic stress condition, which is typical for spaceflights. It was revealed that a variable complex-spectrum MF acts as a weak or moderate stress-like factor which amplifies and/or modifies the functional shifts caused by other stress factors. The value and direction of the functional shifts caused by MF exposure significantly depend on gender, individual-typological (constitutional) features, and also on the physiological state (norm, stress) of the organism. Our results support the idea that variable complex-spectrum MF action involves sympathetic activation, overload in cholesterol transport in blood, and also secretory activation of tissue basophils (mast cells) that can influence the regional haemodynamics. These…

  2. Some Problems of Extremes in Geometry and Construction

    ERIC Educational Resources Information Center

    Yanovsky, Levi

    2008-01-01

    Two original problems in geometry are presented with solutions utilizing differential calculus: (a) a rectangle inscribed in a sector; (b) a point on the ray of an angle. The possibility of applying mathematics in general, and differential calculus in particular, to the solution of practical problems is discussed. (Contains 8 figures.)

  3. Dialogue-Based Research in Man-Machine Communication

    DTIC Science & Technology

    1975-11-01

    This paper first surveys current knowledge of human communication from a point of view which seeks to find or develop knowledge that will be useful...complexity is explored. Building a useful knowledge of human communication is an extremely complex task. Controlling this complexity and its effects, without

  4. Statistical complexity without explicit reference to underlying probabilities

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2018-06-01

    We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To this end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  5. Complex Fluids and Hydraulic Fracturing.

    PubMed

    Barbati, Alexander C; Desroches, Jean; Robisson, Agathe; McKinley, Gareth H

    2016-06-07

    Nearly 70 years old, hydraulic fracturing is a core technique for stimulating hydrocarbon production in a majority of oil and gas reservoirs. Complex fluids are implemented in nearly every step of the fracturing process, most significantly to generate and sustain fractures and transport and distribute proppant particles during and following fluid injection. An extremely wide range of complex fluids are used: naturally occurring polysaccharide and synthetic polymer solutions, aqueous physical and chemical gels, organic gels, micellar surfactant solutions, emulsions, and foams. These fluids are loaded over a wide range of concentrations with particles of varying sizes and aspect ratios and are subjected to extreme mechanical and environmental conditions. We describe the settings of hydraulic fracturing (framed by geology), fracturing mechanics and physics, and the critical role that non-Newtonian fluid dynamics and complex fluids play in the hydraulic fracturing process.

  6. Multivariate non-normally distributed random variables in climate research - introduction to the copula approach

    NASA Astrophysics Data System (ADS)

    Schölzel, C.; Friederichs, P.

    2008-10-01

    Probability distributions of multivariate random variables are generally more complex compared to their univariate counterparts, which is due to a possible nonlinear dependence between the random variables. One approach to this problem is the use of copulas, which have become popular over recent years, especially in fields like econometrics, finance, risk management, or insurance. Since this newly emerging field includes various practices, a controversial discussion, and a vast literature, it is difficult to get an overview. The aim of this paper is therefore to provide a brief overview of copulas for application in meteorology and climate research. We examine the advantages and disadvantages compared to alternative approaches such as mixture models, summarize the current problem of goodness-of-fit (GOF) tests for copulas, and discuss the connection with multivariate extremes. An application to station data shows the simplicity and the capabilities as well as the limitations of this approach. Observations of daily precipitation and temperature are fitted to a bivariate model and demonstrate that copulas are a valuable complement to the commonly used methods.
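
    The bivariate construction described above can be sketched with a Gaussian copula: sample correlated normals, map them to uniforms with the normal CDF, then apply each marginal's quantile function. A self-contained illustration with exponential marginals (the choice of marginals and of the correlation parameter is ours, not the paper's):

```python
import math, random

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def gaussian_copula_sample(rho, inv_cdf_x, inv_cdf_y, rng):
    """One draw (x, y) whose dependence is a Gaussian copula with
    correlation rho and whose marginals are given by the inverse CDFs."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    u, v = phi(z1), phi(z2)
    return inv_cdf_x(u), inv_cdf_y(v)

# Exponential marginals (rate 1) standing in for, e.g., daily precipitation.
inv_exp = lambda u: -math.log(1.0 - u)
rng = random.Random(42)
sample = [gaussian_copula_sample(0.7, inv_exp, inv_exp, rng)
          for _ in range(5000)]
```

    Swapping `inv_exp` for another quantile function changes the marginals without touching the dependence structure, which is the practical appeal of the copula decomposition.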

  7. Geomagnetic Navigation of Autonomous Underwater Vehicle Based on Multi-objective Evolutionary Algorithm.

    PubMed

    Li, Hong; Liu, Mingyong; Zhang, Feihu

    2017-01-01

    This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is obtained without using a priori information, simply by magnetotaxis searching. However, geomagnetic anomalies have a significant influence on the geomagnetic navigation system, as they disrupt the distribution of the geomagnetic field. An extreme-value region may easily appear in abnormal regions, which can leave the AUV lost in the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is formulated as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state-of-the-art, the proposed approach effectively overcomes the disturbance of the geomagnetic anomaly. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments.

  8. Geomagnetic Navigation of Autonomous Underwater Vehicle Based on Multi-objective Evolutionary Algorithm

    PubMed Central

    Li, Hong; Liu, Mingyong; Zhang, Feihu

    2017-01-01

    This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is obtained without using a priori information, simply by magnetotaxis searching. However, geomagnetic anomalies have a significant influence on the geomagnetic navigation system, as they disrupt the distribution of the geomagnetic field. An extreme-value region may easily appear in abnormal regions, which can leave the AUV lost in the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is formulated as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state-of-the-art, the proposed approach effectively overcomes the disturbance of the geomagnetic anomaly. The simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments. PMID:28747884
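
    The four steps enumerated in the abstract (search as optimization, anomaly monitoring, behavior constraint, termination) can be caricatured in a one-dimensional toy loop. Everything below — the synthetic field, the anomaly threshold, the jump size — is our own illustrative assumption, not the authors' implementation:

```python
import random

def magnetotaxis_search(field, target, x0=0.0, step=0.5,
                        anomaly_jump=5.0, max_iter=500, seed=0):
    """Toy gradient-free magnetotaxis search: step toward whichever
    neighbor brings the measured field closer to the target value.
    Monitoring operator: an abrupt local change in the field flags an
    anomaly; behavior constraint operator: jump randomly to leave it."""
    rng = random.Random(seed)
    x = x0
    for _ in range(max_iter):
        if abs(field(x) - target) < 1e-3:      # termination condition
            break
        left, right = field(x - step), field(x + step)
        if abs(right - left) > anomaly_jump:   # monitoring operator
            x += rng.uniform(-10.0, 10.0)      # constraint operator
            continue
        x = x - step if abs(left - target) < abs(right - target) else x + step
    return x

# Smooth synthetic field; the vehicle seeks the location where the
# field equals 3.0 (i.e., x = 30 for this field).
smooth = lambda x: 0.1 * x
x_final = magnetotaxis_search(smooth, target=3.0)
```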

  9. Monitoring challenges and innovative ideas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Neill, R.V.; Hunsaker, C.T.; Levine, D.A.

    1990-01-01

    Monitoring programs are difficult to design even when they focus on specific problems. Ecosystems are complex, and it is often impossible to predetermine what aspects of system structure or dynamics will respond to a specific insult. It is equally difficult to interpret whether a response is a stabilizing compensatory mechanism or a real loss of capacity to maintain the ecosystem. The problems are compounded in a broad monitoring program designed to assess "ecosystem health" at regional and continental scales. It is challenging in the extreme to monitor ecosystem response, at any scale, to past insults as well as an unknown future array of impacts. The present paper will examine some of the fundamental issues and challenges raised by large-scale monitoring efforts. The challenges will serve as a framework and as an excuse to discuss several important topics in more detail. Following the discussion of challenges, we suggest some basic innovations that could be important across a range of monitoring programs. The innovations include integrative measures, innovative methodology, and creative interpretation. 59 refs., 1 tab.

  10. Determining the mechanical constitutive properties of metals as a function of strain rate and temperature: A combined experimental and modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    I. M. Robertson; A. Beaudoin; J. Lambros

    2004-01-05

    OAK-135 Development and validation of constitutive models for polycrystalline materials subjected to high strain rate loading over a range of temperatures are needed to predict the response of engineering materials to in-service type conditions (foreign object damage, high-strain rate forging, high-speed sheet forming, deformation behavior during forming, response to extreme conditions, etc.). Accounting accurately for the complex effects that can occur during extreme and variable loading conditions requires significant and detailed computational and modeling efforts. These efforts must be closely coupled with precise and targeted experimental measurements that not only verify the predictions of the models, but also provide input about the fundamental processes responsible for the macroscopic response. Achieving this coupling between modeling and experimentation is the guiding principle of this program. Specifically, this program seeks to bridge the length scale between discrete dislocation interactions with grain boundaries and continuum models for polycrystalline plasticity. Achieving this goal requires incorporating these complex dislocation-interface interactions into the well-defined behavior of single crystals. Despite the widespread study of metal plasticity, this aspect is not well understood for simple loading conditions, let alone extreme ones. Our experimental approach includes determining the high-strain rate response as a function of strain and temperature with post-mortem characterization of the microstructure, quasi-static testing of pre-deformed material, and direct observation of the dislocation behavior during reloading by using the in situ transmission electron microscope deformation technique. These experiments will provide the basis for development and validation of physically-based constitutive models, which will include dislocation-grain boundary interactions for polycrystalline systems. 
One aspect of the program will involve the direct observation of specific mechanisms of micro-plasticity, as these will indicate the boundary value problem that should be addressed. This focus on the pre-yield region in the quasi-static effort (the elasto-plastic transition) is also a tractable one from an experimental and modeling viewpoint. In addition, our approach will minimize the need to fit model parameters to experimental data to obtain convergence. These are critical steps to reach the primary objective of simulating and modeling material performance under extreme loading conditions. In this annual report, we describe the progress made in the first year of this program.

  11. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally-intensive software techniques such as those used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety at design time, like in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus help also reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  12. Extremophilic and extremotolerant actinomycetes in different soil types

    NASA Astrophysics Data System (ADS)

    Zenova, G. M.; Manucharova, N. A.; Zvyagintsev, D. G.

    2011-04-01

    Problems of the resistance of soil actinomycetes to various environmental factors (pH, salinity, temperature, and moisture) are discussed. Actinomycetes as a special group of prokaryotes were revealed to have a greater range of tolerance to these factors than was thought earlier. The regularities of the distribution of extremophilic and extremotolerant actinomycetes developing in conditions unusual for mycelial bacteria, their structural-functional characteristics, and their taxonomic composition were determined. The predominance of acidophilic representatives of the Micromonospora genus in acid soils (typical peat, soddy-podzolic, and taiga podzol) and of the haloalkaliphilic Streptomyces pluricilirescens and S. prunicolor species in desert saline soils is shown. The specific features of the actinomycete complexes on thermal fields of the weakly developed stratified volcanic soils are described. In these complexes, the thermophilic forms were represented only by species of the Micromonospora genus; and the mesophilic forms, by Microbispora species. In the periodically heated desert soils, among the thermophilic actinomycetes, representatives of the rare Actinomadura, Saccharopolyspora, and Streptosporangium genera along with Streptomyces species were indicated. The mechanisms of the resistance of the actinomycetes to the extreme environmental conditions are discussed.

  13. Violence Against Children in Afghanistan: Community Perspectives.

    PubMed

    Cameron, Cate M; O'Leary, Patrick J; Lakhani, Ali; Osborne, Jodie M; de Souza, Luana; Hope, Kristen; Naimi, Mohammad S; Khan, Hassan; Jawad, Qazi S; Majidi, Sabir

    2018-03-01

    Violence against children (VAC) is a significant international problem and, in Afghanistan, is particularly complex given the country has suffered armed conflict and extreme poverty for more than 30 years. The aim of this study was to examine the level of knowledge and observation of VAC by community leaders, professional groups, and business owners in three Afghan districts. A survey of community and religious leaders; health, socio-legal, and education professionals; and business owners from Kabul, Jalalabad, and Torkham (n = 182) was conducted. Structured interviews included qualitative and quantitative components. Questions related to knowledge and experience of VAC, and to perceptions of consequences, causes, and strategies for preventing VAC. The statistical significance of differences between participant groups and measures of association were assessed by Pearson's chi-square test, the Mann-Whitney test, and the Kruskal-Wallis one-way ANOVA. Qualitative responses were analyzed thematically. VAC was reported to occur mostly in the home, community, and workplace. The scale of the problem varied, with religious and community leaders underreporting VAC by 30% to 40% compared with other participant groups (p < .001). Business owners also significantly underreported VAC in the workplace, despite admitting to acts of discipline that included physical contact. There were some regional differences, with lower reporting of violence in Jalalabad compared with the two other locations (p < .001). Causes of VAC were consistently attributed to poverty, lack of education, and the effects of war. The findings of this study indicate that VAC is a serious and complex problem in Afghanistan. Decades of armed conflict and entrenched poverty influence how violence is perceived and recognized. 
Consideration should be given to initiatives that build on the existing strengths within the community while raising awareness and recognition of the nature, extent, and burden of VAC in the community.

  14. Accurate detection of hierarchical communities in complex networks based on nonlinear dynamical evolution

    NASA Astrophysics Data System (ADS)

    Zhuo, Zhao; Cai, Shi-Min; Tang, Ming; Lai, Ying-Cheng

    2018-04-01

    One of the most challenging problems in network science is to accurately detect communities at distinct hierarchical scales. Most existing methods are based on structural analysis and manipulation, which are NP-hard. We articulate an alternative, dynamical evolution-based approach to the problem. The basic principle is to computationally implement a nonlinear dynamical process on all nodes in the network with a general coupling scheme, creating a networked dynamical system. Under a proper system setting and with an adjustable control parameter, the community structure of the network would "come out" or emerge naturally from the dynamical evolution of the system. As the control parameter is systematically varied, the community hierarchies at different scales can be revealed. As a concrete example of this general principle, we exploit clustered synchronization as a dynamical mechanism through which the hierarchical community structure can be uncovered. In particular, for quite arbitrary choices of the nonlinear nodal dynamics and coupling scheme, decreasing the coupling parameter from the global synchronization regime, in which the dynamical states of all nodes are perfectly synchronized, can lead to a weaker type of synchronization organized as clusters. We demonstrate the existence of optimal choices of the coupling parameter for which the synchronization clusters encode accurate information about the hierarchical community structure of the network. We test and validate our method using a standard class of benchmark modular networks with two distinct hierarchies of communities and a number of empirical networks arising from the real world. Our method is computationally extremely efficient, eliminating completely the NP-hard difficulty associated with previous methods. 
The basic principle of exploiting dynamical evolution to uncover hidden community organizations at different scales represents a "game-change" type of approach to addressing the problem of community detection in complex networks.
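
    The clustered-synchronization principle described above can be illustrated with a toy simulation (our own minimal sketch, not the authors' implementation): identical Kuramoto-type phase oscillators on a two-community graph synchronize strongly within each community, and sweeping the coupling parameter downward from the global-synchronization regime is what reveals the hierarchy.

```python
import math, random

def simulate_kuramoto(adj, coupling, steps=2000, dt=0.05, seed=1):
    """Euler integration of identical Kuramoto phase oscillators
    coupled through the weighted adjacency matrix `adj`."""
    rng = random.Random(seed)
    n = len(adj)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        dtheta = [coupling * sum(adj[i][j] * math.sin(theta[j] - theta[i])
                                 for j in range(n)) for i in range(n)]
        theta = [theta[i] + dt * dtheta[i] for i in range(n)]
    return theta

def coherence(theta, nodes):
    """Kuramoto order parameter r of a node subset (1 = fully synchronized)."""
    c = sum(math.cos(theta[i]) for i in nodes)
    s = sum(math.sin(theta[i]) for i in nodes)
    return math.hypot(c, s) / len(nodes)

# Two dense 5-node communities joined by a single weak link.
n = 10
adj = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        if i != j and (i < 5) == (j < 5):
            adj[i][j] = 1.0
adj[0][5] = adj[5][0] = 0.1

theta = simulate_kuramoto(adj, coupling=0.5)
```

    Reading off the synchronization clusters (here, the two 5-node blocks) from the final phases is the dynamical analogue of community detection; no combinatorial search over partitions is involved.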

  15. Actionable Science Lessons Emerging from the Department of Interior Climate Science Center Network

    NASA Astrophysics Data System (ADS)

    McMahon, G.; Meadow, A. M.; Mikels-Carrasco, J.

    2015-12-01

    The DOI Advisory Committee on Climate Change and Natural Resource Science (ACCCNRS) has recommended that co-production of actionable science be the core programmatic focus of the Climate Science Center enterprise. Efforts by the Southeast Climate Science Center suggest that the complexity of many climate adaptation decision problems (many stakeholders that can influence implementation of a decision; the problems that can be viewed at many scales in space and time; dynamic objectives with competing values; complex, non-linear systems) complicates development of research-based information that scientists and non-scientists view as comprehensible, trustworthy, legitimate, and accurate. Going forward, organizers of actionable science efforts should consider inclusion of a broad set of stakeholders, beyond formal decisionmakers, and ensure that sufficient resources are available to explore the interests and values of this broader group. Co-produced research endeavors should foster agency and collaboration across a wide range of stakeholders. We recognize that stakeholder agency may be constrained by scientific or political power structures that limit the ability to initiate discussion, make claims, and call things into question. Co-production efforts may need to be preceded by more descriptive assessments that summarize existing climate science in ways that stakeholders can understand and link with their concerns. Such efforts can build rapport and trust among scientists and non-scientists, and may help stakeholders and scientists alike to frame adaptation decision problems amenable to a co-production effort. Finally, university and government researchers operate within an evaluation structure that rewards researcher-driven science that, at the extreme, "throws information over the fence" in the hope that information users will make better decisions. 
Research evaluation processes must reward more consultative, collaborative, and collegial research approaches if researchers are to widely adopt co-production methods.

  16. Application of complex discrete wavelet transform in classification of Doppler signals using complex-valued artificial neural network.

    PubMed

    Ceylan, Murat; Ceylan, Rahime; Ozbay, Yüksel; Kara, Sadik

    2008-09-01

    In biomedical signal classification, due to the huge amount of data, compressing the biomedical waveform data is vital. This paper presents two different structures formed using feature extraction algorithms to decrease the size of the feature set in training and test data. The proposed structures, named wavelet transform-complex-valued artificial neural network (WT-CVANN) and complex wavelet transform-complex-valued artificial neural network (CWT-CVANN), use the real and complex discrete wavelet transforms for feature extraction. The aim of using the wavelet transform is to compress data and to reduce the training time of the network without decreasing the accuracy rate. In this study, the presented structures were applied to the problem of classification of carotid arterial Doppler ultrasound signals. Carotid arterial Doppler ultrasound signals were acquired from the left carotid arteries of 38 patients and 40 healthy volunteers. The patient group included 22 males and 16 females with an established diagnosis of the early phase of atherosclerosis through coronary or aortofemoropopliteal (lower extremity) angiographies (mean age, 59 years; range, 48-72 years). Healthy volunteers were young non-smokers who did not appear to bear any risk of atherosclerosis, including 28 males and 12 females (mean age, 23 years; range, 19-27 years). Sensitivity, specificity, and average detection rate were calculated for comparison after the training and test phases of all structures were completed. These parameters demonstrated that the training times of the CVANN and the real-valued artificial neural network (RVANN) were reduced using feature extraction algorithms without decreasing the accuracy rate, in accordance with our aim.
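
    The published structures use complex wavelets and complex-valued networks; as a minimal illustration of the compression step only (a real one-level Haar DWT, our simplification, not the paper's transform), note how the feature vector halves in length while remaining exactly invertible:

```python
import math

def haar_level(signal):
    """One level of the Haar DWT: pairwise averages (approximation
    coefficients) and pairwise differences (detail coefficients),
    scaled by 1/sqrt(2) so that energy is preserved."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_level."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

# An 8-sample toy feature vector compresses to 4 approximation
# coefficients; feeding only `approx` to a classifier halves its input size.
x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
approx, detail = haar_level(x)
```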

  17. Improved 3D live-wire method with application to 3D CT chest image analysis

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Higgins, William E.

    2006-03-01

    The definition of regions of interest (ROIs), such as suspect cancer nodules or lymph nodes in 3D CT chest images, is often difficult because of the complexity of the phenomena that give rise to them. Manual slice tracing has been used widely for years for such problems, because it is easy to implement and guaranteed to work. But the manual method is extremely time-consuming, especially for high-resolution 3D images which may have hundreds of slices, and it is subject to operator biases. Numerous automated image-segmentation methods have been proposed, but they are generally strongly application dependent, and even the "most robust" methods have difficulty in defining complex anatomical ROIs. To address this problem, the semi-automatic interactive paradigm referred to as "live wire" segmentation has been proposed by researchers. In live-wire segmentation, the human operator interactively defines an ROI's boundary guided by an active automated method which suggests what to define. This process in general is far faster, more reproducible, and more accurate than manual tracing, while at the same time permitting the definition of complex ROIs having ill-defined boundaries. We propose a 2D live-wire method employing an improved cost function over previous works. In addition, we define a new 3D live-wire formulation that enables rapid definition of 3D ROIs. The method only requires the human operator to consider a few slices in general. Experimental results indicate that the new 2D and 3D live-wire approaches are efficient, allow for high reproducibility, and are reliable for 2D and 3D object segmentation.
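
    At the core of 2D live-wire is a minimum-cost path computed over a local edge-cost map, typically with Dijkstra's algorithm: the operator fixes a seed pixel and the "wire" snaps to the cheapest boundary path to the cursor. A minimal sketch (the cost map below is a toy stand-in, not the improved cost function the paper proposes):

```python
import heapq

def live_wire_path(cost, seed, target):
    """Minimum-cost 4-connected path over a pixel grid from seed to
    target via Dijkstra's algorithm; cost[r][c] is the local edge cost
    of entering pixel (r, c)."""
    rows, cols = len(cost), len(cost[0])
    dist = {seed: 0.0}
    prev = {}
    pq = [(0.0, seed)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == target:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk back from target to seed to recover the wire.
    path, node = [], target
    while node != seed:
        path.append(node)
        node = prev[node]
    path.append(seed)
    return path[::-1]

# Low cost along the top row mimics a strong image edge.
grid = [[1, 1, 1, 1],
        [9, 9, 9, 1],
        [9, 9, 9, 1]]
path = live_wire_path(grid, (0, 0), (0, 3))
```

    In an interactive setting the seed stays fixed while the target follows the cursor, so the wire updates in real time along the strongest nearby edge.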

  18. Using integrated research and interdisciplinary science: Potential benefits and challenges to managers of parks and protected areas

    USGS Publications Warehouse

    van Riper, Charles; Powell, Robert B.; Machlis, Gary; van Wagtendonk, Jan W.; van Riper, Carena J.; von Ruschkowski, Eick; Schwarzbach, Steven E.; Galipeau, Russell E.

    2012-01-01

    Our purpose in this paper is to build a case for utilizing interdisciplinary science to enhance the management of parks and protected areas. We suggest that interdisciplinary science is necessary for dealing with the complex issues of contemporary resource management, and that using the best available integrated scientific information be embraced and supported at all levels of agencies that manage parks and protected areas. It will take the commitment of park managers, scientists, and agency leaders to achieve the goal of implementing the results of interdisciplinary science into park management. Although such calls go back at least several decades, today interdisciplinary science is sporadically being promoted as necessary for supporting effective protected area management (e.g., Machlis et al. 1981; Kelleher and Kenchington 1991). Despite this history, rarely has "interdisciplinary science" been defined, its importance explained, or guidance provided on how to translate and then implement the associated research results into management actions (Tress et al. 2006; Margles et al. 2010). With the extremely complex issues that now confront protected areas (e.g., climate change influences, extinctions and loss of biodiversity, human and wildlife demographic changes, and unprecedented human population growth) information from more than one scientific discipline will need to be brought to bear in order to achieve sustained management solutions that resonate with stakeholders (Ostrom 2009). Although interdisciplinary science is not the solution to all problems, we argue that interdisciplinary research is an evolving and widely supported best practice. In the case of park and protected area management, interdisciplinary science is being driven by the increasing recognition of the complexity and interconnectedness of human and natural systems, and the notion that addressing many problems can be more rapidly advanced through interdisciplinary study and analysis.

  19. Tribal Colleges: The Original Extreme Makeover Experts

    ERIC Educational Resources Information Center

    Powless, Donna

    2015-01-01

    In this article, the author states "our experience with education is a prime example in proving we are experts at problem-solving and are the originators of the extreme makeover." Educational institutions were introduced to the Native people in an outrageous manner--often as a mask for assimilating American Indians, routinely resulting…

  20. Modeling Hydrodynamics on the Wave Group Scale in Topographically Complex Reef Environments

    NASA Astrophysics Data System (ADS)

    Reyns, J.; Becker, J. M.; Merrifield, M. A.; Roelvink, J. A.

    2016-02-01

    The knowledge of the characteristics of waves and the associated wave-driven currents is important for sediment transport and morphodynamics, nutrient dynamics and larval dispersion within coral reef ecosystems. Reef-lined coasts differ from sandy beaches in that they have a steep offshore slope, that the non-sandy bottom topography is very rough, and that the distance between the point of maximum short wave dissipation and the actual coastline is usually large. At this short wave breakpoint, long waves are released, and these infragravity (IG) scale motions account for the bulk of the water level variance on the reef flat, the lagoon and eventually, the sandy beaches fronting the coast through run-up. These IG energy dominated water level motions are reinforced during extreme events such as cyclones or swells through larger incident band wave heights and low frequency wave resonance on the reef. Recently, a number of hydro(-morpho)dynamic models that have the capability to model these IG waves have successfully been applied to morphologically differing reef environments. One of these models is the XBeach model, which is curvilinear in nature. This poses serious problems when trying to model an entire atoll for example, as it is extremely difficult to build curvilinear grids that are optimal for the simulation of hydrodynamic processes, while maintaining the topology in the grid. One solution to remediate this problem of grid connectivity is the use of unstructured grids. We present an implementation of the wave action balance on the wave group scale with feedback to the flow momentum balance, which is the foundation of XBeach, within the framework of the unstructured Delft3D Flexible Mesh model. The model can be run in stationary as well as in instationary mode, and it can be forced by regular waves, time series or wave spectra. 
    We show, by comparison with field data, that the code is capable of modeling wave-generated flow at a number of topographically complex reef sites and for a number of different forcing conditions.
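    The kind of balance such wave-group-resolving models march over a reef profile can be illustrated with a drastically simplified, stationary 1D wave energy balance. This is only a sketch under strong assumptions (shallow-water group speed, depth-limited breaking with a hypothetical breaker index `gamma`), not XBeach's actual formulation:

```python
import math

def stationary_energy_balance(depth, E0, gamma=0.55, alpha=1.0):
    """March a stationary 1D wave energy balance, d(E*cg)/dx = -D, shoreward.

    A depth-limited breaking criterion caps the wave height at H = gamma * h,
    a crude stand-in for the dissipation term used in wave-group-resolving
    models. Parameter names and values here are illustrative only.
    """
    rho, g = 1025.0, 9.81
    E = [E0]
    for i in range(1, len(depth)):
        h = depth[i]
        # Shallow-water group speed as a simple approximation.
        cg_prev = math.sqrt(g * depth[i - 1])
        cg = math.sqrt(g * h)
        # Conserve the energy flux E*cg in the absence of dissipation (shoaling).
        Ei = E[-1] * cg_prev / cg
        # Depth-limited breaking: cap the energy at that of a wave H = gamma * h.
        Emax = rho * g * (gamma * h) ** 2 / 8.0
        E.append(min(Ei, alpha * Emax))
    return E
```

    On a shoaling profile the energy first grows as the group speed drops, then saturates once breaking caps the wave height; the IG motions discussed above are forced by the spatial gradients of exactly such short-wave energy variations.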

  1. Climate change and temperature extremes: A review of heat- and cold-related morbidity and mortality concerns of municipalities.

    PubMed

    Gronlund, Carina J; Sullivan, Kyle P; Kefelegn, Yonathan; Cameron, Lorraine; O'Neill, Marie S

    2018-08-01

    Cold and hot weather are associated with mortality and morbidity. Although the burden of temperature-associated mortality may shift towards high temperatures in the future, cold temperatures may represent a greater current-day problem in temperate cities. Hot and cold temperature vulnerabilities may coincide across several personal and neighborhood characteristics, suggesting opportunities for increasing present and future resilience to extreme temperatures. We present a narrative literature review encompassing the epidemiology of cold- and heat-related mortality and morbidity, related physiologic and environmental mechanisms, and municipal responses to hot and cold weather, illustrated by Detroit, Michigan, USA, a financially burdened city in an economically diverse metropolitan area. The Detroit area experiences sharp increases in mortality and hospitalizations with extreme heat, while cold temperatures are associated with more gradual increases in mortality, with no clear threshold. Interventions such as heating and cooling centers may reduce but not eliminate temperature-associated health problems. Furthermore, direct hemodynamic responses to cold, sudden exertion, poor indoor air quality and respiratory epidemics likely contribute to cold-related mortality. Short- and long-term interventions to enhance energy and housing security and housing quality may reduce temperature-related health problems. Extreme temperatures can increase morbidity and mortality in municipalities like Detroit that experience both extreme heat and prolonged cold seasons amidst large socioeconomic disparities. The similarities in physiologic and built-environment vulnerabilities to both hot and cold weather suggest prioritization of strategies that address both present-day cold and near-future heat concerns. Copyright © 2018. Published by Elsevier B.V.

  2. Efficient Statistically Accurate Algorithms for the Fokker-Planck Equation in Large Dimensions

    NASA Astrophysics Data System (ADS)

    Chen, N.; Majda, A.

    2017-12-01

    Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat-tailed, highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience, and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace, obtained via an extremely efficient parametric method, is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. In particular, the parametric method, which is based on an effective data assimilation framework, provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace. Therefore, it is computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore, a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems.
    It is shown in a stringent set of test problems that the method requires only on the order of O(100) ensemble members to successfully recover the highly non-Gaussian transient PDFs in up to 6 dimensions with only small errors.
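    The core of the mixture idea can be sketched in one dimension: each ensemble member contributes one Gaussian, whose mean and variance would in the actual algorithm come from the closed-form conditional statistics of the data assimilation framework; here they are simply supplied as inputs. A minimal, illustrative sketch:

```python
import math

def mixture_pdf(x, means, variances):
    """Evaluate a Gaussian-mixture PDF, p(x) = (1/K) * sum_k N(x; mu_k, var_k).

    Each (mu_k, var_k) pair plays the role of one ensemble member's
    conditional Gaussian; with K members the mixture can capture skewness
    and fat tails that a single Gaussian fit would miss.
    """
    K = len(means)
    total = 0.0
    for mu, var in zip(means, variances):
        total += math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
    return total / K
```

    Because every component is a full analytic density, the mixture integrates to one by construction, regardless of how few members are used.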

  3. Complex Regional Pain Syndrome

    MedlinePlus

    Complex regional pain syndrome (CRPS) is a chronic pain condition. It causes intense pain, usually in the arms, hands, legs, or feet. ... in skin temperature, color, or texture Intense burning pain Extreme skin sensitivity Swelling and stiffness in affected ...

  4. Hemodialysis Dose and Adequacy

    MedlinePlus

    ... a patient's Kt/V is extremely low, the measurement should be repeated, unless a reason for the low Kt/V is obvious. Obvious reasons include treatment interruption, problems with blood or solution flow, and a problem in sampling either the pre- ...

  5. Case management services for work related upper extremity disorders. Integrating workplace accommodation and problem solving.

    PubMed

    Shaw, W S; Feuerstein, M; Lincoln, A E; Miller, V I; Wood, P M

    2001-08-01

    A case manager's ability to obtain worksite accommodations and engage workers in active problem solving may improve health and return-to-work outcomes for clients with work-related upper extremity disorders (WRUEDs). This study examines the feasibility of a 2-day training seminar to help nurse case managers identify ergonomic risk factors, provide accommodation, and conduct problem-solving skills training with workers' compensation claimants recovering from WRUEDs. Eight procedural steps of this case management approach were identified, translated into a training workshop format, and conveyed to 65 randomly selected case managers. Results indicate moderate to high self-ratings of confidence to perform ergonomic assessments (mean = 7.5 of 10) and to provide problem-solving skills training (mean = 7.2 of 10) after the seminar. This training format was suitable for experienced case managers and generated a moderate to high level of confidence to use this case management approach.

  6. Variational Bayesian Learning for Wavelet Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Roussos, E.; Roberts, S.; Daubechies, I.

    2005-11-01

    In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuroscientific goal of extracting relevant "maps" from the data. This can be stated as a "blind" source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.

  7. Droughts and governance impacts on water scarcity: an analysis in the Brazilian semi-arid

    NASA Astrophysics Data System (ADS)

    Silva, A. C. S.; Galvão, C. O.; Silva, G. N. S.

    2015-06-01

    Extreme events are part of climate variability. Dealing with variability is still a challenge, one that might grow due to climate change. However, the impacts of extreme events depend not only on their variability, but also on management and governance. Brazil's semi-arid region has been vulnerable to extreme events, especially droughts, for centuries. Indeed, other Brazilian regions that have mostly been concerned with floods are currently also experiencing droughts. This article evaluates how a combination of climate variability and water governance might affect water scarcity and increase the impacts of extreme events on some regions. For this evaluation, Ostrom's framework for analyzing social-ecological systems (SES) was applied. Ostrom's framework is useful for understanding interactions between resource systems, governance systems, and resource users. This study focuses on social-ecological systems located in a drought-prone region of Brazil. Two extreme events were selected, one in 1997-2000, when Brazil's new water policy was very young, and the other in 2012-2015. The analysis of the SES under Ostrom's principle of "clearly defined boundaries" showed that deficiencies in water management intensify the impacts of droughts on water users. The reasons are related more to water management and governance problems than to drought magnitude or climate change. This problem holds up advances in dealing with extreme events.

  8. Violent Extremism, Community-Based Violence Prevention, and Mental Health Professionals.

    PubMed

    Weine, Stevan M; Stone, Andrew; Saeed, Aliya; Shanfield, Stephen; Beahrs, John; Gutman, Alisa; Mihajlovic, Aida

    2017-01-01

    New community-based initiatives being developed to address violent extremism in the United States are utilizing mental health services and leadership. This article reviews current approaches to preventing violent extremism, the contribution that mental illness and psychosocial problems can make to violent extremism, and the rationale for integrating mental health strategies into preventing violent extremism. The authors describe a community-based targeted violence prevention model and the potential roles of mental health professionals. This model consists of a multidisciplinary team that assesses at-risk individuals with comprehensive threat and behavioral evaluations, arranges for ongoing support and treatment, conducts follow-up evaluations, and offers outreach, education, and resources for communities. This model would enable mental health professionals in local communities to play key roles in preventing violent extremism through their practice and leadership.

  9. The Long-Term Effectiveness of the Family Check-Up on School-Age Conduct Problems: Moderation by Neighborhood Deprivation

    PubMed Central

    Shaw, Daniel S.; Sitnick, Stephanie L.; Brennan, Lauretta M.; Choe, Daniel E.; Dishion, Thomas J.; Wilson, Melvin N.; Gardner, Frances

    2016-01-01

    Several studies suggest that neighborhood deprivation is a unique risk factor in child and adolescent development of problem behavior. We sought to examine whether previously established intervention effects of the Family Check-Up (FCU) on child conduct problems at age 7.5 would persist through age 9.5, and whether neighborhood deprivation would moderate these effects. In addition, we examined whether improvements in parent-child interaction during early childhood associated with the FCU would be related to later reductions in child aggression among families living in the highest-risk neighborhoods. Using a multisite cohort of at-risk children identified on the basis of family, child, and socioeconomic risk and randomly assigned to the FCU, intervention effects were found to be moderated by neighborhood deprivation, such that they were only directly present for those living at moderate versus extreme levels of neighborhood deprivation. Additionally, improvements in child aggression were evident for children living in extreme neighborhood deprivation when parents improved the quality of their parent-child interaction during the toddler period (i.e., moderated mediation). Implications of the findings are discussed in relation to the possibilities and possible limitations in prevention of early problem behavior for those children living in extreme and moderate levels of poverty. PMID:26646197

  10. Novel spin transition between S = 5/2 and S = 3/2 in highly saddled iron(III) porphyrin complexes at extremely low temperatures.

    PubMed

    Ohgo, Yoshiki; Chiba, Yuya; Hashizume, Daisuke; Uekusa, Hidehiro; Ozeki, Tomoji; Nakamura, Mikio

    2006-05-14

    A novel spin transition between S = 5/2 and S = 3/2 has been observed for the first time in five-coordinate, highly saddled iron(III) porphyrinates by EPR and SQUID measurements at extremely low temperatures.

  11. EFFECTS OF ULTRAVIOLET RADIATION ON THE MODERATE HALOPHILE HALOMONAS ELONGATA AND THE EXTREME HALOPHILE HALOBACTERIUM SALINARUM

    EPA Science Inventory

    Both the moderately halophilic bacterium, Halomonas elongata, and the extremely halophilic archaeon, Halobacterium salinarum, can be found in hypersaline environments (e.g., salterns). On complex media, H. elongata grows over a salt range of 0.05-5.2 M, whereas H. salinarum multi...

  12. Lower-extremity musculoskeletal geometry affects the calculation of patellofemoral forces in vertical jumping and weightlifting.

    PubMed

    Cleather, D I; Bull, A M J

    2010-01-01

    The calculation of the patellofemoral joint contact force using three-dimensional (3D) modelling techniques requires a description of the musculoskeletal geometry of the lower limb. In this study, the influence of the complexity of the muscle model was studied by considering two different muscle models, the Delp and Horsman models. Both models were used to calculate the patellofemoral force during standing, vertical jumping, and Olympic-style weightlifting. The patellofemoral forces predicted by the Horsman model were markedly lower than those predicted by the Delp model in all activities and represented more realistic values when compared with previous work. This was found to be a result of a lower level of redundancy in the Delp model, which forced a higher level of muscular activation in order to allow a viable solution. The higher level of complexity in the Horsman model resulted in a greater degree of redundancy and consequently lower activation and patellofemoral forces. The results of this work demonstrate that a well-posed muscle model must have an adequate degree of complexity to create sufficient independence, variability, and number of moment arms in order to ensure adequate redundancy of the force-sharing problem such that muscle forces are not overstated.

  13. Mycobacterium bovis and Other Uncommon Members of the Mycobacterium tuberculosis Complex.

    PubMed

    Esteban, Jaime; Muñoz-Egea, Maria-Carmen

    2016-12-01

    Since its discovery by Theobald Smith, Mycobacterium bovis has been a human pathogen closely related to animal disease. At present, M. bovis tuberculosis is still an important problem in many countries and is considered the main cause of zoonotic tuberculosis throughout the world. The recent development of molecular epidemiological tools has helped us to improve our knowledge about transmission patterns of this organism, which causes a disease indistinguishable from that caused by Mycobacterium tuberculosis. Diagnosis and treatment of this mycobacterium are similar to those for conventional tuberculosis, with the important exceptions of constitutive resistance to pyrazinamide and the fact that multidrug-resistant and extremely drug-resistant M. bovis strains have been described. Among other members of this complex, Mycobacterium africanum is the cause of many cases of tuberculosis in West Africa and can be found in other areas mainly in association with immigration. M. bovis BCG is the currently available vaccine for tuberculosis, but it can cause disease in some patients. Other members of the M. tuberculosis complex are mainly animal pathogens with only exceptional cases of human disease, and there are even some strains, like "Mycobacterium canettii," a rare human pathogen that could play an important role in understanding the evolutionary history of tuberculosis.

  14. Evaluating reproducibility of differential expression discoveries in microarray studies by considering correlated molecular changes.

    PubMed

    Zhang, Min; Zhang, Lin; Zou, Jinfeng; Yao, Chen; Xiao, Hui; Liu, Qing; Wang, Jing; Wang, Dong; Wang, Chenguang; Guo, Zheng

    2009-07-01

    According to current consistency metrics such as the percentage of overlapping genes (POG), lists of differentially expressed genes (DEGs) detected in different microarray studies of a complex disease are often highly inconsistent. This irreproducibility problem also exists in other high-throughput post-genomic areas such as proteomics and metabolomics. A complex disease is often characterized by many coordinated molecular changes, which should be considered when evaluating the reproducibility of discovery lists from different studies. We proposed the metrics percentage of overlapping genes-related (POGR) and normalized POGR (nPOGR) to evaluate the consistency between two DEG lists for a complex disease, considering correlated molecular changes rather than only counting gene overlaps between the lists. Based on microarray datasets of three diseases, we showed that although the POG scores for DEG lists from different studies of each disease are extremely low, the POGR and nPOGR scores can be rather high, suggesting that the apparently inconsistent DEG lists may be highly reproducible in the sense that they are actually significantly correlated. Evaluating different discovery results for a disease with the POGR and nPOGR scores will obviously reduce the uncertainty of the microarray studies. The proposed metrics should also be applicable in many other high-throughput post-genomic areas.
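    The difference between counting raw overlaps and crediting correlated genes can be sketched as follows. These are illustrative forms of the two scores, not necessarily the paper's exact normalizations; the `correlated` mapping (which genes count as significantly correlated) is assumed to be supplied by the analyst:

```python
def pog(list1, list2):
    """Percentage of overlapping genes: shared genes over the shorter list's length."""
    shared = len(set(list1) & set(list2))
    return shared / min(len(list1), len(list2))

def pogr(list1, list2, correlated):
    """POG-related score: a gene in list1 also counts as reproduced if it is
    significantly correlated (per the supplied `correlated` mapping) with
    some gene in list2, not only if it literally overlaps."""
    hits = set(list2)
    count = 0
    for gene in list1:
        if gene in hits or any(c in hits for c in correlated.get(gene, ())):
            count += 1
    return count / len(list1)
```

    Two lists with no literal overlap can still score highly under `pogr` when each gene in one list has a strongly correlated partner in the other, which is exactly the effect the abstract describes.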

  15. The AMchip04 and the processing unit prototype for the FastTracker

    NASA Astrophysics Data System (ADS)

    Andreani, A.; Annovi, A.; Beretta, M.; Bogdan, M.; Citterio, M.; Alberti, F.; Giannetti, P.; Lanza, A.; Magalotti, D.; Piendibene, M.; Shochet, M.; Stabile, A.; Tang, J.; Tompkins, L.; Volpi, G.

    2012-08-01

    Modern experiments search for extremely rare processes hidden in much larger background levels. As the experiments' complexity, the accelerator backgrounds, and the luminosity increase, we need increasingly complex and exclusive event selection. We present the first prototype of a new Processing Unit (PU), the core of the FastTracker processor (FTK). FTK is a real-time tracking device for the ATLAS experiment's trigger upgrade. The computing power of the PU is such that a few hundred of them will be able to reconstruct all the tracks with transverse momentum above 1 GeV/c in ATLAS events up to Phase II instantaneous luminosities (3 × 10^34 cm^-2 s^-1) with an event input rate of 100 kHz and a latency below a hundred microseconds. The PU provides massive computing power to minimize the online execution time of complex tracking algorithms. The time-consuming pattern recognition problem, generally referred to as the "combinatorial challenge", is solved by the Associative Memory (AM) technology, which exploits parallelism to the maximum extent: it compares the event to all pre-calculated "expectations" or "patterns" (pattern matching) simultaneously, looking for candidate tracks called "roads". This approach reduces the typical exponential complexity of CPU-based algorithms to linear behavior. Pattern recognition is completed by the time the data are loaded into the AM devices. We report on the design of the first Processing Unit prototypes. The design had to address the most challenging aspects of this technology: a huge number of detector clusters ("hits") must be distributed at high rate with very large fan-out to all patterns (10 million patterns will be located on 128 chips placed on a single board), and a huge number of roads must be collected and sent back to the FTK post-pattern-recognition functions. A network of high-speed serial links is used to solve the data distribution problem.
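    The pattern-matching step the AM chips perform in parallel can be sketched in software. Each stored pattern is a tuple of coarse-resolution hit IDs, one per detector layer; the hardware compares every pattern against the event simultaneously, while this sketch simply loops. The majority-logic threshold (`min_matched_layers`) is an assumption added for illustration, not a documented FTK parameter:

```python
def find_roads(patterns, event_hits, min_matched_layers=None):
    """Software sketch of associative-memory pattern matching.

    patterns        -- list of tuples of coarse hit IDs, one entry per layer
    event_hits      -- dict: layer index -> set of hit IDs seen in the event
    Returns the indices of the matched patterns ("roads"): those whose layers
    are all matched, or at least `min_matched_layers` of them if given.
    """
    roads = []
    for pid, pattern in enumerate(patterns):
        need = len(pattern) if min_matched_layers is None else min_matched_layers
        matched = sum(1 for layer, hit_id in enumerate(pattern)
                      if hit_id in event_hits.get(layer, set()))
        if matched >= need:
            roads.append(pid)
    return roads
```

    Because every pattern is tested against the same event in one pass, the work grows linearly with the number of hits rather than combinatorially with hit pairings, which is the behavior the abstract attributes to the AM approach.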

  16. Cooperative Efforts in Fuels Management

    Treesearch

    Gerald L. Adams

    1995-01-01

    Our forests have been neglected or protected to death, creating an extreme wildfire risk in wildland-urban intermix communities. We as agencies and organizations are just now beginning to understand that the fuel problems we have across the western states are not a single agency's problem, but "our problem." Wildfires do not respect boundaries, be they...

  17. Mental health assessed by the Strengths and Difficulties Questionnaire for children born extremely preterm without severe disabilities at 11 years of age: a Norwegian, national population-based study.

    PubMed

    Fevang, Silje Katrine Elgen; Hysing, Mari; Sommerfelt, Kristian; Elgen, Irene

    2017-12-01

    The aims were to investigate mental health problems, using the Strengths and Difficulties Questionnaire (SDQ), in children born extremely preterm/extremely low birth weight (EP/ELBW) without severe disabilities compared to controls, and to identify peri- or neonatal factors possibly predicting later mental health problems. A national Norwegian cohort of 11-year-old EP/ELBW children, excluding those with intellectual disabilities, non-ambulatory cerebral palsy, blindness, and/or deafness, was assessed. Parents and teachers completed the SDQ. Mean scores and scores ≥90th percentile for the control group, combined ratings (parent and/or teacher reporting the child ≥90th percentile), and pervasive ratings (both parent and teacher reporting the child ≥90th percentile) were presented. The controls consisted of an unselected population of all 11-year-old children born in 1995 who attended public or private schools in Bergen. Of the eligible children, 216 (64%) EP/ELBW and 1882 (61%) control children participated. The EP/ELBW children had significantly higher scores and/or increased risk of parent-, teacher-, combined-, and pervasive-rated hyperactivity/inattention, emotional, and peer problems (OR 2.1-6.3). Only parents reported the EP/ELBW children to be at an increased risk of conduct problems (OR 1.6, 95% CI 1.1-2.6). Only low maternal education at birth was significantly associated with mental health problems at 11 years of age (OR 2.5, 95% CI 1.2-5.4). EP/ELBW children without severe disabilities had an increased risk of symptoms of hyperactivity/inattention, emotional problems, and peer problems. None of the peri- or neonatal factors was significantly associated with later mental health problems, except for low maternal education.

  18. Making Energy-Water Nexus Scenarios more Fit-for-Purpose through Better Characterization of Extremes

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Levy, M. A.; Chen, R. S.; Schnarr, E.

    2017-12-01

    Often, quantitative scenarios of future trends exhibit less variability than the historic data upon which the models that generate them are based. The problem of dampened variability, which typically also entails dampened extremes, manifests both temporally and spatially. As a result, risk assessments that rely on such scenarios are in danger of producing misleading results. This danger is pronounced in nexus issues because of the multiple dimensions of change that are relevant. We illustrate this problem by developing alternative joint distributions of the probability of drought and of human population totals across U.S. counties over the period 2010-2030. For the dampened-extremes case we use drought frequencies derived from climate models used in the U.S. National Climate Assessment and the Environmental Protection Agency's population and land use projections contained in its Integrated Climate and Land Use Scenarios (ICLUS). For the elevated-extremes case we use an alternative spatial drought frequency estimate based on tree-ring data covering a 555-year period (Ho et al. 2017), and we introduce greater temporal and spatial extremes in the ICLUS socioeconomic projections so that they conform to observed extremes in the historical U.S. spatial census data, 1790-present (National Historical Geographic Information System). We use spatial and temporal coincidence of high population and extreme drought as a proxy for energy-water nexus risk. We compare the representation of risk in the dampened-extremes and elevated-extremes scenario analyses. We identify areas of the country where using more realistic portrayals of extremes makes the biggest difference in estimated risk and suggest implications for future risk assessments. Reference: Michelle Ho, Upmanu Lall, Xun Sun, Edward R. Cook. 2017. Multiscale temporal variability and regional patterns in 555 years of conterminous U.S. streamflow. Water Resources Research. doi:10.1002/2016WR019632

  19. High-pressure lime injection.

    DOT National Transportation Integrated Search

    1965-08-01

    The presence of unstable soils in many areas of Louisiana results in numerous problems in design and construction in these areas. These problem soils are primarily of two categories, the first of which consists of the high clay contents and extreme p...

  20. Hay preservation with propionic acid

    USDA-ARS?s Scientific Manuscript database

    Most hay producers are quite familiar with the problems associated with baling moist hays. Normally, these problems include spontaneous heating, increased evidence of mold, losses of dry matter (DM) during storage, poorer nutritive value, and (in extreme cases) spontaneous combustion. Numerous fact...

  1. Technical Parameters Modeling of a Gas Probe Foaming Using an Active Experimental Type Research

    NASA Astrophysics Data System (ADS)

    Tîtu, A. M.; Sandu, A. V.; Pop, A. B.; Ceocea, C.; Tîtu, S.

    2018-06-01

    The present paper deals with a current and complex topic: solving a technical problem regarding the modeling, and then the optimization, of technical parameters related to the natural gas extraction process. The subject of the study is to optimize gas probe foaming, using experimental research methods and data processing, through regular probe interventions with different foaming agents. This procedure reduces the hydrostatic pressure through foam formation from the deposit water and the foaming agent, which can then be removed from the well by the produced gas flow. The probe production data were analyzed, and the candidate well for the research itself emerged. This is an extremely complex study, carried out on field operations; it was found that, due to severe gas field depletion, the well flows decrease and the wells begin loading with deposit water. Regular foaming of the wells was required to optimize the daily production flow and dispose of the water accumulated in the wellbore. To analyze the natural gas production process, the factorial experiment, among other methods, was used, chosen because it can offer very good research results from a small number of experimental data points. Finally, through this study the extraction process problems were identified by analyzing and optimizing the technical parameters, which led to a quality improvement of the extraction process.

  2. Rethinking Use of the OML Model in Electric Sail Development

    NASA Technical Reports Server (NTRS)

    Stone, Nobie H.

    2016-01-01

    In 1924, Irving Langmuir and H. M. Mott-Smith published a theoretical model for the complex plasma sheath phenomenon in which they identified some very special cases that greatly simplify the sheath and allow a closed-form solution to the problem. The most widely used application is the electrostatic, or "Langmuir," probe in a laboratory plasma. Although the Langmuir probe is physically simple (a biased wire), the theory describing its functional behavior and its current-voltage characteristic is extremely complex and, accordingly, a number of assumptions and approximations are used in the Langmuir-Mott-Smith (LMS) model. These simplifications correspondingly place limits on the model's range of application. Adapting the LMS model to real-life conditions is the subject of numerous papers and dissertations. The Orbit-Motion-Limited (OML) model that is widely used today is one such adaptation, offering a convenient means of calculating sheath effects. Since the Langmuir probe is a simple biased wire immersed in plasma, it is particularly tempting to use the OML equation in calculating the characteristics of the long, highly biased wires of an Electric Sail in the solar wind plasma. However, in order to arrive at the OML equation, a number of additional simplifying assumptions and approximations (beyond those made by Langmuir and Mott-Smith) are necessary. The OML equation is a good approximation when all conditions are met, but it would appear that the Electric Sail problem lies outside its limits of applicability.
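    For a concrete sense of the quantity in question, the commonly quoted high-bias OML form for the electron current to a long cylinder at an attracting potential, I = I_th * (2/sqrt(pi)) * sqrt(1 + eV/kT), can be evaluated as below. This is an illustrative calculation of that textbook limiting form, not the paper's analysis, and all numerical inputs in the test are hypothetical:

```python
import math

def oml_cylinder_current(n, T_eV, V, radius, length):
    """Electron current drawn by a long cylinder at attracting bias V (volts)
    in the high-bias OML limit: I = I_th * (2/sqrt(pi)) * sqrt(1 + eV/kT).

    I_th is the random thermal electron current to the cylinder's surface.
    SI units; electrons only; validity rests on the OML sheath assumptions
    discussed in the abstract.
    """
    e = 1.602176634e-19      # elementary charge [C]
    m_e = 9.1093837015e-31   # electron mass [kg]
    kT = T_eV * e            # thermal energy [J]
    area = 2 * math.pi * radius * length
    I_th = n * e * area * math.sqrt(kT / (2 * math.pi * m_e))
    return I_th * (2 / math.sqrt(math.pi)) * math.sqrt(1 + e * V / kT)
```

    The square-root growth of collected current with bias is exactly the dependence that makes the formula tempting for Electric Sail wires, and the hidden sheath assumptions are what the abstract argues may break down there.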

  3. A Tensor-Train accelerated solver for integral equations in complex geometries

    NASA Astrophysics Data System (ADS)

    Corona, Eduardo; Rahimian, Abtin; Zorin, Denis

    2017-04-01

    We present a framework using the Quantized Tensor Train (QTT) decomposition to accurately and efficiently solve volume and boundary integral equations in three dimensions. We describe how the QTT decomposition can be used as a hierarchical compression and inversion scheme for matrices arising from the discretization of integral equations. For a broad range of problems, the computational and storage costs of the inversion scheme are extremely modest, O(log N), and once the inverse is computed, it can be applied in O(N log N). We analyze the QTT ranks for hierarchically low-rank matrices and discuss the relationship to commonly used hierarchical compression techniques such as FMM and HSS. We prove that the QTT ranks are bounded for translation-invariant systems and argue that this behavior extends to non-translation-invariant volume and boundary integrals. For volume integrals, the QTT decomposition provides an efficient direct solver requiring significantly less memory compared to other fast direct solvers. We present results demonstrating the remarkable performance of the QTT-based solver when applied to both translation- and non-translation-invariant volume integrals in 3D. For boundary integral equations, we demonstrate that using a QTT decomposition to construct preconditioners for a Krylov subspace method leads to an efficient and robust solver with a small memory footprint. We test the QTT preconditioners in the iterative solution of an exterior elliptic boundary value problem (Laplace) formulated as a boundary integral equation in complex, multiply connected geometries.
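    The tensor-train factorization the QTT machinery builds on can be sketched with the standard TT-SVD construction: unfold the tensor one mode at a time and truncate each SVD. This is a minimal NumPy sketch of plain TT (no quantization of the mode sizes, no hierarchical inversion), not the authors' solver:

```python
import numpy as np

def tt_decompose(tensor, eps=1e-12):
    """Decompose a d-dimensional array into tensor-train cores via successive
    truncated SVDs (the TT-SVD algorithm). Each core has shape
    (rank_left, mode_size, rank_right)."""
    shape = tensor.shape
    d = len(shape)
    cores, r = [], 1
    rest = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        rest = rest.reshape(r * shape[k], -1)
        U, s, Vt = np.linalg.svd(rest, full_matrices=False)
        # Keep singular values above a relative threshold; this is where
        # the low-rank compression happens.
        rank = max(1, int(np.sum(s > eps * s[0])))
        cores.append(U[:, :rank].reshape(r, shape[k], rank))
        rest = s[:rank, None] * Vt[:rank]
        r = rank
    cores.append(rest.reshape(r, shape[d - 1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor (for verification)."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=(-1, 0))
    return out.reshape([c.shape[1] for c in cores])
```

    When the TT ranks stay bounded, as the paper proves for translation-invariant systems, storage drops from the full tensor's exponential size to the sum of the small core sizes.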

  4. [Revision hip arthroplasty by Waldemar Link custom-made total hip prosthesis].

    PubMed

    Medenica, Ivica; Luković, Milan; Radoicić, Dragan

    2010-02-01

    The number of patients undergoing hip arthroplasty revision is constantly growing. An especially complex problem is extensive loss of bone stock and pelvic discontinuity that requires reconstruction. The paper presents a 50-year-old patient who ten years earlier underwent total cemented arthroplasty of the left hip. A year after the primary operation the patient had difficulties walking without crutches. Problems intensified in the last five years; the patient had severe pain, totally limited movement in the left hip, and could not walk at all. Radiographically, we found a loose femoral component, massive loss of bone stock of the proximal femur, acetabular protrusion and a consequent pelvic discontinuity. Clinically, a completely dysfunctional left hip joint was registered (Harris hip score--7.1). We performed total rearthroplasty with a custom-made Waldemar Link total hip prosthesis with an acetabular antiprotrusio cage and compensation of bone defects with a graft from the bone bank. A year after the operation, we found clinically an extreme improvement in Harris hip score--87.8. Radiographically, we found stability of the implanted components, complete graft integration and bone bridging across the site of pelvic discontinuity. Pelvic discontinuity with massive loss of proximal femoral bone stock is a challenging and complex entity. Conventional prostheses cannot provide adequate fixation and stability of the hip. Application of a custom-made prosthesis (measured specifically for the patient) and additional allografting of bone defects is a good method in revision surgery after unsuccessful hip arthroplasty with extensive bone defects.

  5. Hadoop-Based Distributed System for Online Prediction of Air Pollution Based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.

    2015-12-01

    The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, have led scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data are extremely voluminous and, to be useful, must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on the Support Vector Machine (SVM) to predict air quality one day in advance. In order to overcome the computational requirements for large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massively parallel processing in this study. Based on the online algorithm and the Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is well suited to large-scale air pollution prediction problems.
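    The MapReduce model the authors adopt can be sketched in a few lines of plain Python (a toy single-process analogue, not the Hadoop implementation; the station/pollutant records are invented for illustration): mappers emit key/value pairs, a shuffle groups them by key, and reducers aggregate each group.

```python
from collections import defaultdict

# toy hourly records: (station, pollutant, concentration)
records = [
    ("S1", "PM2.5", 80.0), ("S1", "PM2.5", 120.0),
    ("S2", "PM2.5", 60.0), ("S2", "NO2", 40.0),
]

def map_phase(record):
    station, pollutant, value = record
    yield (pollutant, value)              # emit key/value pairs

def shuffle(pairs):
    groups = defaultdict(list)            # group all values by key
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(key, values):
    return key, sum(values) / len(values)  # mean concentration per pollutant

pairs = [kv for r in records for kv in map_phase(r)]
result = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
```

    In Hadoop the map and reduce functions run on separate nodes and the shuffle happens over the network, which is what lets the SVM feature preparation scale over the voluminous input data.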

  6. Bidirectional extreme learning machine for regression problem and its learning effectiveness.

    PubMed

    Yang, Yimin; Wang, Yaonan; Yuan, Xiaofang

    2012-09-01

    It is clear that the learning effectiveness and learning speed of neural networks are in general far below what is required, which has been a major bottleneck for many applications. Recently, a simple and efficient learning method, referred to as the extreme learning machine (ELM), was proposed by Huang et al., which has shown that, compared to some conventional methods, the training time of neural networks can be reduced by a thousand times. However, one of the open problems in ELM research is whether the number of hidden nodes can be further reduced without affecting learning effectiveness. This brief proposes a new learning algorithm, called the bidirectional extreme learning machine (B-ELM), in which some hidden nodes are not randomly selected. In theory, this algorithm tends to reduce network output error to 0 at an extremely early learning stage. Furthermore, we find a relationship between the network output error and the network output weights in the proposed B-ELM. Simulation results demonstrate that the proposed method can be tens to hundreds of times faster than other incremental ELM algorithms.
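    The baseline ELM that B-ELM builds on is simple to sketch (a generic numpy version under standard assumptions, not the authors' B-ELM code): the hidden-layer weights are drawn at random and never trained, and only the output weights are solved analytically by least squares, which is where the speedup over backpropagation comes from.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy regression data: y = sin(x) on [-3, 3]
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])

L = 50                               # number of hidden nodes
W = rng.normal(size=(1, L))          # random input weights (never trained)
b = rng.normal(size=L)               # random biases (never trained)
H = np.tanh(X @ W + b)               # hidden-layer activation matrix

# the only trained parameters: output weights, by linear least squares
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
```

    The open problem the brief addresses is visible here: L is fixed in advance, and incremental ELM variants grow or (as in B-ELM) deliberately construct hidden nodes to drive the residual down with far fewer of them.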

  7. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.

    2017-12-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused major problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analysis of the situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget-groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enabled the development of a management concept for extreme groundwater situations that favors sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  8. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments on model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
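    The procedure can be sketched in one dimension (a toy stand-in for the geotechnical code; the function and design points are invented for illustration): run the expensive code at a handful of design points, fit a low-order polynomial response surface, then use the cheap surrogate in place of the code for repeated statistical evaluations.

```python
import numpy as np

def expensive_code(x):
    """Stand-in for a long-running analysis code (assumption:
    the true response is smooth in the input parameter)."""
    return 1.5 - 0.4 * x + 0.1 * x ** 2

# four "code calculations" at chosen design points
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = expensive_code(xs)

# fit a quadratic response surface to the four runs
coeffs = np.polyfit(xs, ys, deg=2)
surface = np.poly1d(coeffs)

# the surrogate now replaces the code in repeated (e.g. Monte Carlo) calls
x_new = 1.7
approx = surface(x_new)
```

    With five random soil parameters the surface becomes a multivariate polynomial, but the economics are the same: a few code runs buy millions of essentially free surrogate evaluations.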

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakosi, Jozsef; Christon, Mark A.; Francois, Marianne M.

    This report describes the work carried out for completion of the Thermal Hydraulics Methods (THM) Level 3 Milestone THM.CFD.P5.05 for the Consortium for Advanced Simulation of Light Water Reactors (CASL). A series of body-fitted computational meshes has been generated by Numeca's Hexpress/Hybrid, a.k.a. 'Spider', meshing technology for the V5H 3x3 and 5x5 rod bundle geometries used to compute the fluid dynamics of grid-to-rod fretting (GTRF). Spider is easy to use, fast, and automatically generates high-quality meshes for the extremely complex geometries required for the GTRF problem. Hydra-TH has been used to carry out large-eddy simulations on both 3x3 and 5x5 geometries, using different mesh resolutions. The results analyzed show good agreement with Star-CCM+ simulations and experimental data.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakosi, Jozsef; Christon, Mark A.; Francois, Marianne M.

    This report describes the work carried out for completion of the Thermal Hydraulics Methods (THM) Level 3 Milestone THM.CFD.P5.05 for the Consortium for Advanced Simulation of Light Water Reactors (CASL). A series of body-fitted computational meshes has been generated by Numeca's Hexpress/Hybrid, a.k.a. 'Spider', meshing technology for the V5H 3 x 3 and 5 x 5 rod bundle geometries and subsequently used to compute the fluid dynamics of grid-to-rod fretting (GTRF). Spider is easy to use, fast, and automatically generates high-quality meshes for the extremely complex geometries required for the GTRF problem. Hydra-TH has been used to carry out large-eddy simulations on both 3 x 3 and 5 x 5 geometries, using different mesh resolutions. The results analyzed show good agreement with Star-CCM+ simulations and experimental data.

  11. Extreme value analysis in biometrics.

    PubMed

    Hüsler, Jürg

    2009-04-01

    We review some approaches to extreme value analysis in the context of biometrical applications. Classical extreme value analysis is based on iid random variables. Two different general methods are applied, which are discussed together with biometrical examples. Various estimation, testing and goodness-of-fit procedures for applications are discussed. Furthermore, some non-classical situations are considered where the data are possibly dependent, where non-stationary behavior is observed in the data, or where the observations are not univariate. A few open problems are also stated.
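    One of the classical iid approaches reviewed in this literature, the block-maxima method, can be sketched with a method-of-moments Gumbel fit (a textbook recipe, not taken from this paper; the data are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)

# 50 "years" of daily observations; keep only the annual (block) maxima
daily = rng.exponential(scale=1.0, size=(50, 365))
annual_max = daily.max(axis=1)

# method-of-moments fit of a Gumbel distribution to the block maxima:
# mean = mu + gamma*beta, variance = (pi*beta)^2 / 6
gamma = 0.5772156649          # Euler-Mascheroni constant
beta_hat = annual_max.std(ddof=1) * np.sqrt(6) / np.pi
mu_hat = annual_max.mean() - gamma * beta_hat

# e.g. the 100-year return level from the fitted Gumbel quantile function
ret100 = mu_hat - beta_hat * np.log(-np.log(1 - 1 / 100))
```

    Maxima of exponential samples converge to a Gumbel law, so the fitted scale should be near 1 and the location near log(365); maximum-likelihood or GEV fits (covering all three extreme value families) are the more general tools the review discusses.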

  12. Effect of Environment-Based Coursework on the Nature of Attitudes toward the Endangered Species Act.

    ERIC Educational Resources Information Center

    Bright, Alan D.; Tarrant, Michael A.

    2002-01-01

    Examines college students' attitudes and complexity of thinking about the Endangered Species Act (ESA) and the effects of environment-based coursework on their attitudes and thinking. Investigates attitudes in terms of their direction, extremity, ambivalence, and importance and measures complexity of thinking as integrative complexity. (Contains…

  13. [Outlier cases in surgical disciplines. Micro-economic and macro-economic problems].

    PubMed

    Tecklenburg, A; Liebeneiner, J; Schaefer, O

    2009-09-01

    Postoperative complications will always occur, and their negative impact puts strain on patients, relatives and the attending physicians. The conversion to a remuneration system based on flat rates (diagnosis-related groups) presents additional economic problems for hospitals in some resource-intensive treatments. This particularly pertains to extremely cost-intensive cases, often surgical procedures, in which costs exceed revenue by a factor of 2. Here the economic risk increases with the number of interventions performed. Despite improvements in the remuneration system this problem persists. An improved payment for these treatments is desirable. To achieve this it is necessary to systematically analyze the extremely cost-intensive cases, involving experts of different medical disciplines, to create a data basis for a proposal of cost-covering payment.

  14. Neural architecture design based on extreme learning machine.

    PubMed

    Bueno-Crespo, Andrés; García-Laencina, Pedro J; Sancho-Gómez, José-Luis

    2013-12-01

    Selection of the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons and the corresponding interconnection weights. This problem has been widely studied, but existing solutions usually involve excessive computational cost and do not provide a unique answer. This paper proposes a new technique to efficiently design the MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides a high generalization capability and a unique solution for the architecture design. Moreover, the selected final network only retains those input connections that are relevant for the classification task. Experimental results show these advantages.

  15. Dismounted Complex Blast Injury.

    PubMed

    Andersen, Romney C; Fleming, Mark; Forsberg, Jonathan A; Gordon, Wade T; Nanos, George P; Charlton, Michael T; Ficke, James R

    2012-01-01

    The severe Dismounted Complex Blast Injury (DCBI) is characterized by high-energy injuries to the bilateral lower extremities (usually proximal transfemoral amputations) and/or upper extremity (usually involving the non-dominant side), in addition to open pelvic injuries, genitourinary, and abdominal trauma. Initial resuscitation and multidisciplinary surgical management appear to be the keys to survival. Definitive treatment follows general principles of open wound management and includes decontamination through aggressive and frequent debridement, hemorrhage control, viable tissue preservation, and appropriate timing of wound closure. These devastating injuries are associated with paradoxically favorable survival rates, but associated injuries and higher amputation levels lead to more difficult reconstructive challenges.

  16. A variational approach to probing extreme events in turbulent dynamical systems

    PubMed Central

    Farazmand, Mohammad; Sapsis, Themistoklis P.

    2017-01-01

    Extreme events are ubiquitous in a wide range of dynamical systems, including turbulent fluid flows, nonlinear waves, large-scale networks, and biological systems. We propose a variational framework for probing conditions that trigger intermittent extreme events in high-dimensional nonlinear dynamical systems. We seek the triggers as the probabilistically feasible solutions of an appropriately constrained optimization problem, where the function to be maximized is a system observable exhibiting intermittent extreme bursts. The constraints are imposed to ensure the physical admissibility of the optimal solutions, that is, significant probability for their occurrence under the natural flow of the dynamical system. We apply the method to a body-forced incompressible Navier-Stokes equation, known as the Kolmogorov flow. We find that the intermittent bursts of the energy dissipation are independent of the external forcing and are instead caused by the spontaneous transfer of energy from large scales to the mean flow via nonlinear triad interactions. The global maximizer of the corresponding variational problem identifies the responsible triad, hence providing a precursor for the occurrence of extreme dissipation events. Specifically, monitoring the energy transfers within this triad allows us to develop a data-driven short-term predictor for the intermittent bursts of energy dissipation. We assess the performance of this predictor through direct numerical simulations. PMID:28948226

  17. When self-reliance is not safe: associations between reduced help-seeking and subsequent mental health symptoms in suicidal adolescents.

    PubMed

    Labouliere, Christa D; Kleinman, Marjorie; Gould, Madelyn S

    2015-04-01

    The majority of suicidal adolescents have no contact with mental health services, and reduced help-seeking in this population further lessens the likelihood of accessing treatment. A commonly-reported reason for not seeking help is youths' perception that they should solve problems on their own. In this study, we explore associations between extreme self-reliance behavior (i.e., solving problems on your own all of the time), help-seeking behavior, and mental health symptoms in a community sample of adolescents. Approximately 2150 adolescents, across six schools, participated in a school-based suicide prevention screening program, and a subset of at-risk youth completed a follow-up interview two years later. Extreme self-reliance was associated with reduced help-seeking, clinically-significant depressive symptoms, and serious suicidal ideation at the baseline screening. Furthermore, in a subset of youth identified as at-risk at the baseline screening, extreme self-reliance predicted level of suicidal ideation and depressive symptoms two years later even after controlling for baseline symptoms. Given these findings, attitudes that reinforce extreme self-reliance behavior may be an important target for youth suicide prevention programs. Reducing extreme self-reliance in youth with suicidality may increase their likelihood of appropriate help-seeking and concomitant reductions in symptoms.

  18. When Self-Reliance Is Not Safe: Associations between Reduced Help-Seeking and Subsequent Mental Health Symptoms in Suicidal Adolescents

    PubMed Central

    Labouliere, Christa D.; Kleinman, Marjorie; Gould, Madelyn S.

    2015-01-01

    The majority of suicidal adolescents have no contact with mental health services, and reduced help-seeking in this population further lessens the likelihood of accessing treatment. A commonly-reported reason for not seeking help is youths’ perception that they should solve problems on their own. In this study, we explore associations between extreme self-reliance behavior (i.e., solving problems on your own all of the time), help-seeking behavior, and mental health symptoms in a community sample of adolescents. Approximately 2150 adolescents, across six schools, participated in a school-based suicide prevention screening program, and a subset of at-risk youth completed a follow-up interview two years later. Extreme self-reliance was associated with reduced help-seeking, clinically-significant depressive symptoms, and serious suicidal ideation at the baseline screening. Furthermore, in a subset of youth identified as at-risk at the baseline screening, extreme self-reliance predicted level of suicidal ideation and depressive symptoms two years later even after controlling for baseline symptoms. Given these findings, attitudes that reinforce extreme self-reliance behavior may be an important target for youth suicide prevention programs. Reducing extreme self-reliance in youth with suicidality may increase their likelihood of appropriate help-seeking and concomitant reductions in symptoms. PMID:25837350

  19. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    ERIC Educational Resources Information Center

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  20. Pediatric lower extremity mower injuries.

    PubMed

    Hill, Sean M; Elwood, Eric T

    2011-09-01

    Lawn mower injuries in children represent an unfortunately common problem for the plastic and reconstructive surgeon. There are approximately 68,000 per year reported in the United States. Compounding this problem is the fact that a standard treatment algorithm does not exist. This study follows a series of 7 pediatric patients treated for lower extremity mower injuries by a single plastic surgeon. The extent of soft tissue injury varied. All patients were treated with negative pressure wound therapy as a bridge to definitive closure. Of the 7 patients, 4 required skin grafts, 1 required primary closure, 1 underwent a lower extremity amputation secondary to wounds, and 1 was repaired using a cross-leg flap. Functional limitations were minimal for all of our patients after reconstruction. Our basic treatment algorithm is presented: initial debridement followed by the simplest method possible for wound closure, using negative pressure wound therapy if necessary.

  1. Spectroscopic, computational and electrochemical studies on the formation of the copper complex of 1-amino-4-hydroxy-9,10-anthraquinone and effect of it on superoxide formation by NADH dehydrogenase.

    PubMed

    Roy, Sanjay; Mondal, Palash; Sengupta, Partha Sarathi; Dhak, Debasis; Santra, Ramesh Chandra; Das, Saurabh; Guin, Partha Sarathi

    2015-03-28

    A 1 : 2 copper(II) complex of 1-amino-4-hydroxy-9,10-anthraquinone (QH) having the molecular formula CuQ2 was prepared and characterized by elemental analysis, NMR, FTIR, UV-vis and mass spectroscopy. The powder diffraction of the solid complex, magnetic susceptibility and ESR spectra were also recorded. The presence of the planar anthraquinone moiety in the complex makes it extremely difficult to obtain a single crystal suitable for X-ray diffraction studies. To overcome this problem, density functional theory (DFT) was used to evaluate an optimized structure of CuQ2. In the optimized structure, it was found that there is a tilt of the two planar aromatic anthraquinone rings of the complex with respect to each other in the two planes containing the O-Cu(II)-O plane. The present study is an important addition to the understanding of the structural aspects of metal-anthracyclines because there are only a few reports on the actual structures of metal-anthracyclines. The theoretical vibrational spectrum of the complex was assigned with the help of vibrational energy distribution analysis (VEDA) using potential energy distribution (PED) and compared with experimental results. Being important in producing the biochemical action of this class of molecules, the electrochemical behavior of the complex was studied in aqueous and non-aqueous solvents to find certain electrochemical parameters. In aqueous media, reduction involves a kinetic effect during electron transfer at an electrode surface, which was characterized very carefully using cyclic voltammetry. Electrochemical studies showed a significant modification in the electrochemical properties of 1-amino-4-hydroxy-9,10-anthraquinone (QH) when bound to Cu(II) in the complex compared to those observed for free QH. This suggests that the copper complex might be a good choice as a biologically active molecule, which was reflected in the lack of stimulated superoxide generation by the complex.

  2. Preservation of hay with propionic acid

    USDA-ARS?s Scientific Manuscript database

    Most hay producers are quite familiar with the problems associated with baling moist hays. Normally, these problems include spontaneous heating, increased evidence of mold, losses of dry matter (DM) during storage, poorer nutritive value, and (in extreme cases) spontaneous combustion. Numerous fact...

  3. Explicit Computations of Instantons and Large Deviations in Beta-Plane Turbulence

    NASA Astrophysics Data System (ADS)

    Laurie, J.; Bouchet, F.; Zaboronski, O.

    2012-12-01

    We use a path integral formalism and instanton theory in order to make explicit analytical predictions about large deviations and rare events in beta-plane turbulence. The path integral formalism is a concise way to get large deviation results in dynamical systems forced by random noise. In the simplest cases, it leads to the same results as the Freidlin-Wentzell theory, but it has a wider range of applicability. This approach is, however, usually extremely limited, due to the complexity of the theoretical problems. As a consequence it provides explicit results in a fairly limited number of models, often extremely simple ones with only a few degrees of freedom. Few exceptions exist outside the realm of equilibrium statistical physics. We will show that the barotropic model of beta-plane turbulence is one of these non-equilibrium exceptions. We describe sets of explicit solutions to the instanton equation, and precise derivations of the action functional (or large deviation rate function). The reason why such exact computations are possible is related to the existence of hidden symmetries and conservation laws for the instanton dynamics. We outline several applications of this approach. For instance, we compute explicitly the very low probability of observing flows with an energy much larger or smaller than the typical one. Moreover, we consider regimes for which the system has multiple attractors (corresponding to different numbers of alternating jets), and discuss the computation of transition probabilities between two such attractors. These extremely rare events are of the utmost importance as the dynamics undergo qualitative macroscopic changes during such transitions.

  4. From problem solving to problem definition: scrutinizing the complex nature of clinical practice.

    PubMed

    Cristancho, Sayra; Lingard, Lorelei; Regehr, Glenn

    2017-02-01

    In medical education, we have tended to present problems as being singular, stable, and solvable. Problem solving has, therefore, drawn much of medical education researchers' attention. This focus has been important but it is limited in terms of preparing clinicians to deal with the complexity of the 21st century healthcare system in which they will provide team-based care for patients with complex medical illness. In this paper, we use the Soft Systems Engineering principles to introduce the idea that in complex, team-based situations, problems usually involve divergent views and evolve with multiple solution iterations. As such we need to shift the conversation from (1) problem solving to problem definition, and (2) from a problem definition derived exclusively at the level of the individual to a definition derived at the level of the situation in which the problem is manifested. Embracing such a focus on problem definition will enable us to advocate for novel educational practices that will equip trainees to effectively manage the problems they will encounter in complex, team-based healthcare.

  5. Which Extreme Variant of the Problem-Solving Method of Teaching Should Be More Characteristic of the Many Teacher Variations of Problem-Solving Teaching?

    ERIC Educational Resources Information Center

    Mahan, Luther A.

    1970-01-01

    Compares the effects of two problem-solving teaching approaches. Lower ability students in an activity group demonstrated superior growth in basic science understanding, problem-solving skills, science interests, personal adjustment, and school attitudes. Neither method favored cognitive learning by higher ability students. (PR)

  6. Resource and Information Maintenance of Foreign Citizens in Russia: Statement of a Problem

    ERIC Educational Resources Information Center

    Dorozhkin, Evgenij M.; Leontyeva, Tatyana V.; Shchetynina, Anna V.; Krivtsov, Artem I.

    2016-01-01

    The relevance of the studied problem is determined by the fact that in a multiethnic country the problem of the ethno-cultural specificity of different groups of people is extremely severe, and the activity of the processes of intercultural communication in the modern world requires knowledge and understanding of other cultures. The aim of the…

  7. Highly Scalable Asynchronous Computing Method for Partial Differential Equations: A Path Towards Exascale

    NASA Astrophysics Data System (ADS)

    Konduri, Aditya

    Many natural and engineering systems are governed by nonlinear partial differential equations (PDEs) which result in multiscale phenomena, e.g. turbulent flows. Numerical simulations of these problems are computationally very expensive and demand extreme levels of parallelism. At realistic conditions, simulations are being carried out on massively parallel computers with hundreds of thousands of processing elements (PEs). It has been observed that communication between PEs, as well as their synchronization, at these extreme scales takes up a significant portion of the total simulation time and results in poor scalability of codes. This issue is likely to pose a bottleneck in the scalability of codes on future Exascale systems. In this work, we propose an asynchronous computing algorithm based on widely used finite difference methods to solve PDEs, in which synchronization between PEs due to communication is relaxed at a mathematical level. We show that while stability is conserved when schemes are used asynchronously, accuracy is greatly degraded. Since message arrivals at PEs are random processes, so is the behavior of the error. We propose a new statistical framework in which we show that average errors always drop to first order regardless of the original scheme. We propose new asynchrony-tolerant schemes that maintain accuracy when synchronization is relaxed. The quality of the solution is shown to depend not only on the physical phenomena and numerical schemes, but also on the characteristics of the computing machine. A novel algorithm using remote memory access communications has been developed to demonstrate excellent scalability of the method for large-scale computing. Finally, we present a path to extending this method to solving complex multi-scale problems on Exascale machines.
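    The effect of relaxed synchronization can be illustrated on a 1-D heat equation where the interface between two notional PEs reads possibly stale neighbour values (a toy serial emulation with invented parameters, not the author's asynchronous solver): the scheme keeps running with old halo data instead of waiting for the exchange to complete.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1D heat equation u_t = u_xx on [0,1], explicit central differences,
# split across two notional "PEs" that own the left and right halves
n = 64
dx = 1.0 / n
dt = 0.5 * dx ** 2                    # at the explicit stability limit
u = np.sin(np.pi * np.linspace(0.0, 1.0, n + 1))
mid = n // 2
# each PE keeps a possibly stale copy of its neighbour's boundary value
halo = {"left": u[mid - 1], "right": u[mid]}

for step in range(200):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2
    # interface points use the (possibly delayed) halo values instead
    lap[mid] = (u[mid + 1] - 2 * u[mid] + halo["left"]) / dx ** 2
    lap[mid - 1] = (halo["right"] - 2 * u[mid - 1] + u[mid - 2]) / dx ** 2
    u = u + dt * lap
    if rng.random() < 0.5:            # the halo exchange only sometimes lands
        halo = {"left": u[mid - 1], "right": u[mid]}
```

    The update at the interface remains a convex combination of current and past values, so the solution stays bounded and decays (stability is retained), while the stale data injects a random, delay-dependent error, which is the accuracy degradation the statistical framework quantifies.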

  8. Relations between work and upper extremity musculoskeletal problems (UEMSP) and the moderating role of psychosocial work factors on the relation between computer work and UEMSP.

    PubMed

    Nicolakakis, Nektaria; Stock, Susan R; Abrahamowicz, Michal; Kline, Rex; Messing, Karen

    2017-11-01

    Computer work has been identified as a risk factor for upper extremity musculoskeletal problems (UEMSP). But few studies have investigated how psychosocial and organizational work factors affect this relation. Nor have gender differences in the relation between UEMSP and these work factors been studied. We sought to estimate: (1) the association between UEMSP and a range of physical, psychosocial and organizational work exposures, including the duration of computer work, and (2) the moderating effect of psychosocial work exposures on the relation between computer work and UEMSP. Using 2007-2008 Québec survey data on 2478 workers, we carried out gender-stratified multivariable logistic regression modeling and two-way interaction analyses. In both genders, odds of UEMSP were higher with exposure to high physical work demands and emotionally demanding work. Additionally among women, UEMSP were associated with duration of occupational computer exposure, sexual harassment, tense situations when dealing with clients, high quantitative demands and lack of prospects for promotion, and among men, with low coworker support, episodes of unemployment, low job security and contradictory work demands. Among women, the effect of computer work on UEMSP was considerably increased in the presence of emotionally demanding work, and may also be moderated by low recognition at work, contradictory work demands, and low supervisor support. These results suggest that the relations between UEMSP and computer work are moderated by psychosocial work exposures and that the relations between working conditions and UEMSP are somewhat different for each gender, highlighting the complexity of these relations and the importance of considering gender.

  9. Changes in Extreme Events and the Potential Impacts on National Security

    NASA Astrophysics Data System (ADS)

    Bell, J.

    2017-12-01

    Extreme weather and climate events affect human health by causing death, injury, and illness, as well as having large socio-economic impacts. Climate change has caused changes in extreme event frequency, intensity and geographic distribution, and will continue to be a driver of changes in the future. Some of the extreme events that have already changed are heat waves, droughts, wildfires, flooding rains, coastal flooding, storm surge, and hurricanes. The pathways connecting extreme events to health outcomes and economic losses can be diverse and complex. The difficulty in predicting these relationships comes from the local intricacies of societal and environmental factors that influence the level of exposure. The goal of this presentation is to discuss the national security implications of changes in extreme weather events and demonstrate how changes in extremes can lead to a host of cascading issues. To illustrate this point, this presentation will provide examples of the various pathways by which extreme events can increase disease burden and cause economic stress.

  10. Implicit methods for efficient musculoskeletal simulation and optimal control

    PubMed Central

    van den Bogert, Antonie J.; Blana, Dimitra; Heinrich, Dieter

    2011-01-01

    The ordinary differential equations for musculoskeletal dynamics are often numerically stiff and highly nonlinear. Consequently, simulations require small time steps, and optimal control problems are slow to solve and have poor convergence. In this paper, we present an implicit formulation of musculoskeletal dynamics, which leads to new numerical methods for simulation and optimal control, with the expectation that these problems can be mitigated. A first-order Rosenbrock method was developed for solving forward dynamic problems using the implicit formulation. It was used to perform real-time dynamic simulation of a complex shoulder-arm system with extreme dynamic stiffness. Simulations had an RMS error of only 0.11 degrees in joint angles when running at real-time speed. For optimal control of musculoskeletal systems, a direct collocation method was developed for implicitly formulated models. The method was applied to predict gait with a prosthetic foot and ankle. Solutions were obtained in well under one hour of computation time and demonstrated how patients may adapt their gait to compensate for limitations of a specific prosthetic limb design. The optimal control method was also applied to a state estimation problem in sports biomechanics, where forces during skiing were estimated from noisy and incomplete kinematic data. Using a full musculoskeletal dynamics model for state estimation had the additional advantage that forward dynamic simulations could be done with the same implicitly formulated model to simulate injuries and perturbation responses. While these methods are powerful and allow solution of previously intractable problems, considerable numerical challenges remain, especially related to the convergence of gradient-based solvers. PMID:22102983
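    As a minimal illustration of why implicit integration helps with stiff dynamics (a generic sketch, not the paper's Rosenbrock shoulder-arm implementation), consider the classic stiff test equation y' = -k(y - cos t): at step sizes where forward (explicit) Euler blows up, backward (implicit) Euler still tracks the solution.

    ```python
    import math

    def simulate(h, steps, implicit, k=1000.0):
        """Integrate the stiff test ODE y' = -k*(y - cos(t)) from y(0) = 0.
        Backward Euler solves the update equation for y_{n+1}; for this
        linear ODE that solve is available in closed form."""
        y, t = 0.0, 0.0
        for _ in range(steps):
            if implicit:
                # y_{n+1} = y_n + h*(-k*(y_{n+1} - cos(t_{n+1})))
                t += h
                y = (y + h * k * math.cos(t)) / (1.0 + h * k)
            else:
                y = y + h * (-k * (y - math.cos(t)))
                t += h
        return y

    # Step size far above the explicit stability limit (h > 2/k):
    # forward Euler diverges while backward Euler stays near cos(t).
    h, T = 0.01, 1.0
    imp = simulate(h, int(T / h), implicit=True)
    exp = simulate(h, int(T / h), implicit=False)
    ```

    The same contrast, magnified by the musculotendon equations, is what makes small time steps unavoidable for explicit simulation of musculoskeletal models.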

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.A. Krommes

    Fusion physics poses an extremely challenging, practically complex problem that does not yield readily to simple paradigms. Nevertheless, several of the theoretical tools and conceptual advances emphasized at the KaufmanFest 2007 have motivated and/or found application in the development of fusion-related plasma turbulence theory. A brief historical commentary is given on some aspects of that specialty, with emphasis on the role (and limitations) of Hamiltonian/symplectic approaches, variational methods, oscillation-center theory, and nonlinear dynamics. It is shown how to extract a renormalized ponderomotive force from the statistical equations of plasma turbulence, and the possibility of a renormalized K-χ theorem is discussed. An unusual application of quasilinear theory to the problem of plasma equilibria in the presence of stochastic magnetic fields is described. The modern problem of zonal-flow dynamics illustrates a confluence of several techniques, including (i) the application of nonlinear-dynamics methods, especially center-manifold theory, to the problem of the transition to plasma turbulence in the face of self-generated zonal flows; and (ii) the use of Hamiltonian formalism to determine the appropriate (Casimir) invariant to be used in a novel wave-kinetic analysis of systems of interacting zonal flows and drift waves. Recent progress in the theory of intermittent chaotic statistics and the generation of coherent structures from turbulence is mentioned, and an appeal is made for some new tools to cope with these interesting and difficult problems in nonlinear plasma physics. Finally, the important influence of the intellectually stimulating research environment fostered by Prof. Allan Kaufman on the author's thinking and teaching methodology is described.

  12. Sectional methods for aggregation problems: application to volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Rossi, E.

    2016-12-01

    Particle aggregation is a general problem common to several scientific disciplines, such as planetary formation, the food industry and aerosol science. The ordinary approach to this class of problems relies on the solution of the Smoluchowski Coagulation Equations (SCE), a set of Ordinary Differential Equations (ODEs) derived from the Population Balance Equations (PBE), which describe the change in time of an initial grain-size distribution due to the interaction of single particles. The frequency of particle collisions and their sticking efficiencies depend on the specific problem under analysis, but the mathematical framework and the possible solutions to the ODEs are largely discipline-independent and very general. In this work we focus on the problem of volcanic ash aggregation, since it represents an extreme case of complexity that can be relevant to other disciplines as well. Volcanic ash aggregates observed during fallout are characterized by significant porosities and do not fit simplified descriptions based on monomer-like structures or fractal geometries. We propose a bidimensional approach to the PBEs which uses additive (mass) and non-additive (volume) internal descriptors in order to better characterize the evolution of volcanic ash aggregation. In particular, we use sectional methods (fixed pivot) to discretize the internal parameter space. This algorithm has been applied to a one-dimensional volcanic plume model in order to investigate how the Total Grain Size Distribution (TGSD) changes throughout the eruptive column in real scenarios (i.e. Eyjafjallajokull 2010, Sakurajima 2013 and Mt. Saint Helens 1980).
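    The discrete Smoluchowski equations the abstract builds on can be sketched in a few lines. This is a toy stand-in for the fixed-pivot sectional scheme (constant collision kernel, forward-Euler stepping), not the authors' bidimensional code; note that the scheme conserves total mass exactly in the resolved size bins.

    ```python
    # Discrete Smoluchowski coagulation: n[i] is the number density of
    # aggregates made of (i+1) monomers; K is a constant collision kernel.
    def smoluchowski_step(n, K, dt):
        m = len(n)
        dn = [0.0] * m
        for i in range(m):
            for j in range(m):
                rate = K * n[i] * n[j]
                dn[i] -= rate                # i-mers consumed by any collision
                if i + j + 1 < m:            # sizes (i+1)+(j+1) merge
                    dn[i + j + 1] += 0.5 * rate  # 0.5 corrects double counting
        return [n[k] + dt * dn[k] for k in range(m)]

    # Start from monomers only; total mass in the resolved bins is conserved
    # until aggregates grow past the largest tracked size.
    n = [1.0] + [0.0] * 19
    for _ in range(100):
        n = smoluchowski_step(n, K=1.0, dt=0.01)
    mass = sum((k + 1) * n[k] for k in range(len(n)))
    total_n = sum(n)  # total number density decreases as particles merge
    ```

    For the constant kernel the total number density decays analytically as N0/(1 + K*N0*t/2), which the discrete solution approximates.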

  13. Data-assisted reduced-order modeling of extreme events in complex dynamical systems

    PubMed Central

    Koumoutsakos, Petros

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of the governing equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while in regions where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data-streams or the imperfect model alone. Notably, the improvement is more significant in regions associated with extreme events, where data is sparse. PMID:29795631

  14. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    PubMed

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of the governing equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while in regions where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data-streams or the imperfect model alone. Notably, the improvement is more significant in regions associated with extreme events, where data is sparse.
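    The core mismatch-learning idea can be sketched in miniature. Here a plain least-squares fit stands in for the LSTM-RNN, and the one-dimensional maps are invented purely for illustration: an imperfect model is corrected by a term fitted to the observed model-data residuals.

    ```python
    import random

    # A "true" one-step map and an imperfect reduced model of it
    # (the model is missing the quadratic term).
    def true_step(x):
        return 0.9 * x + 0.2 * x * x

    def model_step(x):
        return 0.9 * x

    # "Data streams": observed one-step transitions from the true system.
    random.seed(0)
    xs = [random.uniform(-1, 1) for _ in range(200)]
    residuals = [(x, true_step(x) - model_step(x)) for x in xs]

    # Stand-in for the LSTM in the paper: fit the model-data mismatch with a
    # closed-form least-squares regression r = c * x^2 on the residuals.
    num = sum((x * x) * r for x, r in residuals)
    den = sum((x * x) ** 2 for x, _ in residuals)
    c = num / den

    def hybrid_step(x):
        # imperfect model as baseline + data-driven correction
        return model_step(x) + c * x * x
    ```

    The hybrid prediction falls back to the bare model wherever no residual data exists, mirroring the paper's design choice of keeping the imperfect model as a baseline in data-sparse regions.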

  15. Local Difference Measures between Complex Networks for Dynamical System Model Evaluation

    PubMed Central

    Lange, Stefan; Donges, Jonathan F.; Volkholz, Jan; Kurths, Jürgen

    2015-01-01

    A faithful modeling of real-world dynamical systems necessitates model evaluation. A recent promising methodological approach to this problem has been based on complex networks, which in turn have proven useful for the characterization of dynamical systems. In this context, we introduce three local network difference measures and demonstrate their capabilities in the field of climate modeling, where these measures facilitate a spatially explicit model evaluation. Building on a recent study by Feldhoff et al. [1], we comparatively analyze statistical and dynamical regional climate simulations of the South American monsoon system. Three types of climate networks representing different aspects of rainfall dynamics are constructed from the modeled precipitation space-time series. Specifically, we define simple graphs based on positive as well as negative rank correlations between rainfall anomaly time series at different locations, and graphs based on spatial synchronization of extreme rain events. An evaluation against respective networks built from daily satellite data provided by the Tropical Rainfall Measuring Mission 3B42 V7 reveals far greater differences in model performance between network types for a fixed but arbitrary climate model than between climate models for a fixed but arbitrary network type. We identify two sources of uncertainty in this respect. Firstly, climate variability limits fidelity, particularly in the case of the extreme event network; and secondly, larger geographical link lengths render link misplacements more likely, most notably in the case of the anticorrelation network; both contributions are quantified using suitable ensembles of surrogate networks. Our model evaluation approach is applicable to any multidimensional dynamical system, and our simple graph difference measures in particular are highly versatile, as the graphs to be compared may be constructed in whatever way required.
Generalizations to directed as well as edge- and node-weighted graphs are discussed. PMID:25856374

  16. Online sparse Gaussian process based human motion intent learning for an electrically actuated lower extremity exoskeleton.

    PubMed

    Long, Yi; Du, Zhi-Jiang; Chen, Chao-Feng; Dong, Wei; Wang, Wei-Dong

    2017-07-01

    The most important step for a lower extremity exoskeleton is to infer human motion intent (HMI), which contributes to achieving human-exoskeleton collaboration. Since the user is in the control loop, the relationship between human-robot interaction (HRI) information and HMI is nonlinear and complicated, and is difficult to model with analytical approaches. The nonlinear mapping can instead be learned using machine learning approaches. Gaussian Process (GP) regression is suitable for high-dimensional, small-sample nonlinear regression problems, but is restrictive for large data sets due to its computational complexity. In this paper, an online sparse GP algorithm is constructed to learn the HMI. The original training dataset is collected while the user wears the exoskeleton system with friction compensation and performs movement as unconstrained as possible. The dataset has two kinds of data: (1) physical HRI, collected by torque sensors placed at the interaction cuffs of the active joints, i.e., the knee joints; and (2) joint angular position, measured by optical position sensors. To reduce the computational complexity of GP, grey relational analysis (GRA) is utilized to pare down the original dataset and provide the final training dataset. The hyper-parameters are optimized offline by maximizing the marginal likelihood and are then applied in the online GP regression algorithm. The HMI, i.e., the angular position of the human joints, is regarded as the reference trajectory for the mechanical legs. To verify the effectiveness of the proposed algorithm, experiments were performed on a subject at a natural speed. The experimental results show that the HMI can be obtained in real time, and the approach can be extended and employed in similar exoskeleton systems.

  17. Local difference measures between complex networks for dynamical system model evaluation.

    PubMed

    Lange, Stefan; Donges, Jonathan F; Volkholz, Jan; Kurths, Jürgen

    2015-01-01

    A faithful modeling of real-world dynamical systems necessitates model evaluation. A recent promising methodological approach to this problem has been based on complex networks, which in turn have proven useful for the characterization of dynamical systems. In this context, we introduce three local network difference measures and demonstrate their capabilities in the field of climate modeling, where these measures facilitate a spatially explicit model evaluation. Building on a recent study by Feldhoff et al. [8], we comparatively analyze statistical and dynamical regional climate simulations of the South American monsoon system. Three types of climate networks representing different aspects of rainfall dynamics are constructed from the modeled precipitation space-time series. Specifically, we define simple graphs based on positive as well as negative rank correlations between rainfall anomaly time series at different locations, and graphs based on spatial synchronization of extreme rain events. An evaluation against respective networks built from daily satellite data provided by the Tropical Rainfall Measuring Mission 3B42 V7 reveals far greater differences in model performance between network types for a fixed but arbitrary climate model than between climate models for a fixed but arbitrary network type. We identify two sources of uncertainty in this respect. Firstly, climate variability limits fidelity, particularly in the case of the extreme event network; and secondly, larger geographical link lengths render link misplacements more likely, most notably in the case of the anticorrelation network; both contributions are quantified using suitable ensembles of surrogate networks. Our model evaluation approach is applicable to any multidimensional dynamical system, and our simple graph difference measures in particular are highly versatile, as the graphs to be compared may be constructed in whatever way required.
Generalizations to directed as well as edge- and node-weighted graphs are discussed.

  18. Controlling Emergent Ferromagnetism at Complex Oxide Interfaces

    NASA Astrophysics Data System (ADS)

    Grutter, Alexander

    The emergence of complex magnetic ground states at ABO3 perovskite heterostructure interfaces is among the most promising routes towards highly tunable nanoscale materials for spintronic device applications. Despite recent progress, isolating and controlling the underlying mechanisms behind these emergent properties remains a highly challenging materials physics problem. In particular, generating and tuning ferromagnetism localized at the interface of two non-ferromagnetic materials is of fundamental and technological interest. An ideal model system in which to study such effects is the CaRuO3/CaMnO3 interface, where the constituent materials are paramagnetic and antiferromagnetic in the bulk, respectively. Due to small fractional charge transfer to the CaMnO3 (0.07 e-/Mn) from the CaRuO3, the interfacial Mn ions are in a canted antiferromagnetic state. The delicate balance between antiferromagnetic superexchange and ferromagnetic double exchange results in a magnetic ground state which is extremely sensitive to perturbations. We exploit this sensitivity to achieve control of the magnetic interface, tipping the balance between ferromagnetic and antiferromagnetic interactions through octahedral connectivity modification. Such connectivity effects are typically tightly confined to interfaces, but by targeting a purely interfacial emergent magnetic system, we achieve drastic alterations to the magnetic ground state. These results demonstrate the extreme sensitivity of the magnetic state to the magnitude of the charge transfer, suggesting the potential for direct electric field control. We achieve such electric field control through direct back gating of a CaRuO3/CaMnO3 bilayer. Thus, the CaRuO3/CaMnO3 system provides new insight into how charge transfer, interfacial symmetry, and electric fields may be used to control ferromagnetism at the atomic scale.

  19. Compositional patterns in the genomes of unicellular eukaryotes.

    PubMed

    Costantini, Maria; Alvarez-Valin, Fernando; Costantini, Susan; Cammarano, Rosalia; Bernardi, Giorgio

    2013-11-05

    The genomes of multicellular eukaryotes are compartmentalized in mosaics of isochores, large and fairly homogeneous stretches of DNA that belong to a small number of families characterized by different average GC levels, different gene concentrations (which increase with GC), different chromatin structures, different replication timing in the cell cycle, and other properties. A question raised by these basic results concerns how far back in evolution the compartmentalized organization of eukaryotic genomes arose. In the present work we approached this problem by studying the compositional organization of the genomes of the unicellular eukaryotes for which full sequences are available, using a representative sample. The average GC levels of the genomes of unicellular eukaryotes cover an extremely wide range (19%-60% GC), and the compositional patterns of individual genomes are extremely different, but all genomes tested show a compositional compartmentalization. The average GC range of the genomes of unicellular eukaryotes is very broad (as broad as that of prokaryotes), and individual compositional patterns range from very narrow to very complex. Neither feature is surprising for organisms that are very distant from each other both in terms of phylogenetic distance and of environmental life conditions. Most importantly, all genomes tested, a representative sample of all supergroups of unicellular eukaryotes, are compositionally compartmentalized, a major difference from prokaryotes.

  20. Trait-based Affective Processes in Alcohol-Involved Risk Behaviors

    PubMed Central

    Wray, Tyler B.; Simons, Jeffrey S.; Dvorak, Robert D.; Gaher, Raluca M.

    2012-01-01

    This study tested a theoretical model of alcohol use, markers of extreme intoxication, and risk behavior as a function of trait affect, distress tolerance, and affect-based behavior dysregulation. Positive affective pathways to risk behavior were primarily expected to be indirect via high levels of alcohol use, while negative affect paths were expected to be more directly associated with engagement in risk behavior. In addition, we expected trait affectivity and distress tolerance to exhibit relationships with alcohol use and problems primarily through behavioral dysregulation occurring during extreme affective states. To evaluate these hypotheses, we tested a structural equation model (SEM) with three alcohol-related outcomes: "typical" alcohol use, "blackout" drinking, and risk behavior. Results were complex but generally supported the hypotheses. High trait negative affect and low tolerance for affective distress contribute to difficulty controlling behavior when negatively aroused, and this is directly associated with increased risk behavior when drinking. In contrast, associations between positive urgency and risk behaviors are indirect via increased alcohol consumption. Positive affectivity exhibited both inverse and positive effects in the model, with the net effect on alcohol outcomes being nonsignificant. These findings contribute important information about the distinct pathways between affect, alcohol use, and alcohol-involved risk behavior among college students. PMID:22770825

  1. Fundamentals of Cryogenics

    NASA Technical Reports Server (NTRS)

    Johnson, Wesley; Tomsik, Thomas; Moder, Jeff

    2014-01-01

    Analysis of the extreme conditions encountered in cryogenic systems demands great effort from analysts and engineers. Due to the costs and complexity associated with the extremely cold temperatures involved, testing is sometimes minimized and extra analysis is often relied upon. This short course is designed as an introduction to cryogenic engineering and analysis; it is intended to introduce the basic concepts related to cryogenic analysis and testing and to help the analyst understand the impacts of various requests on a test facility. Discussion will revolve around operational functions often found in cryogenic systems, hardware for both tests and facilities, and the design and modelling tools available for performing the analysis. Emphasis will be placed on which hardware and analysis tools to use in which scenarios to get the desired results. The class will provide a review of first principles, engineering practices, and relations directly applicable to this subject, including such topics as cryogenic fluids, thermodynamics and heat transfer, material properties at low temperature, insulation, cryogenic equipment, instrumentation, refrigeration, testing of cryogenic systems, cryogenic safety, and typical thermal and fluid analyses used by the engineer. The class will provide references for further learning on various topics in cryogenics for those who want to dive deeper into the subject or have encountered specific problems.

  2. Multiobjective generalized extremal optimization algorithm for simulation of daylight illuminants

    NASA Astrophysics Data System (ADS)

    Kumar, Srividya Ravindra; Kurian, Ciji Pearl; Gomes-Borges, Marcos Eduardo

    2017-10-01

    Daylight illuminants are widely used as references for color quality testing and optical vision testing applications. Presently used daylight simulators make use of fluorescent bulbs that are not tunable and occupy more space inside the quality testing chambers. By designing a spectrally tunable LED light source with an optimal number of LEDs, cost, space, and energy can be saved. This paper describes an application of the generalized extremal optimization (GEO) algorithm for selection of the appropriate quantity and type of LEDs that compose the light source. The multiobjective approach of this algorithm seeks the best spectral simulation with minimum fitness error toward the target spectrum, a correlated color temperature (CCT) matching that of the target spectrum, a high color rendering index (CRI), and the luminous flux required for testing applications. GEO is a global search algorithm based on phenomena of natural evolution and is especially designed for complex optimization problems. Several simulations have been conducted to validate the performance of the algorithm. The methodology applied to model the LEDs, together with the theoretical basis for the CCT and CRI calculations, is presented in this paper. A comparative analysis of the M-GEO evolutionary algorithm against the conventional deterministic Levenberg-Marquardt algorithm is also presented.
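    For readers unfamiliar with extremal optimization, the rank-based power-law selection at its core can be sketched on a toy bit-matching problem. This is a generic tau-EO illustration, not the M-GEO implementation used in the paper, and all parameters are invented for the example.

    ```python
    import random

    def extremal_optimization(target, tau=1.4, steps=2000, seed=1):
        """tau-EO sketch: repeatedly pick one of the worst-fit components
        via a power law over fitness ranks and randomize it."""
        rng = random.Random(seed)
        n = len(target)
        state = [rng.randint(0, 1) for _ in range(n)]
        best = list(state)
        for _ in range(steps):
            # local fitness of a bit: 1 if it matches the target, else 0;
            # sort worst-first, breaking ties randomly
            ranked = sorted(range(n),
                            key=lambda i: (state[i] == target[i], rng.random()))
            # power-law rank selection: rank 1 (worst) is most likely
            weights = [(k + 1) ** -tau for k in range(n)]
            i = rng.choices(ranked, weights=weights)[0]
            state[i] = rng.randint(0, 1)  # replace the chosen component at random
            if sum(s == t for s, t in zip(state, target)) > \
               sum(b == t for b, t in zip(best, target)):
                best = list(state)
        return best

    target = [1, 0] * 10
    best = extremal_optimization(target)
    score = sum(b == t for b, t in zip(best, target))
    ```

    Unlike a genetic algorithm there is no population and no explicit objective gradient; only the single adjustable parameter tau shapes the search.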

  3. A comparative analysis of support vector machines and extreme learning machines.

    PubMed

    Liu, Xueyi; Gao, Chuanhou; Li, Ping

    2012-09-01

    The theory of extreme learning machines (ELMs) has recently become increasingly popular. As a new learning algorithm for single-hidden-layer feed-forward neural networks, an ELM offers the advantages of low computational cost, good generalization ability, and ease of implementation. Hence the comparison and model selection between ELMs and other kinds of state-of-the-art machine learning approaches has become significant and has attracted many research efforts. This paper performs a comparative analysis of basic ELMs and support vector machines (SVMs) from two viewpoints that differ from previous work: one is the Vapnik-Chervonenkis (VC) dimension, and the other is performance under different training sample sizes. It is shown that the VC dimension of an ELM is equal to the number of hidden nodes of the ELM with probability one. Additionally, generalization ability and computational complexity are examined as the training sample size changes. ELMs have weaker generalization ability than SVMs for small samples but can generalize as well as SVMs for large samples. Remarkably, ELMs show great superiority in computational speed, especially for large-scale problems. The results obtained can provide insight into the essential relationship between the two methods, and can also serve as complementary knowledge for their past experimental and theoretical comparisons. Copyright © 2012 Elsevier Ltd. All rights reserved.
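    The ELM training step the abstract refers to, random hidden weights followed by a least-squares solve for the output weights, can be sketched as follows. The toy target function, the ridge term, and all parameter values are illustrative, not taken from the paper.

    ```python
    import math
    import random

    def gauss_solve(A, b):
        """Gaussian elimination with partial pivoting for a small linear system."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for c in range(n):
            p = max(range(c, n), key=lambda r: abs(M[r][c]))
            M[c], M[p] = M[p], M[c]
            for r in range(c + 1, n):
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
        return x

    def elm_fit(xs, ys, hidden=10, ridge=1e-6, seed=0):
        """ELM: random (weight, bias) hidden layer; output weights by
        ridge-regularized least squares on the hidden activations."""
        rng = random.Random(seed)
        W = [(rng.uniform(-2, 2), rng.uniform(-2, 2)) for _ in range(hidden)]
        H = [[math.tanh(w * x + b) for (w, b) in W] for x in xs]
        # beta = (H^T H + ridge*I)^-1 H^T y  (normal equations)
        HtH = [[sum(H[r][i] * H[r][j] for r in range(len(xs)))
                + (ridge if i == j else 0.0) for j in range(hidden)]
               for i in range(hidden)]
        Hty = [sum(H[r][i] * ys[r] for r in range(len(xs))) for i in range(hidden)]
        beta = gauss_solve(HtH, Hty)
        return lambda x: sum(bt * math.tanh(w * x + b)
                             for bt, (w, b) in zip(beta, W))

    # fit y = x^2 on [-1, 1]; only the output weights are trained
    xs = [i / 50 - 1 for i in range(101)]
    ys = [x * x for x in xs]
    f = elm_fit(xs, ys)
    err = max(abs(f(x) - x * x) for x in xs)
    ```

    The single linear solve is the source of the speed advantage the abstract reports over iterative SVM training.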

  4. Extreme Facial Expressions Classification Based on Reality Parameters

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Rad, Abdolvahab Ehsani; Rehman, Amjad; Altameem, Ayman

    2014-09-01

    Extreme expressions are emotional expressions stimulated by strong emotion; an example is an expression accompanied by tears. To provide these types of features, additional elements such as a fluid mechanism (particle system) and physics techniques such as smoothed particle hydrodynamics (SPH) are introduced. The fusion of facial animation with SPH exhibits promising results. Accordingly, the proposed fluid technique combined with facial animation is the core of this research, enabling complex expressions such as laughing, smiling, and crying (the emergence of tears), up to sadness intense enough to produce strong crying, as a classification of extreme expressions occurring on the human face.

  5. Evaluation of Arctic broadband surface radiation measurements

    NASA Astrophysics Data System (ADS)

    Matsui, N.; Long, C. N.; Augustine, J.; Halliwell, D.; Uttal, T.; Longenecker, D.; Niebergall, O.; Wendell, J.; Albee, R.

    2012-02-01

    The Arctic is a challenging environment for making in-situ surface radiation measurements. A standard suite of radiation sensors is typically designed to measure incoming and outgoing shortwave (SW) and thermal infrared, or longwave (LW), radiation. Enhancements may include various sensors for measuring irradiance in narrower bandwidths. Many solar radiation/thermal infrared flux sensors utilize protective glass domes and some are mounted on complex mechanical platforms (solar trackers) that keep sensors and shading devices trained on the sun along its diurnal path. High quality measurements require striking a balance between locating stations in a pristine undisturbed setting free of artificial blockage (such as from buildings and towers) and providing accessibility to allow operators to clean and maintain the instruments. Three significant sources of erroneous data in the Arctic include solar tracker malfunctions, rime/frost/snow deposition on the protective glass domes of the radiometers and operational problems due to limited operator access in extreme weather conditions. In this study, comparisons are made between the global and component sum (direct [vertical component] + diffuse) SW measurements. The difference between these two quantities (that theoretically should be zero) is used to illustrate the magnitude and seasonality of arctic radiation flux measurement problems. The problem of rime/frost/snow deposition is investigated in more detail for one case study utilizing both SW and LW measurements. Solutions to these operational problems that utilize measurement redundancy, more sophisticated heating and ventilation strategies and a more systematic program of operational support and subsequent data quality protocols are proposed.
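    The global-versus-component-sum comparison described above is a simple closure test; a hedged sketch (function and variable names are illustrative, units W/m^2):

    ```python
    import math

    def closure_residual(global_sw, direct_normal, diffuse, zenith_deg):
        """Component-sum check: global SW should equal the vertical component
        of the direct beam plus diffuse sky radiation. A persistent nonzero
        residual flags measurement problems such as tracker faults or
        rime/frost/snow on the radiometer domes."""
        component_sum = direct_normal * math.cos(math.radians(zenith_deg)) + diffuse
        return global_sw - component_sum
    ```

    In practice the residual would be screened over time and season, since the error sources discussed above (tracker malfunctions, dome deposition) have characteristic magnitudes and seasonality.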

  6. Evaluation of arctic broadband surface radiation measurements

    NASA Astrophysics Data System (ADS)

    Matsui, N.; Long, C. N.; Augustine, J.; Halliwell, D.; Uttal, T.; Longenecker, D.; Nievergall, O.; Wendell, J.; Albee, R.

    2011-08-01

    The Arctic is a challenging environment for making in-situ radiation measurements. A standard suite of radiation sensors is typically designed to measure the total, direct and diffuse components of incoming and outgoing broadband shortwave (SW) and broadband thermal infrared, or longwave (LW) radiation. Enhancements can include various sensors for measuring irradiance in various narrower bandwidths. Many solar radiation/thermal infrared flux sensors utilize protective glass domes and some are mounted on complex mechanical platforms (solar trackers) that rotate sensors and shading devices that track the sun. High quality measurements require striking a balance between locating sensors in a pristine undisturbed location free of artificial blockage (such as buildings and towers) and providing accessibility to allow operators to clean and maintain the instruments. Three significant sources of erroneous data include solar tracker malfunctions, rime/frost/snow deposition on the instruments and operational problems due to limited operator access in extreme weather conditions. In this study, a comparison is made between the global and component sum (direct [vertical component] + diffuse) shortwave measurements. The difference between these two quantities (that theoretically should be zero) is used to illustrate the magnitude and seasonality of radiation flux measurement problems. The problem of rime/frost/snow deposition is investigated in more detail for one case study utilizing both shortwave and longwave measurements. Solutions to these operational problems are proposed that utilize measurement redundancy, more sophisticated heating and ventilation strategies and a more systematic program of operational support and subsequent data quality protocols.

  7. Upper Extremity Artificial Limb Control as an Issue Related to Movement and Mobility in Daily Living

    ERIC Educational Resources Information Center

    Wallace, Steve; Anderson, David I.; Trujillo, Michael; Weeks, Douglas L.

    2005-01-01

    The 1992 NIH Research Planning Conference on Prosthetic and Orthotic Research for the 21st Century (Childress, 1992) recognized that the field of prosthetics lacks theoretical understanding and empirical studies on learning to control an upper-extremity prosthesis. We have addressed this problem using a novel approach in which persons without…

  8. Greedy algorithms in disordered systems

    NASA Astrophysics Data System (ADS)

    Duxbury, P. M.; Dobrin, R.

    1999-08-01

    We discuss search, minimal path and minimal spanning tree algorithms and their applications to disordered systems. Greedy algorithms solve these problems exactly, and are related to extremal dynamics in physics. Minimal cost path (Dijkstra) and minimal cost spanning tree (Prim) algorithms provide extremal dynamics for a polymer in a random medium (the KPZ universality class) and invasion percolation (without trapping) respectively.
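
    The two greedy algorithms named above can be sketched in a few lines of Python (an illustrative sketch, not the authors' code): Dijkstra's algorithm settles the cheapest frontier node for minimal-cost paths, while Prim's algorithm grows a tree one cheapest boundary edge at a time, the extremal growth rule the abstract links to invasion percolation.

```python
import heapq

def dijkstra(graph, source):
    """Minimal-cost paths: greedily settle the cheapest frontier node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already settled more cheaply
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist  # cheapest cost from source to every reachable node

def prim(graph, source):
    """Minimal spanning tree cost: greedily attach the cheapest boundary
    edge (edge-by-edge growth mirrors invasion percolation without trapping)."""
    visited = {source}
    heap = [(w, v) for v, w in graph[source]]
    heapq.heapify(heap)
    total = 0
    while heap and len(visited) < len(graph):
        w, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        total += w
        for v, wv in graph[u]:
            if v not in visited:
                heapq.heappush(heap, (wv, v))
    return total  # total weight of the minimal spanning tree
```

    Both run in O(E log V) time with a binary heap; in a disordered system, the edge weights would be the random costs of the medium.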

  9. Interception in three dimensions - An energy formulation

    NASA Technical Reports Server (NTRS)

    Rajan, N.; Ardema, M. D.

    1983-01-01

    The problem of minimum-time interception of a target flying in three-dimensional space is analyzed, with the interceptor aircraft modeled using the energy-state approximation. A coordinate transformation that uncouples the interceptor's extremals from the target motion in an open-loop sense is introduced, and the necessary conditions for optimality and the optimal controls are derived. Example extremals are shown.

  10. Are behaviour problems in extremely low-birthweight children related to their motor ability?

    PubMed

    Danks, Marcella; Cherry, Kate; Burns, Yvonne R; Gray, Peter H

    2017-04-01

    To investigate whether behaviour problems are independently related to mild motor impairment in 11-13-year-old children born preterm with extremely low birthweight (ELBW). The cross-sectional study included 48 (27 males) non-disabled, otherwise healthy ELBW children (<1000 g) and 55 (28 males) term-born peers. Parents reported behaviour using the Child Behaviour Checklist (CBCL). Children completed the Movement Assessment Battery for Children (Movement ABC). Extremely low birthweight children had poorer behaviour scores (CBCL Total Problem T score: mean difference = 5.89, 95% confidence interval = 10.29, 1.49, p = 0.009) and Movement ABC Total Motor Impairment Scores (ELBW group median = 17.5, IQR = 12.3; term-born group median = 7.5, IQR = 9, p < 0.01) than term-born peers. Behaviour was related to motor score (regression coefficient 2.16; 95% confidence interval 0.34, 3.97, p = 0.02) independent of gender, socio-economic factors or birthweight. Motor score had the strongest association with attention (ρ = 0.51; p < 0.01) and social behaviours (ρ = 0.50; p < 0.01). Behaviour problems of otherwise healthy 11- to 13-year-old ELBW children are not related to prematurity independent of their motor difficulties. Supporting improved motor competence in ELBW preteen children may support improved behaviour, particularly attention and social behaviours. ©2016 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  11. Pediatric Major Head Injury: Not a Minor Problem.

    PubMed

    Leetch, Aaron N; Wilson, Bryan

    2018-05-01

    Traumatic brain injury is a highly prevalent and devastating cause of morbidity and mortality in children. Assessment of the traumatized child should proceed in a rapid, stepwise fashion, addressing life-threatening problems first. Management focuses on preventing secondary injury from physiologic extremes such as hypoxemia, hypotension, prolonged hyperventilation, temperature extremes, and rapid changes in cerebral blood flow. Initial Glasgow Coma Score, hyperglycemia, and imaging are often prognostic of outcome. Surgically amenable lesions should be evacuated promptly. Reduction of intracranial pressure through hyperosmolar therapy, decompressive craniotomy, and seizure prophylaxis may be considered after stabilization. Nonaccidental trauma should be considered when evaluating pediatric trauma patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. The attitudes of neonatal nurses towards extremely preterm infants.

    PubMed

    Gallagher, Katie; Marlow, Neil; Edgley, Alison; Porock, Davina

    2012-08-01

    The paper is a report of a study of the attitudes of neonatal nurses towards extremely preterm infants. Alongside advancing survival at extremely preterm gestational ages, ethical debates concerning the provision of invasive care have proliferated in light of the high morbidity. Despite nurses being the healthcare professionals who work closest with the infant and their family, their potential influence is usually ignored when determining how parents come to decisions about future care for their extremely premature infant. Q methodology was employed to explore the attitudes of neonatal nurses towards caring for extremely preterm infants. Data were collected between 2007 and 2008 and analysed using PQMethod and Card Content Analysis. Thirty-six nurses from six neonatal units in the United Kingdom participated. Although there was consensus around the professional role of the nurse, when faced with the complexities of neonatal nursing three distinguishing factors emerged: the importance of parental choice in decision-making, the belief that technology should be used to assess response to treatment, and the belief that healthcare professionals should undertake difficult decisions. Neonatal nurses report unexpected difficulties in upholding their professionally defined role through highly complex and ever-varied decision-making processes. Recognition of individual attitudes to the care of extremely preterm infants and the role of the family in the face of difficult decisions should facilitate more open communication between the nurse and the parents and improve the experience of both the nurse and the family during these emotional situations. © 2011 Blackwell Publishing Ltd.

  13. Classification-Assisted Memetic Algorithms for Equality-Constrained Optimization Problems

    NASA Astrophysics Data System (ADS)

    Handoko, Stephanus Daniel; Kwoh, Chee Keong; Ong, Yew Soon

    Regression has successfully been incorporated into memetic algorithms (MA) to build surrogate models for the objective or constraint landscape of optimization problems. This helps to alleviate the need for expensive fitness function evaluations by performing local refinements on the approximated landscape. Classification can alternatively be used to assist an MA in choosing the individuals that will undergo refinement. A support-vector-assisted MA was recently proposed to reduce the number of function evaluations in inequality-constrained optimization problems by distinguishing regions of feasible solutions from infeasible ones based on past solutions, so that search efforts can be focused on a few potential regions only. For problems having equality constraints, however, the feasible space is obviously extremely small. It is thus extremely difficult for the global search component of the MA to produce feasible solutions, and the classification of feasible and infeasible space becomes ineffective. In this paper, a novel strategy to overcome this limitation is proposed, particularly for problems having one and only one equality constraint. The raw constraint value of an individual, instead of its feasibility class, is utilized in this work.
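
    One way to picture the idea of using the raw equality-constraint value rather than a feasible/infeasible label is the sketch below: a surrogate is fitted to past evaluations of the constraint h(x), and candidate individuals are ranked for local refinement by their predicted distance from h = 0. The linear surrogate and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_constraint_surrogate(X, h):
    """Least-squares linear surrogate of the raw equality-constraint
    value h(x), learned from previously evaluated individuals."""
    A = np.hstack([X, np.ones((len(X), 1))])   # affine features [x, 1]
    coef, *_ = np.linalg.lstsq(A, h, rcond=None)
    return coef

def refinement_order(coef, candidates):
    """Indices of candidates sorted by predicted |h(x)|, so that
    near-feasible individuals are refined first."""
    A = np.hstack([candidates, np.ones((len(candidates), 1))])
    return np.argsort(np.abs(A @ coef))
```

    A classifier trained on feasible/infeasible labels would see almost no feasible examples here; the raw value of h(x) still carries a usable ranking signal even when no individual is exactly feasible.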

  14. Toward Modeling the Intrinsic Complexity of Test Problems

    ERIC Educational Resources Information Center

    Shoufan, Abdulhadi

    2017-01-01

    The concept of intrinsic complexity explains why different problems of the same type, tackled by the same problem solver, can require different times to solve and yield solutions of different quality. This paper proposes a general four-step approach that can be used to establish a model for the intrinsic complexity of a problem class in terms of…

  15. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…

  16. Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation

    NASA Astrophysics Data System (ADS)

    Tchiguirinskaia, Ioulia; Scherzer, Daniel

    2016-04-01

    Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the nonlinear equations: scale invariance, meaning that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework, extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but also some more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. It is therefore no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In a certain way, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate a cyclone with the help of a 3D ordinary differential system. Well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for the predictability and prediction of processes having long-range dependences.

  17. The role of activity complexes in the distribution of solar magnetic fields.

    NASA Astrophysics Data System (ADS)

    García de La Rosa, J. I.; Reyes, R. C.

    Using published data on the large-scale distribution of solar activity, the authors conclude that the long-lived coronal holes are formed and maintained by the unbalanced magnetic flux which develops at both extremes of the complexes of activity.

  18. Governmental and Nongovernmental Youth Welfare in the New German Lander.

    ERIC Educational Resources Information Center

    Gawlik, Marion; And Others

    1994-01-01

    Survey of the general conditions of youth welfare departments in eastern Germany revealed severe money shortages. Increasing demands on youth welfare, rising social problems, right-wing extremism, and widespread unemployment among youths cause long-term social problems and prohibit effective youth welfare. (RJM)

  19. Multigrid Methods for Aerodynamic Problems in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Caughey, David A.

    1995-01-01

    Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.

  20. Quantifying uncertainties in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and the subsequent penetration of global energy markets during the last decades have resulted in the selection of new sites with various types of problems. Such problems arise due to the variability and the uncertainty of wind speed. The study of the lower and upper tails of the wind speed distribution may support the quantification of these uncertainties. Approaches focused on extreme wind conditions or on periods below the energy production threshold are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, turbine failures, repair times, and the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed based on tools of Extreme Value Theory. In particular, the study is focused on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results show that the proposed approaches converge, at least in the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
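
    As a concrete illustration of the Annual Maxima approach mentioned above, a Gumbel distribution can be fitted to yearly wind speed maxima and used to compute a T-year return level. The method-of-moments estimator below is an assumption for the sketch; the abstract does not specify which estimator was used.

```python
import math

EULER_GAMMA = 0.5772156649015329

def gumbel_fit_moments(annual_maxima):
    """Fit a Gumbel distribution to annual maxima by the method of
    moments, returning (location mu, scale beta)."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale from the variance
    mu = mean - EULER_GAMMA * beta          # location from the mean
    return mu, beta

def return_level(mu, beta, T):
    """Value exceeded on average once every T years: the T-year
    return level of the fitted Gumbel distribution."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

    The Peaks Over Threshold method would instead fit a generalized Pareto distribution to all exceedances above a high threshold, which typically uses the data more efficiently than one maximum per year.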

  1. Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions

    DTIC Science & Technology

    2015-07-14

    AFRL-OSR-VA-TR-2015-0202. Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions. Grant number FA9550-12-1-… The report examines team functioning as teams solve complex problems, and proposes means to improve the performance of teams under changing or adversarial conditions.

  2. Screening for Autism in Extremely Preterm Infants: Problems in Interpretation

    ERIC Educational Resources Information Center

    Moore, Tamanna; Johnson, Samantha; Hennessy, Enid; Marlow, Neil

    2012-01-01

    Aim: The aim of this article was to report the prevalence of, and risk factors for, positive autism screens using the Modified Checklist for Autism in Toddlers (M-CHAT) in children born extremely preterm in England. Method: All children born at not more than 26 weeks' gestational age in England during 2006 were recruited to the EPICure-2 study. At…

  3. The Extreme Ultraviolet Explorer science instruments development - Lessons learned

    NASA Technical Reports Server (NTRS)

    Malina, Roger F.; Battel, S.

    1991-01-01

    The science instruments development project for the Extreme Ultraviolet Explorer (EUVE) satellite is reviewed. Issues discussed include the philosophical basis of the program, the establishment of a tight development team, the approach to planning and phasing activities, the handling of the most difficult technical problems, and the assessment of the work done during the preimplementation period of the project.

  4. Deformation mechanisms in a coal mine roadway in extremely swelling soft rock.

    PubMed

    Li, Qinghai; Shi, Weiping; Yang, Renshu

    2016-01-01

    Roadway support in swelling soft rock was one of the challenging problems in mining. For most geological conditions, combinations of two or more supporting approaches could meet the requirements of most roadways; in extremely swelling soft rock, however, even combined approaches could not control large deformations. The purpose of this work was to probe the mechanisms of roadway deformation in extremely swelling soft rock. Based on the main return airway in a coal mine, deformation monitoring and geomechanical analysis were conducted, and a plastic-zone mechanical model was analysed. Results indicated that this soft rock had strong swelling potential. When the ground stress acted alone, the support strength needed in situ was not too large, and combined supporting approaches could meet this requirement; once the swelling potential was released, however, the roadway would undergo permanent deformation. When the loose zone reached 3 m within the surrounding rock, the remote stress p ∞ and supporting stress P showed a linear relationship: the greater the swelling stress, the more difficult the roadway would be to support. In such extremely swelling soft rock, a better way to control roadway deformation is therefore to control the release of the surrounding rock's swelling potential.

  5. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.

    2016-12-01

    Extreme weather conditions during the years 2009 - 2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, with problems especially in urban areas near rivers, where waterlogging produced huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analysis of the situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water-budget groundwater-flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes into consideration sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  6. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, Frido; Kramer, Stefanie; Koch, Thomas; Seifert, Sven; Monninkhoff, Bertram; Pfützner, Bernd

    2017-04-01

    Extreme weather conditions during the years 2009 - 2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, with problems especially in urban areas near rivers, where waterlogging produced huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analysis of the situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water-budget groundwater-flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes into consideration sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  7. An innovative early warning system for floods and operational risks in harbours

    NASA Astrophysics Data System (ADS)

    Smets, Steven; Bolle, Annelies; Mollaert, Justine; Buitrago, Saul; Gruwez, Vincent

    2016-04-01

    Early Warning Systems (EWS) are nowadays becoming fairly standard in river flood forecasting and in large-scale hydrometeorological predictions. For complex coastal morphodynamic problems or in the vicinity of complex coastal structures, such as harbours, EWS are much less used because they are still very challenging, both technically and computationally. To advance beyond the state of the art, the EU FP7 project Risc-KIT (www.risc-kit.eu) is developing prototype EWS which address these topics specifically. This paper describes the prototype EWS which IMDC has developed for the case study site of the harbour of Zeebrugge. The harbour of Zeebrugge is the largest industrial seaport on the coast of Belgium, extending more than 3 km into the sea. Two long breakwaters provide shelter for the inner quays and docks under regular conditions and frequent storms. Extreme storm surges and waves can, however, still enter the harbour and create risks for the harbour operations and infrastructure. The prediction of the effects of storm surges and waves inside harbours is typically very complex and challenging, due to the need for different types of numerical models to represent all the different physical processes. In general, waves inside harbours are a combination of locally wind-generated waves and offshore wave penetration at the port entrance. During extreme conditions, the waves can overtop the quays and breakwaters and flood the port facilities. Outside a prediction environment, the conditions inside the harbour can be assessed by superimposing processes. The assessment can be carried out using a combination of a spectral wave model (i.e. SWAN) for the wind-generated waves and a Boussinesq-type wave model (i.e. Mike 21 BW) for the wave penetration from offshore. Finally, a 2D hydrodynamic model (i.e. TELEMAC) can be used to simulate the overland flooding inside the port facilities.
    To reproduce these processes in an EWS environment, an additional challenge is to cope with the limitations of the calculation engines. This is especially true for the Boussinesq model. A model train is proposed that integrates process-based modelling for wind-generated waves with an intelligent simplification of the Boussinesq model for the wave penetration effects. These wave conditions, together with the extreme water levels (including storm surge), can then be used to simulate the overtopping/overflow behaviour at the quays. Finally, the hydrodynamic model TELEMAC is run for the inundation forecast inside the port facilities. The complete model train was integrated into the Deltares Delft FEWS software to showcase the potential for real-time operations.

  8. Implicitly defined criteria for vector optimization in technological process of hydroponic germination of wheat grain

    NASA Astrophysics Data System (ADS)

    Koneva, M. S.; Rudenko, O. V.; Usatikov, S. V.; Bugaets, N. A.; Tereshchenko, I. V.

    2018-05-01

    To reduce the duration of the process and to ensure the microbiological purity of the germinated material, an improved method of germination has been developed based on the complex use of physical factors: electrochemically activated water (ECHA-water) and an electromagnetic field of extremely low frequencies (EMF ELF), with round-the-clock artificial illumination by LED lamps. The increase in the efficiency of the numerical technology for solving computational problems of parametric optimization of the technological process of hydroponic germination of wheat grains is considered. In this setting the quality criteria are contradictory, and some of them are given by implicit functions of many variables. A solution algorithm is offered that avoids constructing a Pareto set: a relatively small number of elements of the set of alternatives is used to obtain a linear convolution of the criteria with given weights, normalized to their "ideal" values obtained from the solution of single-criterion private optimization problems. The use of the proposed mathematical models describing the processes of hydroponic germination of wheat grains made it possible to intensify the germination process and to shorten the time of obtaining wheat sprouts "Altayskaya 105" by 27 hours.
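
    The scalarisation described above (a linear convolution of criteria with given weights, each normalised to its "ideal" single-criterion optimum) can be sketched as follows. The function names and the convention that criteria are minimised are assumptions for illustration; the paper gives no pseudocode.

```python
def weighted_convolution(values, ideals, weights):
    """Linear convolution of criteria (to be minimised), each normalised
    by its 'ideal' value from a single-criterion optimisation."""
    return sum(w * v / i for v, i, w in zip(values, ideals, weights))

def best_alternative(alternatives, criteria, ideals, weights):
    """Pick, from a small set of alternatives, the one with the lowest
    convolution score -- no Pareto set is constructed."""
    return min(alternatives,
               key=lambda a: weighted_convolution([f(a) for f in criteria],
                                                  ideals, weights))
```

    Because each criterion is divided by its single-criterion optimum, every normalised term equals 1 at its own ideal, so the weights express relative importance on a common scale.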

  9. Promoting Active Learning: The Use of Computational Software Programs

    NASA Astrophysics Data System (ADS)

    Dickinson, Tom

    The increased emphasis on active learning in essentially all disciplines is proving beneficial in terms of a student's depth of learning, retention, and completion of challenging courses. Formats labeled flipped, hybrid and blended facilitate face-to-face active learning. To be effective, students need to absorb a significant fraction of the course material prior to class, e.g., using online lectures and reading assignments. Getting students to assimilate and at least partially understand this material prior to class can be extremely difficult. As an aid to achieving this preparation as well as enhancing depth of understanding, we find the use of software programs such as Mathematica® or MATLAB® very helpful. We have written several Mathematica® applications and student exercises for use in a blended-format, two-semester E&M course. Formats include tutorials, simulations, graded and non-graded quizzes, walk-through problems, exploration and interpretation exercises, and numerical solutions of complex problems. A good portion of this activity involves student-written code. We will discuss the efficacy of these applications, their role in promoting active learning, and the range of possible uses of this basic scheme in other classes.

  10. Pathophysiology of wound healing and alterations in venous leg ulcers-review.

    PubMed

    Raffetto, Joseph D

    2016-03-01

    Venous leg ulcer (VLU) is one of the most common ulcerated wounds of the lower extremity and is a significant healthcare problem with social and economic implications for the well-being of the patient. VLU can have debilitating related problems which require weekly medical care and may take months to years to heal. The pathophysiology of VLU is complex, and healing is delayed in many patients due to a persistent inflammatory condition. Patient genetic and environmental factors predispose individuals to chronic venous diseases, including VLU. Changes in shear stress affecting the glycocalyx are likely initiating events, leading to activation of adhesion molecules on endothelial cells, and leukocyte activation with attachment and migration into the vein wall, the microcirculation, and the interstitial space. Multiple chemokines, cytokines, growth factors, proteases and matrix metalloproteinases are produced. The pathology of VLU involves an imbalance of inflammation, inflammatory modulators, oxidative stress, and proteinase activity. Understanding the cellular and biochemical events that lead to the progression of VLU is critical. With further understanding of inflammatory pathways and potential mechanisms, certain biomarkers could be revealed and studied both for their involvement in the pathophysiology of VLU and as therapeutic targets for VLU healing. © The Author(s) 2016.

  11. Framing climate change and spatial planning: how risk communication can be improved.

    PubMed

    de Boer, J

    2007-01-01

    Taking the role of frames into account may significantly add to the tools that have been developed for communication and learning on complex risks and benefits. As part of a larger multidisciplinary study into climate-related forms of sense-making, this paper explores which frames are used by the citizens of Western European countries and, in particular, the Netherlands. Three recent multi-national public opinion surveys were analysed to examine beliefs about climate change in the context of beliefs about energy technology and concerns about other environmental issues, such as natural disasters. It appeared that many citizens had only vague ideas about the energy situation and that these do not constitute an unequivocal frame for climate issues. In contrast, the results suggest that the long-lasting rainfall and severe floods in Central Europe have had a significant impact. Climate change was often framed in a way that articulated its associations with rain- and river-based problems. This result is extremely important for risk communication because, especially in the Netherlands with its vulnerable coastal zones, climate change may produce many more consequences than rain- and river-based problems alone.

  12. Revisiting the Quantum Brain Hypothesis: Toward Quantum (Neuro)biology?

    PubMed Central

    Jedlicka, Peter

    2017-01-01

    The nervous system is a non-linear dynamical complex system with many feedback loops. Conventional wisdom holds that quantum fluctuations in the brain are self-averaging and thus functionally negligible. However, this intuition might be misleading in the case of non-linear complex systems. Because of an extreme sensitivity to initial conditions, in complex systems the microscopic fluctuations may be amplified and thereby affect the system’s behavior. In this way quantum dynamics might influence neuronal computations. Accumulating evidence in non-neuronal systems indicates that biological evolution is able to exploit quantum stochasticity. The recent rise of quantum biology as an emerging field at the border between quantum physics and the life sciences suggests that quantum events could play a non-trivial role also in neuronal cells. Direct experimental evidence for this is still missing but future research should address the possibility that quantum events contribute to an extremely high complexity, variability and computational power of neuronal dynamics. PMID:29163041

  13. Variable complexity online sequential extreme learning machine, with applications to streamflow prediction

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Hsieh, William W.; Cannon, Alex J.

    2017-12-01

    In situations where new data arrive continually, online learning algorithms are computationally much less costly than batch learning algorithms in keeping the model up to date. The extreme learning machine (ELM), a single-hidden-layer artificial neural network with random weights in the hidden layer, is solved by linear least squares and has an online learning version, the online sequential ELM (OSELM). As more data become available during online learning, information on longer time scales becomes available, so ideally the model complexity should be allowed to change; the number of hidden nodes (HN) in OSELM, however, remains fixed. A variable-complexity algorithm, VC-OSELM, is proposed to dynamically add or remove HN in the OSELM, allowing the model complexity to vary automatically as online learning proceeds. The performance of VC-OSELM was compared with OSELM in daily streamflow predictions at two hydrological stations in British Columbia, Canada, with VC-OSELM significantly outperforming OSELM in mean absolute error, root mean squared error and Nash-Sutcliffe efficiency at both stations.
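
    The batch ELM at the core of OSELM (a fixed random hidden layer whose output weights are solved by linear least squares) can be sketched in a few lines of NumPy. Class and parameter names, and the weight scales, are illustrative assumptions; the online OSELM variant would replace the one-shot least-squares solve with a recursive update as new data chunks arrive.

```python
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    """Single-hidden-layer network: hidden weights are random and fixed,
    only the output weights beta are trained (batch ELM)."""

    def __init__(self, n_in, n_hidden):
        self.W = rng.normal(scale=3.0, size=(n_in, n_hidden))  # random input weights
        self.b = rng.uniform(-3.0, 3.0, size=n_hidden)         # random biases
        self.beta = None                                       # trained output weights

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        # Batch ELM: solve H @ beta = y by linear least squares.
        # OSELM would instead update beta recursively chunk by chunk.
        H = self._hidden(X)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

    Because only the linear output layer is trained, fitting reduces to one least-squares solve, which is what makes ELM training so cheap compared with gradient-based networks.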

  14. Winds, Mountains, and Wildland Fire: Improved Understanding of Coupled Atmosphere-Topography-Fire Interactions Through Large-Eddy Simulation

    NASA Astrophysics Data System (ADS)

    Munoz-Esparza, D.; Sauer, J.; Linn, R.

    2015-12-01

    Anomalous and unexpected fire behavior in complex terrain continues to result in substantial loss of property and extremely dangerous conditions for firefighting field personnel. We briefly discuss proposed hypotheses of fire interactions with atmospheric flows over complex terrain that can lead to poorly understood and potentially catastrophic scenarios. We then present recent results of numerical investigations, via large-eddy simulation of coupled atmosphere-topography-fire phenomenology with the Los Alamos National Laboratory HiGrad-Firetec model, as an example of the potential for increased understanding of these complex processes. This investigation focuses on the influence of downslope surface wind enhancement through stably stratified flow over an isolated hill, and the resulting dramatic changes in fire behavior, including spread rate and intensity. Implications with respect to counter-intuitive fire behavior and extreme fire events are discussed. This work demonstrates a tremendous opportunity to immediately create safer and more effective policy for field personnel through improved predictability of atmospheric conditions over complex terrain.

  15. Revisiting the Quantum Brain Hypothesis: Toward Quantum (Neuro)biology?

    PubMed

    Jedlicka, Peter

    2017-01-01

    The nervous system is a non-linear dynamical complex system with many feedback loops. A conventional wisdom is that in the brain the quantum fluctuations are self-averaging and thus functionally negligible. However, this intuition might be misleading in the case of non-linear complex systems. Because of an extreme sensitivity to initial conditions, in complex systems the microscopic fluctuations may be amplified and thereby affect the system's behavior. In this way quantum dynamics might influence neuronal computations. Accumulating evidence in non-neuronal systems indicates that biological evolution is able to exploit quantum stochasticity. The recent rise of quantum biology as an emerging field at the border between quantum physics and the life sciences suggests that quantum events could play a non-trivial role also in neuronal cells. Direct experimental evidence for this is still missing but future research should address the possibility that quantum events contribute to an extremely high complexity, variability and computational power of neuronal dynamics.

  16. [Starving in childhood and diabetes mellitus in elderly age].

    PubMed

    Khoroshinina, L P; Zhavoronkova, N V

    2008-01-01

    The long-term consequences of protracted starvation or inadequate nutrition in childhood have attracted considerable interest in recent decades. Between June 1941 and January 1944 the civilian population of Leningrad was besieged for two and a half years. The non-combatant population of this large European city lived through lengthy periods of starvation or malnutrition against a background of additional complex stress factors (including cold, bombing, death of relatives and acquaintances, and lack of means of transport and communication). It may be assumed that the adult health of those who were children and young people in Leningrad during the siege differed from that of people of the same age who were spared those extreme conditions. We studied the impact of childhood starvation on the prevalence of diabetes mellitus in old age, its time of onset, and the clinical features of the disease course. The results confirm that non-insulin-dependent diabetes without obesity develops more often, and earlier, in women who lived through the Siege of Leningrad in childhood. The health of elderly people who endured continuous starvation in childhood remains a pressing problem, because the health of young people in Russia who lived through the 1990s, when one in three children aged two years starved, suggests that medical and social problems will accompany forthcoming changes in the illness patterns of the population of modern Russia.

  17. Microfluidic colloid filtration

    PubMed Central

    Linkhorst, John; Beckmann, Torsten; Go, Dennis; Kuehne, Alexander J. C.; Wessling, Matthias

    2016-01-01

    Filtration of natural and colloidal matter is an essential step in today’s water treatment processes. The colloidal matter is retained with the help of micro- and nanoporous synthetic membranes. Colloids are retained in a “cake layer”, often termed a fouling layer. Membrane fouling is the most substantial problem in membrane filtration: the build-up of colloidal and natural matter leads to increasing resistance and thus a decreasing water transport rate through the membrane. Theoretical models exist to describe macroscopically the hydrodynamic resistance of such transport and rejection phenomena; however, visualization of the various phenomena occurring during colloid retention is extremely demanding. Here we present a microfluidics-based methodology to follow filter-cake build-up as well as transport phenomena occurring inside the fouling layer. The microfluidic colloidal filtration methodology enables the study of complex colloidal jamming, crystallization and melting processes, as well as translocation, at the single-particle level. PMID:26927706

  18. Detecting text in natural scenes with multi-level MSER and SWT

    NASA Astrophysics Data System (ADS)

    Lu, Tongwei; Liu, Renjun

    2018-04-01

    The detection of characters in natural scenes is susceptible to factors such as complex backgrounds, variable viewing angles and diverse languages, which lead to poor detection results. Aiming at these problems, a new text detection method is proposed, consisting of two main stages: candidate region extraction and text region detection. In the first stage, the method uses multiple scale transformations of the original image and multiple thresholds of maximally stable extremal regions (MSER) so that character regions are detected comprehensively. In the second stage, SWT maps are obtained by applying the stroke width transform (SWT) algorithm to the candidate regions, and cascaded classifiers are then used to reject non-text regions. The proposed method was evaluated on the standard ICDAR2011 benchmark dataset and on our own dataset. The experimental results show that the proposed method improves greatly on other text detection methods.
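
    The multi-threshold stability idea behind MSER can be conveyed with a toy detector. This is only a sketch of the stability criterion, not the authors' pipeline (which adds scale transformations, SWT maps and cascaded classifiers); `stable_regions` and its parameters are hypothetical, and SciPy is assumed:

```python
import numpy as np
from scipy import ndimage

def stable_regions(gray, thresholds, max_area_change=0.2):
    """Toy MSER-style detector: keep a dark region if its connected-component
    area stays nearly constant across adjacent intensity thresholds."""
    labelled = [ndimage.label(gray < t)[0] for t in thresholds]
    stable = []
    for i in range(1, len(thresholds) - 1):
        lab = labelled[i]
        for r in range(1, lab.max() + 1):
            mask = lab == r
            area = mask.sum()
            seed = tuple(np.argwhere(mask)[0])    # track the region via a pixel
            below = labelled[i - 1][seed]
            if below == 0:
                continue                          # region absent one level down
            lo = (labelled[i - 1] == below).sum()
            hi = (labelled[i + 1] == labelled[i + 1][seed]).sum()
            if abs(hi - lo) / area < max_area_change:
                stable.append(mask)
    return stable
```

    High-contrast characters are exactly the regions whose area barely changes as the threshold sweeps, which is why MSER is a natural candidate generator for scene text.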

  19. Native phasing of x-ray free-electron laser data for a G protein-coupled receptor.

    PubMed

    Batyuk, Alexander; Galli, Lorenzo; Ishchenko, Andrii; Han, Gye Won; Gati, Cornelius; Popov, Petr A; Lee, Ming-Yue; Stauch, Benjamin; White, Thomas A; Barty, Anton; Aquila, Andrew; Hunter, Mark S; Liang, Mengning; Boutet, Sébastien; Pu, Mengchen; Liu, Zhi-Jie; Nelson, Garrett; James, Daniel; Li, Chufeng; Zhao, Yun; Spence, John C H; Liu, Wei; Fromme, Petra; Katritch, Vsevolod; Weierstall, Uwe; Stevens, Raymond C; Cherezov, Vadim

    2016-09-01

    Serial femtosecond crystallography (SFX) takes advantage of extremely bright and ultrashort pulses produced by x-ray free-electron lasers (XFELs), allowing for the collection of high-resolution diffraction intensities from micrometer-sized crystals at room temperature with minimal radiation damage, using the principle of "diffraction-before-destruction." However, de novo structure factor phase determination using XFELs has been difficult so far. We demonstrate the ability to solve the crystallographic phase problem for SFX data collected with an XFEL using the anomalous signal from native sulfur atoms, leading to a bias-free room temperature structure of the human A2A adenosine receptor at 1.9 Å resolution. The advancement was made possible by recent improvements in SFX data analysis and the design of injectors and delivery media for streaming hydrated microcrystals. This general method should accelerate structural studies of novel difficult-to-crystallize macromolecules and their complexes.

  20. A technique for computation of noise temperature due to a beam waveguide shroud

    NASA Technical Reports Server (NTRS)

    Veruttipong, W.; Franco, M. M.

    1993-01-01

    Direct analytical computation of the noise temperature of real beam waveguide (BWG) systems, including all mirrors and the surrounding shroud, is an extremely complex problem and virtually impossible to achieve. Yet the DSN antennas are required to be ultra low-noise in order to be effective, and a reasonably accurate prediction is essential. This article presents a relatively simple technique to compute a real BWG system noise temperature by combining analytical techniques with data from experimental tests. Specific expressions and parameters for X-band (8.45-GHz) BWG noise computation are obtained for DSS 13 and DSS 24, now under construction. These expressions are also valid for various conditions of the BWG feed systems, including horn sizes and positions, and mirror sizes, curvatures, and positions. Parameters for S- and Ka-bands (2.3 and 32.0 GHz) have not been determined; however, those can be obtained following the same procedure as for X-band.

  1. The Fluid Dynamics of Competitive Swimming

    NASA Astrophysics Data System (ADS)

    Wei, Timothy; Mark, Russell; Hutchison, Sean

    2014-01-01

    Nowhere in sport is performance so dependent on the interaction of the athlete with the surrounding medium as in competitive swimming. As a result, understanding (at least implicitly) and controlling (explicitly) the fluid dynamics of swimming are essential to earning a spot on the medal stand. This is an extremely complex, highly multidisciplinary problem with a broad spectrum of research approaches. This review attempts to provide a historical framework for the fluid dynamics-related aspects of human swimming research, principally conducted over roughly the past five decades, with an emphasis on the past 25 years. The literature is organized below to show a continuous integration of computational and experimental technologies into the sport. Illustrations from the authors' collaborations over a 10-year period, coupling the knowledge and experience of an elite-level coach, a lead biomechanician at USA Swimming, and an experimental fluid dynamicist, are intended to bring relevance and immediacy to the review.

  2. Current Status of Obstetric Anaesthesia: Improving Satisfaction and Safety

    PubMed Central

    Ranasinghe, J Sudharma; Birnbach, David

    2009-01-01

    Summary The Centers for Disease Control and Prevention (CDC) reported in 2003 that although the maternal mortality rate has decreased by 99% since 1900, there has been no further decrease in the last two decades [1]. A more recent report indicates a rate of 11.8 per 100,000 live births [2], although anaesthesia-related maternal mortality and morbidity have considerably decreased over the last few decades. Despite the growing complexity of problems and increasing challenges such as pre-existing maternal disease, obesity, and the increasing age of pregnant mothers, anaesthesia-related maternal mortality is extremely rare in the developed world. The current safety has been achieved through changes in training, service, technical advances and a multidisciplinary approach to care. The rates of general anaesthesia for cesarean delivery have decreased and neuraxial anaesthetics have become the most commonly used techniques. Neuraxial techniques are largely safe and effective, but potential complications, though rare, can be severe. PMID:20640111

  3. Information Gain Based Dimensionality Selection for Classifying Text Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumidu Wijayasekara; Milos Manic; Miles McQueen

    2013-06-01

    Selecting the optimal dimensions for various knowledge extraction applications is an essential component of data mining. Dimensionality selection techniques are utilized in classification applications to increase the classification accuracy and reduce the computational complexity. In text classification, where the dimensionality of the dataset is extremely high, dimensionality selection is even more important. This paper presents a novel genetic-algorithm-based methodology for dimensionality selection in text mining applications that utilizes information gain. The presented methodology uses the information gain of each dimension to change the mutation probability of chromosomes dynamically. Since the information gain is calculated a priori, the computational complexity is not affected. The presented method was tested on a specific text classification problem and compared with conventional genetic-algorithm-based dimensionality selection. The results show an improvement of 3% in the true positives and 1.6% in the true negatives over conventional dimensionality selection methods.
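
    The core idea, per-dimension information gain steering per-gene mutation rates, can be sketched as follows. The paper's exact scaling is not reproduced, so the mapping from gain to mutation probability below is an illustrative assumption:

```python
import numpy as np

def information_gain(X, y):
    """Information gain of each binary feature with respect to binary labels."""
    def entropy(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()
    H_y = entropy(np.bincount(y) / len(y))
    ig = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        gain = H_y
        for v in (0, 1):
            idx = X[:, j] == v
            if idx.any():
                gain -= idx.mean() * entropy(np.bincount(y[idx], minlength=2) / idx.sum())
        ig[j] = gain
    return ig

def biased_mutation(chromosome, ig, base_rate=0.05, rng=None):
    """Flip genes with probability scaled inversely to information gain, nudging
    the GA toward dropping uninformative dimensions (assumed scaling)."""
    rng = rng if rng is not None else np.random.default_rng()
    w = 1.0 - ig / (ig.max() + 1e-12)   # high gain -> low mutation probability
    p = base_rate * (0.5 + w)
    flips = rng.random(len(chromosome)) < p
    return np.where(flips, 1 - chromosome, chromosome)
```

    Because the gains are computed once before the GA runs, the per-generation cost is unchanged, which matches the abstract's a priori claim.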

  4. Learning classification with auxiliary probabilistic information

    PubMed Central

    Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos

    2012-01-01

    Finding ways of incorporating auxiliary information or auxiliary data into the learning process has been a topic of active data mining and machine learning research in recent years. In this work we study and develop a new framework for the classification learning problem in which, in addition to class labels, the learner is provided with auxiliary (probabilistic) information that reflects how strongly the expert feels about the class label. This approach can be extremely useful for many practical classification tasks that rely on subjective label assessment and where the cost of acquiring additional auxiliary information is negligible when compared to the cost of the example analysis and labelling. We develop classification algorithms capable of using the auxiliary information to make the learning process more efficient in terms of sample complexity. We demonstrate the benefit of the approach on a number of synthetic and real-world data sets by comparing it to learning with class labels only. PMID:25309141
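
    The paper's algorithms are not reproduced here, but the simplest way to see how probabilistic side information can enter training is to replace hard 0/1 labels with the expert's confidence as a soft target in logistic regression (a generic construction with illustrative names; NumPy assumed):

```python
import numpy as np

def fit_soft_logistic(X, p, lr=0.5, steps=2000):
    """Logistic regression trained on soft targets p in [0, 1] (the expert's
    confidence that the label is 1) via gradient descent on cross-entropy."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        q = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # model probability
        g = q - p                                # d(cross-entropy)/d(logit)
        w -= lr * X.T @ g / len(p)
        b -= lr * g.mean()
    return w, b
```

    With hard labels this reduces to ordinary logistic regression; confident examples (p near 0 or 1) pull the decision boundary harder than uncertain ones.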

  5. MAFIA Version 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiland, T.; Bartsch, M.; Becker, U.

    1997-02-01

    MAFIA Version 4.0 is an almost completely new version of the general-purpose electromagnetic simulator known for 13 years. The major improvements concern the new graphical user interface, based on state-of-the-art technology, as well as a series of new solvers for new physics problems. MAFIA now covers heat distribution, electro-quasistatics, S-parameters in the frequency domain, particle beam tracking in linear accelerators, acoustics and even elastodynamics. The solvers available in earlier versions have also been improved and/or extended, for example the complex eigenmode solver and the 2D-3D coupled PIC solvers. Time domain solvers have new waveguide boundary conditions with extremely low reflection even near the cutoff frequency, concentrated elements are available, as well as a variety of signal processing options. Probably the most valuable addition is the recursive sub-grid capability that enables modeling of very small details in large structures. © 1997 American Institute of Physics.

  6. MAFIA Version 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiland, T.; Bartsch, M.; Becker, U.

    1997-02-01

    MAFIA Version 4.0 is an almost completely new version of the general-purpose electromagnetic simulator known for 13 years. The major improvements concern the new graphical user interface, based on state-of-the-art technology, as well as a series of new solvers for new physics problems. MAFIA now covers heat distribution, electro-quasistatics, S-parameters in the frequency domain, particle beam tracking in linear accelerators, acoustics and even elastodynamics. The solvers available in earlier versions have also been improved and/or extended, for example the complex eigenmode solver and the 2D-3D coupled PIC solvers. Time domain solvers have new waveguide boundary conditions with extremely low reflection even near the cutoff frequency, concentrated elements are available, as well as a variety of signal processing options. Probably the most valuable addition is the recursive sub-grid capability that enables modeling of very small details in large structures.

  7. Optimizing the triple-axis spectrometer PANDA at the MLZ for small samples and complex sample environment conditions

    NASA Astrophysics Data System (ADS)

    Utschick, C.; Skoulatos, M.; Schneidewind, A.; Böni, P.

    2016-11-01

    The cold-neutron triple-axis spectrometer PANDA at the neutron source FRM II has been serving an international user community studying condensed matter physics problems. We report on a new setup that improves the signal-to-noise ratio for small samples and pressure cell setups. Analytical and numerical Monte Carlo methods are used for the optimization of elliptic and parabolic focusing guides. They are placed between the monochromator and sample positions, and the flux at the sample is compared to that achieved by standard monochromator focusing techniques. A 25 times smaller spot size is achieved, associated with a factor of 2 increase in intensity, within the same divergence limits, ±2°. This optional neutron focusing guide should establish a top-class spectrometer for studying novel exotic properties of matter in combination with more stringent sample environment conditions, such as extreme pressures associated with small sample sizes.

  8. A symbolic/subsymbolic interface protocol for cognitive modeling

    PubMed Central

    Simen, Patrick; Polk, Thad

    2009-01-01

    Researchers studying complex cognition have grown increasingly interested in mapping symbolic cognitive architectures onto subsymbolic brain models. Such a mapping seems essential for understanding cognition under all but the most extreme viewpoints (namely, that cognition consists exclusively of digitally implemented rules; or instead, involves no rules whatsoever). Making this mapping reduces to specifying an interface between symbolic and subsymbolic descriptions of brain activity. To that end, we propose parameterization techniques for building cognitive models as programmable, structured, recurrent neural networks. Feedback strength in these models determines whether their components implement classically subsymbolic neural network functions (e.g., pattern recognition), or instead, logical rules and digital memory. These techniques support the implementation of limited production systems. Though inherently sequential and symbolic, these neural production systems can exploit principles of parallel, analog processing from decision-making models in psychology and neuroscience to explain the effects of brain damage on problem solving behavior. PMID:20711520

  9. The BEFWM system for detection and phase conjugation of a weak laser beam

    NASA Astrophysics Data System (ADS)

    Khizhnyak, Anatoliy; Markov, Vladimir

    2007-09-01

    Real environmental conditions, such as atmospheric turbulence and aero-optics effects, make practical implementation of the target-in-the-loop (TIL) algorithm a very difficult task, especially when the system is set to operate with a signal from a diffuse-surface, image-resolved object. The problem becomes even more complex since, for a remote object, the intensity of the returned signal is extremely low. This presentation discusses the results of an analysis and experimental verification of a thresholdless coherent signal receiving system, capable not only of high-sensitivity detection of ultra-weak object-scattered light, but also of its high-gain amplification and phase conjugation. The process of coherent detection using Brillouin Enhanced Four Wave Mixing (BEFWM) enables retrieval of complete information on the received signal, including accurate measurement of its wavefront. This information can be used for direct real-time control of the adaptive mirror.

  10. LSENS, a general chemical kinetics and sensitivity analysis code for gas-phase reactions: User's guide

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; Bittker, David A.

    1993-01-01

    A general chemical kinetics and sensitivity analysis code for complex, homogeneous, gas-phase reactions is described. The main features of the code, LSENS, are its flexibility, efficiency and convenience in treating many different chemical reaction models. The models include static system; steady, one-dimensional, inviscid flow; shock-initiated reaction; and a perfectly stirred reactor. In addition, equilibrium computations can be performed for several assigned states. An implicit numerical integration method, which works efficiently for the extremes of very fast and very slow reaction, is used for solving the 'stiff' differential equation systems that arise in chemical kinetics. For static reactions, sensitivity coefficients of all dependent variables and their temporal derivatives with respect to the initial values of dependent variables and/or the rate coefficient parameters can be computed. This paper presents descriptions of the code and its usage, and includes several illustrative example problems.
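
    LSENS's own integrator is not reproduced here, but the reason implicit methods cope with stiffness can be shown with a minimal backward-Euler integrator (a generic sketch with a Newton solve per step; all names are illustrative):

```python
import numpy as np

def backward_euler(f, jac, y0, t):
    """Implicit (backward) Euler: solve y_{n+1} = y_n + h f(y_{n+1}) by Newton
    iteration at each step -- stable at step sizes where explicit Euler diverges."""
    y = np.array(y0, dtype=float)
    out = [y.copy()]
    for t0, t1 in zip(t[:-1], t[1:]):
        h = t1 - t0
        z = y.copy()                              # Newton iterate for y_{n+1}
        for _ in range(20):
            F = z - y - h * f(z)
            J = np.eye(len(y)) - h * jac(z)
            dz = np.linalg.solve(J, -F)
            z += dz
            if np.linalg.norm(dz) < 1e-12:
                break
        y = z
        out.append(y.copy())
    return np.array(out)
```

    For y' = -50y with step h = 0.1, explicit Euler diverges (|1 - 50h| = 4 > 1), while backward Euler decays monotonically toward the true solution.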

  11. Management of optics. [for HEAO-2 X ray telescope

    NASA Technical Reports Server (NTRS)

    Kirchner, T. E.; Russell, M.

    1981-01-01

    American Science and Engineering, Inc. (AS&E), designed the large X-ray optic for the HEAO-2 X-ray Telescope. The key element in this project was the High Resolution Mirror Assembly (HRMA), for which the fabrication of the optical surfaces and their assembly and alignment were subcontracted. The roles and organization of the key participants in the creation of the HRMA are defined, and the degree of interaction between the groups is described. Management of this effort was extremely complex because of the intricate weaving of responsibilities, and AS&E, as HEAO-2 program managers, needed to be well versed in the scientific objectives, the technical requirements, the program requirements, and subcontract management. Understanding these factors was essential for implementing both technical and management controls, such as schedule and budget constraints, in-process control, residence requirements, and scientist review and feedback. Despite unforeseen technical problems and interaction differences, the HEAO-2 was built on schedule and to specification.

  12. Review of FD-TD numerical modeling of electromagnetic wave scattering and radar cross section

    NASA Technical Reports Server (NTRS)

    Taflove, Allen; Umashankar, Korada R.

    1989-01-01

    Applications of the finite-difference time-domain (FD-TD) method for numerical modeling of electromagnetic wave interactions with structures are reviewed, concentrating on scattering and radar cross section (RCS). A number of two- and three-dimensional examples of FD-TD modeling of scattering and penetration are provided. The objects modeled range in nature from simple geometric shapes to extremely complex aerospace and biological systems. Rigorous analytical or experimental validations are provided for the canonical shapes, and it is shown that FD-TD predictive data for near fields and RCS are in excellent agreement with the benchmark data. It is concluded that with continuing advances in FD-TD modeling theory for target features relevant to the RCS problems and in vector and concurrent supercomputer technology, it is likely that FD-TD numerical modeling will occupy an important place in RCS technology in the 1990s and beyond.
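
    The flavor of an FD-TD update is easy to convey in one dimension (a toy sketch in normalized units, far simpler than the 3-D scattering codes reviewed above; the function name and source profile are illustrative):

```python
import numpy as np

def fdtd_1d(nx=200, nt=300):
    """Minimal 1-D FD-TD (Yee) scheme, normalized so dx = dt = c = 1:
    E and H live on staggered grids and are updated in leapfrog fashion."""
    E = np.zeros(nx)
    H = np.zeros(nx - 1)
    for n in range(nt):
        H += np.diff(E)                                 # H update from curl of E
        E[1:-1] += np.diff(H)                           # E update from curl of H
        E[nx // 2] += np.exp(-((n - 30) ** 2) / 100.0)  # soft Gaussian source
    return E
```

    The staggered-grid leapfrog shown here is the same structural idea the reviewed 3-D codes apply to full vector fields around complex targets.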

  13. Mineralogy, petrology and chemistry of ANT-suite rocks from the lunar highlands

    NASA Technical Reports Server (NTRS)

    Prinz, M.; Keil, K.

    1977-01-01

    Anorthositic-noritic-troctolitic (ANT) rocks are the oldest and most abundant rocks of the lunar surface, comprising about 90% of the lunar highlands suite. Consideration is given to the mineralogy, petrology, bulk chemistry, and origin of ANT-suite rocks. Problems in classifying and labeling lunar highland rocks, caused by textural complexities arising from impact modification, are discussed. The mineralogy of ANT-suite rocks, dominated by plagioclase, olivine and pyroxene, and containing various minor minerals, is outlined. The petrology of ANT-suite rocks is reviewed along with their major-element bulk composition, noting that they are extremely depleted in K2O and P2O5. Various models describing the origin of ANT-suite rocks are summarized, and it is suggested that this origin involves a parental liquid of high-alumina basalt with low Fe/(Fe+Mg).

  14. Radiation transfer in plant canopies - Transmission of direct solar radiation and the role of leaf orientation

    NASA Technical Reports Server (NTRS)

    Verstraete, Michel M.

    1987-01-01

    Understanding the details of the interaction between the radiation field and plant structures is important climatically because of the influence of vegetation on the surface water and energy balance, but also biologically, since solar radiation provides the energy necessary for photosynthesis. The problem is complex because of the extreme variety of vegetation forms in space and time, as well as within and across plant species. This one-dimensional vertical multilayer model describes the transfer of direct solar radiation through a leaf canopy, accounting explicitly for the vertical inhomogeneities of a plant stand and leaf orientation, as well as heliotropic plant behavior. This model reproduces observational results on homogeneous canopies, but it is also well adapted to describe vertically inhomogeneous canopies. Some of the implications of leaf orientation and plant structure as far as light collection is concerned are briefly reviewed.
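
    The single formula at the heart of most direct-beam canopy transmission models (though not the vertical-inhomogeneity or heliotropism treatment described above) is Beer-Lambert attenuation with a leaf projection function; as a hedged illustration:

```python
import numpy as np

def direct_transmission(lai, theta_sun, G=0.5):
    """Fraction of direct-beam solar radiation transmitted through a canopy of
    leaf area index `lai` at solar zenith angle `theta_sun` (radians), assuming
    randomly positioned leaves; G is the leaf projection function (G = 0.5 for
    a spherical leaf-angle distribution)."""
    return np.exp(-G * lai / np.cos(theta_sun))
```

    Leaf orientation enters through G: heliotropic leaves that track (or avoid) the sun change G, and hence how much light penetrates to lower layers.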

  15. Space moving target detection and tracking method in complex background

    NASA Astrophysics Data System (ADS)

    Lv, Ping-Yue; Sun, Sheng-Li; Lin, Chang-Qing; Liu, Gao-Rui

    2018-06-01

    The background seen by space-borne detectors in a real space-based environment is extremely complex and the signal-to-clutter ratio is very low (SCR ≈ 1), which increases the difficulty of detecting space moving targets. To solve this problem, an algorithm combining background suppression based on a two-dimensional least mean square (TDLMS) filter with target enhancement based on neighborhood gray-scale difference (GSD) is proposed in this paper. The latter can filter out most of the residual background clutter left by the former, such as cloud edges. Through this procedure, both global and local SCR improve substantially, indicating that the target has been greatly enhanced. After the detector's inherent clutter region is removed through connected-domain processing, the image contains only the target point and isolated noise, and the isolated noise can be filtered out effectively through multi-frame association. The proposed algorithm has been compared with some state-of-the-art algorithms on moving target detection and tracking tasks. The experimental results show that its performance is the best in terms of SCR gain, background suppression factor (BSF) and detection results.
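
    The TDLMS background-suppression step can be sketched as an adaptive neighbourhood predictor whose residual preserves point targets. This is a simplified illustration, not the authors' implementation; the window size, step size and raster update order are assumptions:

```python
import numpy as np

def tdlms_residual(img, k=2, mu=1e-3):
    """Illustrative 2-D LMS background predictor: each pixel is predicted from
    its (2k+1)x(2k+1) neighbourhood (centre excluded) with weights adapted by
    the LMS rule; the residual suppresses smooth background and keeps targets."""
    rows, cols = img.shape
    mask = np.ones((2 * k + 1, 2 * k + 1), bool)
    mask[k, k] = False                        # do not use the pixel itself
    w = np.ones(mask.sum()) / mask.sum()      # start from a mean filter
    out = np.zeros_like(img, dtype=float)
    for i in range(k, rows - k):
        for j in range(k, cols - k):
            patch = img[i - k:i + k + 1, j - k:j + k + 1][mask]
            e = img[i, j] - w @ patch         # prediction error = residual
            out[i, j] = e
            w += mu * e * patch               # LMS weight update
    return out
```

    Slowly varying clutter is well predicted from its neighbours, so its residual is near zero, whereas an unresolved point target is unpredictable and survives as a large residual spike.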

  16. Geometric and topological characterization of porous media: insights from eigenvector centrality

    NASA Astrophysics Data System (ADS)

    Jimenez-Martinez, J.; Negre, C.

    2017-12-01

    Solving flow and transport through complex geometries such as porous media involves extreme computational cost. Simplifications such as pore networks, where the pores are represented by nodes and the pore throats by edges connecting pores, have been proposed. These models preserve the connectivity of the medium. However, they have difficulty capturing preferential paths (high velocity) and stagnation zones (low velocity), as they do not consider the specific relations between nodes. Network theory approaches, in which the complex network is conceptualized as a graph, can help to simplify and better understand fluid dynamics and transport in porous media. To address this issue, we propose a method based on eigenvector centrality. It has been corrected to overcome the centralization problem and modified to introduce a bias in the centrality distribution along a particular direction, which allows the flow and transport anisotropy of porous media to be considered. The model predictions are compared with millifluidic transport experiments, showing that this technique is computationally efficient and has potential for predicting preferential paths and stagnation zones for flow and transport in porous media. The entropy computed from the eigenvector centrality probability distribution is proposed as an indicator of the "mixing capacity" of the system.
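
    Plain eigenvector centrality (before the centralization correction and directional bias the abstract introduces) is just the leading eigenvector of the pore-network adjacency matrix, obtainable by power iteration; a minimal sketch:

```python
import numpy as np

def eigenvector_centrality(A, tol=1e-12, max_iter=1000):
    """Eigenvector centrality by power iteration on A + I (the shift keeps the
    iteration from oscillating on bipartite graphs): a node is central in
    proportion to the centrality of the nodes it connects to."""
    x = np.ones(A.shape[0])
    for _ in range(max_iter):
        x_new = A @ x + x
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x / x.sum()          # normalize to a probability distribution
```

    Normalizing the centralities to a probability distribution is also what makes the entropy-based "mixing capacity" indicator mentioned above well defined.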

  17. Resilience Design Patterns - A Structured Approach to Resilience at Extreme Scale (version 1.0)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hukerikar, Saurabh; Engelmann, Christian

    Reliability is a serious concern for future extreme-scale high-performance computing (HPC) systems. Projections based on the current generation of HPC systems and technology roadmaps suggest very high fault rates in future systems. The errors resulting from these faults will propagate and generate various kinds of failures, which may result in outcomes ranging from result corruptions to catastrophic application crashes. Practical limits on power consumption in HPC systems will require future systems to embrace innovative architectures, increasing the levels of hardware and software complexity. The resilience challenge for extreme-scale HPC systems requires management of various hardware and software technologies that are capable of handling a broad set of fault models at accelerated fault rates. These techniques must seek to improve resilience at reasonable overheads to power consumption and performance. While the HPC community has developed various solutions, application-level as well as system-based, the solution space of HPC resilience techniques remains fragmented. There are no formal methods and metrics to investigate and evaluate resilience holistically in HPC systems that consider impact scope, handling coverage, and performance and power efficiency across the system stack. Additionally, few of the current approaches are portable to newer architectures and software ecosystems, which are expected to be deployed on future systems. In this document, we develop a structured approach to the management of HPC resilience based on the concept of resilience design patterns. A design pattern is a general repeatable solution to a commonly occurring problem. We identify the commonly occurring problems and solutions used to deal with faults, errors and failures in HPC systems. The catalog of resilience design patterns provides designers with reusable design elements.
    We define a design framework that enhances our understanding of the important constraints and opportunities for solutions deployed at various layers of the system stack. The framework may be used to establish mechanisms and interfaces to coordinate flexible fault management across hardware and software components. The framework also enables optimization of the cost-benefit trade-offs among performance, resilience, and power consumption. The overall goal of this work is to enable a systematic methodology for the design and evaluation of resilience technologies in extreme-scale HPC systems that keep scientific applications running to a correct solution in a timely and cost-efficient manner in spite of frequent faults, errors, and failures of various types.

  18. Students' conceptual performance on synthesis physics problems with varying mathematical complexity

    NASA Astrophysics Data System (ADS)

    Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan

    2017-06-01

    A body of research on physics problem solving has focused on single-concept problems. In this study we use "synthesis problems" that involve multiple concepts typically taught in different chapters. We use two types of synthesis problems: sequential and simultaneous synthesis tasks. Sequential problems require a consecutive application of fundamental principles, and simultaneous problems require a concurrent application of pertinent concepts. We explore students' conceptual performance when they solve quantitative synthesis problems with varying mathematical complexity. Conceptual performance refers to the identification, follow-up, and correct application of the pertinent concepts. Mathematical complexity is determined by the type and number of equations to be manipulated concurrently, due to the number of unknowns in each equation. Data were collected from written tasks and individual interviews administered to physics majors (N = 179) enrolled in a second-year mechanics course. The results indicate that mathematical complexity does not impact students' conceptual performance on the sequential tasks. In contrast, for the simultaneous problems, mathematical complexity negatively influences the students' conceptual performance. This difference may be explained by the students' familiarity with and confidence in particular concepts, coupled with the cognitive load associated with manipulating complex quantitative equations. Another explanation pertains to the type of synthesis problem, either sequential or simultaneous. The students split the situation presented in the sequential synthesis tasks into segments but treated the situation in the simultaneous synthesis tasks as a single event.

  19. A machine learning approach for ranking clusters of docked protein‐protein complexes by pairwise cluster comparison

    PubMed Central

    Pfeiffenberger, Erik; Chaleil, Raphael A.G.; Moal, Iain H.

    2017-01-01

    Reliable identification of near-native poses of docked protein–protein complexes is still an unsolved problem. The intrinsic heterogeneity of protein–protein interactions is challenging for traditional biophysical or knowledge-based potentials, and the identification of many false positive binding sites is not unusual. Often, ranking protocols are based on initial clustering of docked poses followed by the application of an energy function to rank each cluster according to its lowest-energy member. Here, we present an approach to cluster ranking based not on a single molecular descriptor (e.g., an energy function) but on a large number of descriptors integrated in a machine learning model: an extremely randomized tree classifier trained on 109 molecular descriptors. The protocol first locally enriches clusters with additional poses; the clusters are then characterized using features describing the distribution of molecular descriptors within the cluster, which are combined into a pairwise cluster comparison model to discriminate near-native from incorrect clusters. The results show that our approach is able to identify clusters containing near-native protein–protein complexes. In addition, we present an analysis of the descriptors with respect to their power to discriminate near-native from incorrect clusters, and of how data transformations and recursive feature elimination can improve the ranking performance. Proteins 2017; 85:528–543. © 2016 Wiley Periodicals, Inc. PMID:27935158
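    A rough sketch of the pairwise cluster comparison idea described above, heavily simplified: three summary features stand in for the 109 molecular descriptors, and a hand-written comparator stands in for the trained extremely randomized tree classifier.

```python
from statistics import mean, stdev

def cluster_features(descriptor_values):
    """Summarize the distribution of one molecular descriptor within a
    cluster (an assumed, simplified feature set; the paper uses many
    descriptors and richer distribution statistics)."""
    return (mean(descriptor_values), stdev(descriptor_values),
            min(descriptor_values))

def pairwise_features(cluster_a, cluster_b):
    """Difference features for one cluster pair: the kind of input a
    pairwise comparison classifier (e.g. an extremely randomized tree
    model) would be trained on."""
    fa, fb = cluster_features(cluster_a), cluster_features(cluster_b)
    return [x - y for x, y in zip(fa, fb)]

def rank_clusters(clusters, compare):
    """Rank clusters by round-robin pairwise wins, where compare(a, b)
    returns True if cluster a is judged more near-native than b."""
    wins = [0] * len(clusters)
    for i, a in enumerate(clusters):
        for j, b in enumerate(clusters):
            if i != j and compare(a, b):
                wins[i] += 1
    return sorted(range(len(clusters)), key=lambda i: -wins[i])

# Toy stand-in for the learned model: lower mean energy wins the pair.
clusters = [[-3.0, -2.5, -2.8], [-9.1, -8.7, -9.4], [-5.0, -4.2, -4.8]]
order = rank_clusters(clusters, lambda a, b: pairwise_features(a, b)[0] < 0)
```

    The round-robin comparison turns pairwise decisions into a global ranking, which is the role the trained classifier plays in the published protocol.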

  20. Modeling Composite Laminate Crushing for Crash Analysis

    NASA Technical Reports Server (NTRS)

    Fleming, David C.; Jones, Lisa (Technical Monitor)

    2002-01-01

    Crash modeling of composite structures remains limited in application and has not been effectively demonstrated as a predictive tool. While the global response of composite structures may be well modeled, when composite structures act as energy-absorbing members through direct laminate crushing, the modeling accuracy is greatly reduced. The most efficient composite energy-absorbing structures, in terms of energy absorbed per unit mass, are those that absorb energy through a complex progressive crushing response in which fiber and matrix fractures on a small scale dominate the behavior. Such failure modes simultaneously include delamination of plies, failure of the matrix to produce fiber bundles, and subsequent failure of fiber bundles either in bending or in shear. In addition, the response may include the significant action of friction, both internal (between delaminated plies or fiber bundles) and external (between the laminate and the crushing surface). A figure shows the crushing damage observed in a fiberglass composite tube specimen, illustrating the complexity of the response. Achieving a finite element model of such complex behavior is an extremely challenging problem, and a practical crushing model based on detailed modeling of the physical mechanisms of crushing is not expected in the foreseeable future. The present research describes attempts to model composite crushing behavior using a novel hybrid modeling procedure. Experimental testing is done in support of the modeling efforts, and a test specimen is developed to provide data for validating laminate crushing models.

  1. 'I don't know how we coped before': a study of respite care for children in the home and hospice.

    PubMed

    Eaton, Nicola

    2008-12-01

    To describe the experiences of families whose children have life-limiting and life-threatening conditions and complex healthcare needs, of receiving respite care at home or in a hospice. Respite provision is an extremely important service in helping families cope with the extra stresses and problems of caring for children with complex healthcare needs, and different issues arise when the venue is the home or a hospice. Semi-structured interviews were carried out with families of children with complex healthcare needs receiving respite care at home or in a hospice. A convenience sample of 11 families was interviewed using an interview schedule exploring their experiences of, and views on, the service. The areas of concern identified as significant to all the families were referral to the respite service, service organisation, communication, relinquishing control to respite carers, and satisfaction with the service. Within the provision of respite care, there need to be more overt referral systems and criteria, negotiation of appropriate roles, continuity of care, regular assessment of need, and acknowledgement of the difficulty parents have in relinquishing control to respite carers. High-quality respite care for families involves more than just organising a respite session. Healthcare professionals organising and providing care could manage a service more effectively by taking the above issues into consideration.

  2. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizzo, Davinia B.; Blackburn, Mark R.

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence that the process may be an effective tool in harnessing expert knowledge for a BN model.
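    The kind of expert-elicited structure the abstract describes can be illustrated with a toy discrete Bayesian network fragment. The variables, priors, and conditional probability table below are invented for illustration and are not the paper's actual model.

```python
def joint_prob(complexity, data_quality, suitable,
               p_complexity, p_data_quality, cpt_suitable):
    """Joint probability in a hypothetical two-parent BN fragment:
    suitability of 6DOF testing depends on system complexity and
    test-data quality, each an expert-elicited prior."""
    return (p_complexity[complexity] * p_data_quality[data_quality]
            * cpt_suitable[(complexity, data_quality)][suitable])

# Expert-elicited (made-up) priors and conditional probability table.
p_complexity = {"high": 0.3, "low": 0.7}
p_data_quality = {"good": 0.6, "poor": 0.4}
cpt_suitable = {
    ("high", "good"): {"yes": 0.55, "no": 0.45},
    ("high", "poor"): {"yes": 0.20, "no": 0.80},
    ("low", "good"):  {"yes": 0.90, "no": 0.10},
    ("low", "poor"):  {"yes": 0.60, "no": 0.40},
}

# Marginal P(suitable = yes), summing the parents out of the joint.
p_yes = sum(joint_prob(c, d, "yes", p_complexity, p_data_quality, cpt_suitable)
            for c in p_complexity for d in p_data_quality)
```

    Quantifying the causal factors this way is what lets a systems engineer read off both the risk of a decision and the "what-if" effect of changing one parent node.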

  3. Quality Control of Meteorological Observations

    NASA Technical Reports Server (NTRS)

    Collins, William; Dee, Dick; Rukhovets, Leonid

    1999-01-01

    The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some of Gandin's other ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is the buddy check, a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for the QC decision are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions, and the system is better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
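    The adaptive buddy check described above can be sketched as follows. The neighbor comparison and the on-line variance estimate are simplified assumptions for illustration, not the operational Goddard code.

```python
from statistics import mean, pvariance

def buddy_check(suspect, neighbors, recent_residuals, k=3.0):
    """Sketch of an adaptive buddy check: a suspect observation is
    compared with the mean of nearby non-suspect observations, and the
    tolerance scales with an error variance re-estimated on-line from
    recent residuals, so locally disturbed conditions (deep cyclones,
    jet streams) widen the acceptance window."""
    background = mean(neighbors)
    est_var = pvariance(recent_residuals)   # on-line variance estimate
    tolerance = k * est_var ** 0.5
    return abs(suspect - background) <= tolerance  # True -> accept

# Calm conditions: small recent residuals give a tight tolerance,
# so a 5-degree departure from the buddies is rejected.
calm = buddy_check(15.0, [10.1, 9.8, 10.3],
                   recent_residuals=[0.2, -0.1, 0.3, -0.2])
# Disturbed conditions: large recent residuals widen the window,
# and the same extreme value is accepted.
stormy = buddy_check(15.0, [10.1, 9.8, 10.3],
                     recent_residuals=[4.0, -5.0, 6.0, -4.5])
```

    The same suspect value is rejected under calm statistics and accepted under disturbed ones, which is exactly the condition-dependent tolerance the abstract describes.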

  4. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE PAGES

    Rizzo, Davinia B.; Blackburn, Mark R.

    2018-03-30

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence that the process may be an effective tool in harnessing expert knowledge for a BN model.

  5. Forest operations, extreme flooding events, and considerations for hydrologic modeling in the Appalachians--A review

    Treesearch

    M.A. Eisenbies; W.M. Aust; J.A. Burger; M.B. Adams

    2007-01-01

    The connection between forests and water resources is well established, but the relationships among controlling factors are only partly understood. Concern over the effects of forestry operations, particularly harvesting, on extreme flooding events is a recurrent issue in forest and watershed management. Due to the complexity of the system, and the cost of installing...

  6. Microbial communities and their predicted metabolic functions in a desiccating acid salt lake.

    PubMed

    Zaikova, Elena; Benison, Kathleen C; Mormile, Melanie R; Johnson, Sarah Stewart

    2018-05-01

    The waters of Lake Magic in Western Australia are among the most geochemically extreme on Earth. This ephemeral saline lake is characterized by pH as low as 1.6, salinity as high as 32% total dissolved solids, and unusually complex geochemistry, including extremely high concentrations of aluminum, silica, and iron. We examined the microbial composition and putative function in this extreme acid brine environment by analyzing lake water, groundwater, and sediment samples collected during the austral summer near peak evapoconcentration. Our results reveal that the lake water metagenome, surprisingly, consisted mostly of eukaryote sequences, particularly fungi and, to a lesser extent, green algae. Groundwater and sediment samples were dominated by acidophilic Firmicutes, with eukaryotic community members detected only at low abundances. The lake water bacterial community was less diverse than that in groundwater and sediment, and was overwhelmingly represented by a single OTU affiliated with Salinisphaera. Pathways associated with halotolerance were found in the metagenomes, as were genes associated with biosynthesis of protective carotenoids. During periods of complete desiccation of the lake, we hypothesize that dormancy and entrapment in fluid inclusions in halite crystals may increase long-term survival, leading to the resilience of complex eukaryotes in this extreme environment.

  7. Formative feedback and scaffolding for developing complex problem solving and modelling outcomes

    NASA Astrophysics Data System (ADS)

    Frank, Brian; Simper, Natalie; Kaupp, James

    2018-07-01

    This paper discusses the use and impact of formative feedback and scaffolding to develop outcomes for complex problem solving in a required first-year course in engineering design and practice at a medium-sized research-intensive Canadian university. In 2010, the course began to use team-based, complex, open-ended contextualised problems to develop problem solving, communications, teamwork, modelling, and professional skills. Since then, formative feedback has been incorporated through task- and process-level feedback on scaffolded in-class tasks, formative assignments, and post-assignment review. Development in complex problem solving and modelling has been assessed through analysis of responses from student surveys, direct criterion-referenced assessment of course outcomes from 2013 to 2015, and an external longitudinal study. The findings suggest that students are improving in outcomes related to complex problem solving over the duration of the course. Most notably, the addition of new feedback and scaffolding coincided with improved student performance.

  8. Conscious thought beats deliberation without attention in diagnostic decision-making: at least when you are an expert

    PubMed Central

    Schmidt, Henk G.; Rikers, Remy M. J. P.; Custers, Eugene J. F. M.; Splinter, Ted A. W.; van Saase, Jan L. C. M.

    2010-01-01

    Contrary to what common sense makes us believe, deliberation without attention has recently been suggested to produce better decisions in complex situations than deliberation with attention. Based on differences between the cognitive processes of experts and novices, we hypothesized that experts in fact make better decisions after consciously thinking about complex problems, whereas novices may benefit from deliberation-without-attention. These hypotheses were confirmed in a study among doctors and medical students. They diagnosed complex and routine problems under three conditions: an immediate-decision condition and two delayed conditions, conscious thought and deliberation-without-attention. Doctors did better with conscious deliberation when problems were complex, whereas reasoning mode did not matter in simple problems. In contrast, deliberation-without-attention improved novices’ decisions, but only in simple problems. Experts benefit from consciously thinking about complex problems; for novices, thinking does not help in those cases. PMID:20354726

  9. Conscious thought beats deliberation without attention in diagnostic decision-making: at least when you are an expert.

    PubMed

    Mamede, Sílvia; Schmidt, Henk G; Rikers, Remy M J P; Custers, Eugene J F M; Splinter, Ted A W; van Saase, Jan L C M

    2010-11-01

    Contrary to what common sense makes us believe, deliberation without attention has recently been suggested to produce better decisions in complex situations than deliberation with attention. Based on differences between the cognitive processes of experts and novices, we hypothesized that experts in fact make better decisions after consciously thinking about complex problems, whereas novices may benefit from deliberation-without-attention. These hypotheses were confirmed in a study among doctors and medical students. They diagnosed complex and routine problems under three conditions: an immediate-decision condition and two delayed conditions, conscious thought and deliberation-without-attention. Doctors did better with conscious deliberation when problems were complex, whereas reasoning mode did not matter in simple problems. In contrast, deliberation-without-attention improved novices' decisions, but only in simple problems. Experts benefit from consciously thinking about complex problems; for novices, thinking does not help in those cases.

  10. Preparing new nurses with complexity science and problem-based learning.

    PubMed

    Hodges, Helen F

    2011-01-01

    Successful nurses function effectively with adaptability, improvability, and interconnectedness, and can see emerging and unpredictable complex problems. Preparing new nurses for complexity requires a significant change in the prevalent but dated nursing education models for rising graduates. The science of complexity, coupled with problem-based learning and peer review, contributes a feasible framework for a constructivist learning environment in which to examine real-time systems data; explore uncertainty, inherent patterns, and ambiguity; and develop skills for unstructured problem solving. This article describes a pilot study of a problem-based learning strategy guided by principles of complexity science in a community clinical nursing course. Thirty-five senior nursing students participated during a 3-year period. Assessments included peer review, a final project paper, reflection, and a satisfaction survey. Results included higher-than-expected levels of student satisfaction, increased breadth of analysis of complex data, acknowledgment of communities as complex adaptive systems, and overall higher-level thinking skills than in previous years. 2011, SLACK Incorporated.

  11. Aerodynamic Shape Optimization Using A Real-Number-Encoded Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2001-01-01

    A new method for aerodynamic shape optimization using a genetic algorithm with real-number encoding is presented. The algorithm is used to optimize three different problems: a simple hill-climbing problem, a quasi-one-dimensional nozzle problem using an Euler equation solver, and a three-dimensional transonic wing problem using a nonlinear potential solver. Results indicate that the genetic algorithm is easy to implement and extremely reliable, being relatively insensitive to design space noise.
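    A minimal sketch of a real-number-encoded genetic algorithm, applied to a simple hill-climbing problem of the kind the abstract lists as its first test case. The operators chosen here (arithmetic crossover, Gaussian mutation, truncation selection) are illustrative assumptions, not the authors' actual implementation.

```python
import random

def real_ga(fitness, bounds, pop_size=30, generations=200,
            crossover_rate=0.8, mutation_sigma=0.1, seed=0):
    """Real-number-encoded genetic algorithm (maximization sketch).
    Each chromosome is a list of floats, one per design variable."""
    rng = random.Random(seed)

    def clip(x, lo, hi):
        return max(lo, min(hi, x))

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            if rng.random() < crossover_rate:
                w = rng.random()        # arithmetic (blend) crossover
                child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            else:
                child = list(a)
            child = [clip(g + rng.gauss(0, mutation_sigma), lo, hi)
                     for g, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = elite + children          # truncation selection with elitism
    return max(pop, key=fitness)

# Hill-climbing test problem: a single smooth peak at (1, -2).
best = real_ga(lambda x: -((x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2),
               bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

    Because the chromosomes are raw floats rather than bit strings, no encode/decode step is needed between the optimizer and a flow solver's design variables, which is a large part of why real-number encoding is easy to implement.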

  12. Stride search: A general algorithm for storm detection in high resolution climate data

    DOE PAGES

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; ...

    2015-09-08

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.
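    The latitude-dependent spacing idea behind Stride Search can be sketched as follows. The sector-placement details here are assumptions for illustration, not the published algorithm: search centers are spaced a roughly fixed great-circle distance apart, so the longitudinal stride grows toward the poles instead of oversampling them the way a uniform grid-point search does.

```python
import math

def stride_search_centers(nlat=91, nlon=180, stride_deg=10.0):
    """Place search-sector centers so that neighboring centers are
    about stride_deg of great-circle arc apart at every latitude."""
    centers = []
    grid_spacing = 180.0 / (nlat - 1)
    lat_step = max(1, round(stride_deg / grid_spacing))
    for i in range(0, nlat, lat_step):
        lat = -90.0 + i * grid_spacing
        coslat = math.cos(math.radians(lat))
        # Near a pole, one sector covers the whole latitude circle.
        if coslat * 360.0 <= stride_deg:
            centers.append((lat, 0.0))
            continue
        lon_stride = stride_deg / coslat   # widen stride toward poles
        n_centers = max(1, int(360.0 / lon_stride))
        for j in range(n_centers):
            centers.append((lat, j * 360.0 / n_centers))
    return centers

centers = stride_search_centers()
grid_points = 91 * 180  # what a uniform grid-point search would visit
```

    The center count falls far below the full grid-point count while the poles still receive (single) well-defined sectors, which is the behavior the article demonstrates for polar low detection.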

  13. Stride search: A general algorithm for storm detection in high resolution climate data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.

  14. Explicitly solvable complex Chebyshev approximation problems related to sine polynomials

    NASA Technical Reports Server (NTRS)

    Freund, Roland

    1989-01-01

    Explicitly solvable real Chebyshev approximation problems on the unit interval are typically characterized by simple error curves. A similar principle is presented for complex approximation problems with error curves induced by sine polynomials. As an application, some new explicit formulae for complex best approximations are derived.

  15. Addressing Complex Challenges through Adaptive Leadership: A Promising Approach to Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Nelson, Tenneisha; Squires, Vicki

    2017-01-01

    Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…

  16. Advanced imaging in acute and chronic deep vein thrombosis

    PubMed Central

    Karande, Gita Yashwantrao; Sanchez, Yadiel; Baliyan, Vinit; Mishra, Vishala; Ganguli, Suvranu; Prabhakar, Anand M.

    2016-01-01

    Deep venous thrombosis (DVT) affecting the extremities is a common clinical problem. Prompt imaging aids in rapid diagnosis and adequate treatment. While ultrasound (US) remains the workhorse of detection of extremity venous thrombosis, CT and MRI are commonly used as the problem-solving tools either to visualize the thrombosis in central veins like superior or inferior vena cava (IVC) or to test for the presence of complications like pulmonary embolism (PE). The cross-sectional modalities also offer improved visualization of venous collaterals. The purpose of this article is to review the established modalities used for characterization and diagnosis of DVT, and further explore promising innovations and recent advances in this field. PMID:28123971

  17. A Zone for Deliberation? Methodological Challenges in Fields of Political Unrest

    ERIC Educational Resources Information Center

    Westrheim, Kariane; Lillejord, Solvi

    2007-01-01

    This article outlines certain problems and challenges facing the qualitative researcher who enters fields that are either extremely difficult to access or potentially hostile towards outsiders. Problems and dilemmas in such contexts are highlighted by reference to fieldwork research among PKK (Kurdistan Worker's Party) guerrillas in North…

  18. Fuelwood Problems and Solutions

    Treesearch

    D. Evan Mercer; John Soussan

    1992-01-01

    Concern over the "fuelwood crisis" facing the world's poor has been widespread since the late 1970s (Eckholm et al. 1984; Soussan 1988; Agarwal 1986). At first the problem was frequently overstated. In the extreme, analysts (foresters, economists, and others) in many countries made erroneous projections of the rapid total destruction of the biomass...

  19. The Place and Purpose of Combinatorics

    ERIC Educational Resources Information Center

    Hurdle, Zach; Warshauer, Max; White, Alex

    2016-01-01

    The desire to persuade students to avoid strictly memorizing formulas is a recurring theme throughout discussions of curriculum and problem solving. In combinatorics, a branch of discrete mathematics, problems can be easy to write--identify a few categories, add a few restrictions, specify an outcome--yet extremely challenging to solve. A lesson…

  20. Solution of the multiextreme optimization problem for low-thrust spacecraft flight to the asteroid Apophis

    NASA Astrophysics Data System (ADS)

    Ivashkin, V. V.; Krylov, I. V.

    2015-09-01

    A method is developed to optimize flight trajectories to the asteroid Apophis that reliably forms a set of Pontryagin extremals for various boundary conditions of the flight and effectively searches for the global optimum of the problem among its elements.

  1. Integrated case management for work-related upper-extremity disorders: impact of patient satisfaction on health and work status.

    PubMed

    Feuerstein, Michael; Huang, Grant D; Ortiz, Jose M; Shaw, William S; Miller, Virginia I; Wood, Patricia M

    2003-08-01

    An integrated case management (ICM) approach (ergonomic and problem-solving intervention) to work-related upper-extremity disorders was examined in relation to patient satisfaction, future symptom severity, function, and return to work (RTW). Federal workers with work-related upper-extremity disorder workers' compensation claims (n = 205) were randomly assigned to usual care or the ICM intervention. Patient satisfaction was assessed after the 4-month intervention period. Questionnaires on clinical outcomes and ergonomic exposure were administered at baseline and at 6 and 12 months postintervention. Time from intervention to RTW was obtained from an administrative database. ICM group assignment was significantly associated with greater patient satisfaction. Regression analyses found that higher patient satisfaction levels predicted decreased symptom severity and functional limitations at 6 months and a shorter RTW. At 12 months, predictors of positive outcomes included male gender, lower distress, lower levels of reported ergonomic exposure, and receipt of ICM. Findings highlight the utility of targeting workplace ergonomics and problem-solving skills.

  2. Comparative outcome of bomb explosion injuries versus high-powered gunshot injuries of the upper extremity in a civilian setting.

    PubMed

    Luria, Shai; Rivkin, Gurion; Avitzour, Malka; Liebergall, Meir; Mintz, Yoav; Mosheiff, Ram

    2013-03-01

    Explosion injuries to the upper extremity have specific clinical characteristics that differ from injuries due to other mechanisms. To evaluate the upper extremity injury pattern of attacks on civilian targets, we compared bomb explosion injuries to gunshot injuries and their functional recovery using standard outcome measures. Of 157 patients admitted to the hospital between 2000 and 2004, 72 (46%) sustained explosion injuries and 85 (54%) gunshot injuries. The trauma registry files were reviewed, and the patients completed the DASH Questionnaire (Disabilities of Arm, Shoulder and Hand) and SF-12 (Short Form-12) after a minimum period of 1 year. The blast casualties had higher Injury Severity Scores (47% vs. 22% with a score > 16, P = 0.02) and a higher percentage of patients treated in intensive care units (47% vs. 28%, P = 0.02). Although the Abbreviated Injury Scale score of the upper extremity injury was similar in the two groups, the blast casualties had more bilateral and complex soft tissue injuries and were treated surgically more often. No difference was found in the SF-12 or DASH scores between the groups at follow-up. The casualties with upper extremity blast injuries were more severely injured and sustained more bilateral and complex soft tissue injuries to the upper extremity. However, the rating of the local injury to the isolated limb was similar, as was the subjective functional recovery.

  3. Monsoon Forecasting based on Imbalanced Classification Techniques

    NASA Astrophysics Data System (ADS)

    Ribera, Pedro; Troncoso, Alicia; Asencio-Cortes, Gualberto; Vega, Inmaculada; Gallego, David

    2017-04-01

    Monsoonal systems are quasiperiodic processes of the climatic system that control seasonal precipitation over different regions of the world. The Western North Pacific Summer Monsoon (WNPSM) is one of those monsoons and it is known to have a great impact both over the global climate and over the total precipitation of very densely populated areas. The interannual variability of the WNPSM along the last 50-60 years has been related to different climatic indices such as El Niño, El Niño Modoki, the Indian Ocean Dipole or the Pacific Decadal Oscillation. Recently, a new and longer series characterizing the monthly evolution of the WNPSM, the WNP Directional Index (WNPDI), has been developed, extending its previous length from about 50 years to more than 100 years (1900-2007). Imbalanced classification techniques have been applied to the WNPDI in order to check the capability of traditional climate indices to capture and forecast the evolution of the WNPSM. The problem of forecasting has been transformed into a binary classification problem, in which the positive class represents the occurrence of an extreme monsoon event. Given that the number of extreme monsoons is much lower than the number of non-extreme monsoons, the resultant classification problem is highly imbalanced. The complete dataset is composed of 1296 instances, where only 71 (5.47%) samples correspond to extreme monsoons. Twenty predictor variables based on the cited climatic indices have been proposed, and namely, models based on trees, black box models such as neural networks, support vector machines and nearest neighbors, and finally ensemble-based techniques as random forests have been used in order to forecast the occurrence of extreme monsoons. It can be concluded that the methodology proposed here reports promising results according to the quality parameters evaluated and predicts extreme monsoons for a temporal horizon of a month with a high accuracy. 
From a climatological point of view, models based on trees show that the El Niño Modoki index in the months preceding an extreme monsoon acts as its best predictor. In most cases, the value of the Indian Ocean Dipole index acts as a second-order classifier. The El Niño index (more frequently) and the Pacific Decadal Oscillation index (in only one case) also modulate the intensity of the WNPSM in some cases.
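The core difficulty this record describes, that a classifier can look excellent on an imbalanced problem while missing every rare event, can be illustrated with a toy calculation. The class counts below mirror the 1296/71 split quoted in the abstract; the trivial majority-class "classifier" is an illustrative assumption, not the authors' WNPDI pipeline.

```python
# Why plain accuracy misleads on an imbalanced binary problem such as
# extreme-monsoon detection (~5% positive class). Counts mirror the
# 1296-instance / 71-extreme split quoted in the abstract.

n_total, n_pos = 1296, 71          # extreme monsoons are the positive class
n_neg = n_total - n_pos

# A trivial classifier that always predicts "non-extreme":
tp, fn = 0, n_pos                  # every extreme event is missed
tn, fp = n_neg, 0

accuracy = (tp + tn) / n_total
recall = tp / (tp + fn)            # sensitivity on the rare class

print(f"accuracy = {accuracy:.3f}")   # high despite missing all positives
print(f"recall   = {recall:.3f}")     # the quantity that actually matters
```

The always-negative baseline scores about 94.5% accuracy with zero recall, which is why imbalanced-classification studies such as this one evaluate quality parameters beyond raw accuracy.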

  4. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  5. Picture archiving and communication in radiology.

    PubMed

    Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale

    2003-01-01

    After over 80 years of exclusive archiving of radiologic films, digital archiving is now increasingly gaining ground in radiology. Digital archiving allows a considerable reduction in costs and space but, most importantly, makes immediate or remote consultation of all examinations and reports feasible in the hospital's clinical wards. The RIS system, in this case, is the starting point of the electronic archiving process, which is however the task of the PACS. The latter can be used as a legally compliant radiologic archive provided that it conforms with certain specifications, such as the use of long-term optical storage media or media with an electronic record of changes. The PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved and used for consultation or remote consultation by the reporting radiologist, who requires images and reports of previous radiologic examinations, or by the referring physician of the ward. Owing to their web servers, modern PACS allow greatly simplified remote access to images and data while still ensuring the required regulations and access protections. Since the PACS enables simpler data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should be able to minimize the risk of accidental data destruction, and should prevent unauthorized access to the archive with security measures adequate to current knowledge and technological advances. Archiving the data produced by modern digital imaging is a problem now present even in small radiology services. The technology can readily solve problems that were extremely complex until a few years ago, such as the connection between equipment and the archiving system, owing also to the universal adoption of the DICOM 3.0 standard. 
The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize the problems of remote transmission of data and images within the healthcare enterprise as well as across the territory. However, new problems are appearing, such as that of digital data security profiles and of the different systems that should ensure them. Among these, algorithms for electronic signatures should be mentioned; in Italy they are validated by law and can therefore be used in legally compliant digital archives.

  6. Binary optimization for source localization in the inverse problem of ECG.

    PubMed

    Potyagaylo, Danila; Cortés, Elisenda Gil; Schulze, Walther H W; Dössel, Olaf

    2014-09-01

    The goal of ECG imaging (ECGI) is to reconstruct the heart's electrical activity from body surface potential maps. The problem is ill-posed, which means that it is extremely sensitive to measurement and modeling errors. The most commonly used method to tackle this obstacle is Tikhonov regularization, which converts the original problem into a well-posed one by adding a penalty term. Despite all its practical advantages, however, the method has a serious drawback: the obtained solution is often over-smoothed, which can hinder precise clinical diagnosis and treatment planning. In this paper, we apply a binary optimization approach to the transmembrane voltage (TMV)-based problem. For this, we assume the TMV to take one of two possible values according to the heart abnormality under consideration. In this work, we investigate the localization of simulated ischemic areas and ectopic foci and one clinical infarction case. The application affects only the choice of the binary values, while the core of the algorithms remains the same, making the approach easily adjustable to the application's needs. Two methods, a hybrid metaheuristic approach and the difference-of-convex-functions (DC) algorithm, were tested. For this purpose, we performed realistic heart simulations for a complex thorax model and applied the proposed techniques to the obtained ECG signals. Both methods enabled localization of the areas of interest, showing their potential for application in ECGI. For the metaheuristic algorithm, it was necessary to subdivide the heart into regions in order to obtain a stable solution insusceptible to errors, while the analytical DC scheme can be efficiently applied to higher-dimensional problems. With the DC method, we also successfully reconstructed the activation pattern and origin of a simulated extrasystole. In addition, the DC algorithm enables iterative adjustment of the binary values, ensuring robust performance.
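The Tikhonov baseline that this record contrasts against can be sketched in a few lines: minimize ||Ax - b||² + λ||x||², solved in closed form via the regularized normal equations. The random forward matrix below is a stand-in for the torso lead-field matrix and the "binary" source patch is synthetic; both are assumptions for illustration, not the authors' ECGI setup.

```python
import numpy as np

# Minimal Tikhonov sketch for an ill-posed linear inverse problem A x = b.
# In ECGI, A would be the body-surface lead-field matrix; here it is a
# random stand-in for illustration only.
rng = np.random.default_rng(0)
m, n = 40, 100                      # fewer measurements than unknowns
A = rng.normal(size=(m, n))
x_true = np.zeros(n)
x_true[10:20] = 1.0                 # a "binary" source patch, as in the TMV model
b = A @ x_true + 0.01 * rng.normal(size=m)

lam = 1e-2                          # regularization weight lambda
# Closed form: x_hat = (A^T A + lam I)^{-1} A^T b
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

residual = np.linalg.norm(A @ x_hat - b)
print(f"data residual = {residual:.4f}")
```

The recovered `x_hat` fits the data but smears the sharp patch across many nodes, which is exactly the over-smoothing drawback the abstract cites as motivation for the binary formulation.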

  7. Complexity-aware simple modeling.

    PubMed

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models as well as their assumption of modularity and insulation make them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach-complexity-aware simple modeling-that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Progress on high-performance rapid prototype aluminum mirrors

    NASA Astrophysics Data System (ADS)

    Woodard, Kenneth S.; Myrick, Bruce H.

    2017-05-01

    Near-net-shape parts can be produced using some very old processes (investment casting) as well as the relatively new direct metal laser sintering (DMLS) process. These processes have significant advantages for complex blank lightweighting and cost, but are not inherently suited to producing high-performance mirrors. The DMLS process can provide extremely complex lightweight structures, but the high residual stresses left in the material result in unstable mirror figure retention. Although not capable of the extreme intricacy of DMLS, investment casting can also provide complex lightweight structures at considerably lower cost than DMLS, and even than conventional wrought mirror blanks, but the less-than-100% density of castings (and also of DMLS parts) limits finishing quality. This paper will cover the progress that has been made in making both the DMLS and investment casting processes viable near-net-shape blank options for high-performance aluminum mirrors. Finish and figure results will be presented to show performance commensurate with existing conventional processes.

  9. Guidelines to Support Professional Copyright Practice

    ERIC Educational Resources Information Center

    Dryden, Jean

    2012-01-01

    Copyright is extremely complex, and it is difficult to convey its complexities in a clear and concise form. Through decades of experience, archivists developed informal best practices for dealing with copyright in the analog world; however the application of copyright in the digital environment is evolving in response to rapidly changing…

  10. Current problems in communication from the weather forecast in the prevention of hydraulic and hydrogeological risk

    NASA Astrophysics Data System (ADS)

    Fazzini, Massimiliano; Vaccaro, Carmela

    2014-05-01

    The Italian territory is one of the most hydraulically and hydrogeologically fragile in the world, owing to its physiographic, lithological and, above all, meteo-climatic complexity. Moreover, in recent years, ill-considered urbanization and the abandonment of mountain areas and the countryside have fostered increasingly devastating hydrogeological instability in connection with extreme meteorological events. After the dramatic floods and landslides of the last 24 months, in which more than 50 people died, a public debate is now open on issues related to the prevention, forecasting and management of hydro-meteorological risk. The aim of correct weather forecasting at different spatial and temporal scales is to avoid or minimize the damage and human losses resulting from increasingly frequent extreme weather events. In Italy, two major problems prevent the effective dissemination of correct weather forecasts. The first is the absence of a national meteorological service that can ensure the quality of information. In this regard, the establishment of a unified national weather service is at an advanced stage; it would be formed by technicians of the national and regional civil protection services and the Meteorological Service of the Air Force, and would ensure the quality of predictions, especially through exclusive processing of national and local weather forecasts and hydrogeological weather alerts. At present, however, this gap favors the growing diffusion of more or less professional, and often wholly unethical, meteorological websites which, at different spatial scales, tend to amplify the signals from weather prediction models, presenting them to web users as exceptional or rare phenomena and often causing unjustified alarm. This behavior is almost always driven by the desire to publish a forecast before other sites and thereby attract new commercial sponsors and easy profits. 
The second problem is the almost complete absence of education about environmental risks, even at primary-school level, which leaves users unable to select ethically and technically correct information and further favors the proliferation of commercial or private weather websites. It therefore seems essential that universities and the public institutions responsible for forecasting and hydrological prevention implement specific information activities.

  11. The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems

    ERIC Educational Resources Information Center

    Andrews, Paul W.; Thomson, J. Anderson, Jr.

    2009-01-01

    Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis, and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…

  12. Accounting for Parameter Uncertainty in Complex Atmospheric Models, With an Application to Greenhouse Gas Emissions Evaluation

    NASA Astrophysics Data System (ADS)

    Swallow, B.; Rigby, M. L.; Rougier, J.; Manning, A.; Thomson, D.; Webster, H. N.; Lunt, M. F.; O'Doherty, S.

    2016-12-01

    In order to understand the underlying processes governing environmental and physical phenomena, a complex mathematical model is usually required. However, there is inherent uncertainty in the parameterisation of unresolved processes in these simulators. Here, we focus on the specific problem of accounting for uncertainty in parameter values in an atmospheric chemical transport model. Systematic errors introduced by failing to account for these uncertainties can have a large effect on the resulting estimates of unknown quantities of interest. One approach increasingly used to address this issue is emulation, in which a large number of forward runs of the simulator are carried out in order to approximate the response of the output to changes in parameters. However, due to the complexity of some models, it is often infeasible to carry out the large number of training runs usually required for full statistical emulators of the environmental processes. We therefore present a simplified model reduction method for approximating uncertainties in complex environmental simulators without the need for very large numbers of training runs. We illustrate the method through an application to the Met Office's atmospheric transport model NAME. We show how our parameter estimation framework can be incorporated into a hierarchical Bayesian inversion, and demonstrate the impact on estimates of UK methane emissions using atmospheric mole fraction data. We conclude that accounting for uncertainties in the parameterisation of complex atmospheric models is vital if systematic errors are to be minimized and all relevant uncertainties accounted for. We also note that investigations of this nature can prove extremely useful in highlighting deficiencies in the simulator that might otherwise be missed.
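The emulation idea this record builds on, fitting a cheap surrogate to a small number of expensive forward runs and then querying the surrogate instead of the simulator, can be sketched with a toy example. The one-parameter "simulator" and the polynomial surrogate below are illustrative assumptions only; NAME is a full atmospheric transport model and practical emulators are usually Gaussian processes rather than polynomials.

```python
import numpy as np

# Toy emulation sketch: approximate an "expensive" simulator with a cheap
# surrogate fitted to a handful of training runs.
def simulator(theta):
    # Stand-in for a costly forward model run at parameter value theta
    return np.sin(theta) + 0.5 * theta**2

train_theta = np.linspace(-2.0, 2.0, 8)     # only 8 forward runs
train_out = simulator(train_theta)

# Cheap surrogate: cubic polynomial least-squares fit to the training runs
coeffs = np.polyfit(train_theta, train_out, deg=3)
emulate = np.poly1d(coeffs)

# Querying the surrogate is essentially free; check its accuracy
test_theta = np.linspace(-2.0, 2.0, 101)
err = np.max(np.abs(emulate(test_theta) - simulator(test_theta)))
print(f"max emulator error on [-2, 2]: {err:.3f}")
```

Once fitted, the surrogate can be evaluated thousands of times inside a Bayesian inversion at negligible cost, which is the practical payoff the abstract describes; the paper's contribution is making this workable when even the 8-run training budget above is hard to afford.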

  13. Extreme ultraviolet and X-ray spectroheliograph for OSO-H

    NASA Technical Reports Server (NTRS)

    Sterk, A. A.; Kieser, F.; Peck, S.; Knox, E.

    1972-01-01

    A complex scientific instrument was designed, fabricated, tested, and calibrated for launch onboard OSO-H. This instrument consisted of four spectroheliographs and an X-ray polarimeter. The instrument is designed to study solar radiation at selected wavelengths in the X-ray and the extreme ultraviolet ranges, make observations at the H-alpha wavelength, and measure the degree of polarization of X-ray emissions.

  14. Upper Extremity Injuries in Tennis Players: Diagnosis, Treatment, and Management

    PubMed Central

    Chung, Kevin C.; Lark, Meghan E.

    2016-01-01

    Synopsis Upper extremity tennis injuries are most commonly characterized as overuse injuries of the wrist, elbow, and shoulder. The complex anatomy of these structures and their interaction with the biomechanical properties of tennis strokes contribute to the diagnostic challenges. A thorough understanding of tennis kinetics, in combination with the current literature on diagnostic and treatment methods, will improve clinical decision-making. PMID:27886833

  15. Acidic Ribosomal Proteins from the Extreme 'Halobacterium cutirubrum',

    DTIC Science & Technology

    the extreme halophilic bacterium, Halobacterium cutirubrum. The identification of the protein moieties involved in these and other interactions in...the halophile ribosome requires a rapid and reproducible screening method for the separation, enumeration and identification of these acidic...polypeptides in the complex ribosomal protein mixtures. In this paper the authors present the results of analyses of the halophile ribosomal proteins using a

  16. Life in extreme environments: how will humans perform on Mars?

    NASA Technical Reports Server (NTRS)

    Newman, D. J.

    2000-01-01

    This review of astronaut extravehicular activity (EVA) and the details of American and Soviet/Russian spacesuit design focuses on design recommendations to enhance astronaut safety and effectiveness. Innovative spacesuit design is essential, given the challenges of future exploration-class missions in which astronauts will be called upon to perform increasingly complex and physically demanding tasks in the extreme environments of microgravity and partial gravity.

  17. Settlement-Size Scaling among Prehistoric Hunter-Gatherer Settlement Systems in the New World

    PubMed Central

    Haas, W. Randall; Klink, Cynthia J.; Maggard, Greg J.; Aldenderfer, Mark S.

    2015-01-01

    Settlement size predicts extreme variation in the rates and magnitudes of many social and ecological processes in human societies. Yet, the factors that drive human settlement-size variation remain poorly understood. Size variation among economically integrated settlements tends to be heavy tailed such that the smallest settlements are extremely common and the largest settlements extremely large and rare. The upper tail of this size distribution is often formalized mathematically as a power-law function. Explanations for this scaling structure in human settlement systems tend to emphasize complex socioeconomic processes including agriculture, manufacturing, and warfare—behaviors that tend to differentially nucleate and disperse populations hierarchically among settlements. But, the degree to which heavy-tailed settlement-size variation requires such complex behaviors remains unclear. By examining the settlement patterns of eight prehistoric New World hunter-gatherer settlement systems spanning three distinct environmental contexts, this analysis explores the degree to which heavy-tailed settlement-size scaling depends on the aforementioned socioeconomic complexities. Surprisingly, the analysis finds that power-law models offer plausible and parsimonious statistical descriptions of prehistoric hunter-gatherer settlement-size variation. This finding reveals that incipient forms of hierarchical settlement structure may have preceded socioeconomic complexity in human societies and points to a need for additional research to explicate how mobile foragers came to exhibit settlement patterns that are more commonly associated with hierarchical organization. We propose that hunter-gatherer mobility with preferential attachment to previously occupied locations may account for the observed structure in site-size variation. PMID:26536241
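The power-law tail model that this record fits to settlement sizes has a simple maximum-likelihood estimator for its exponent: alpha_hat = 1 + n / sum(ln(x_i / x_min)). The sketch below applies it to a synthetic Pareto sample; the sample, the true exponent, and x_min are all illustrative assumptions, not the archaeological site-size data.

```python
import math
import random

# MLE sketch for a continuous power-law (Pareto) tail, the model commonly
# used for heavy-tailed size distributions such as settlement sizes.
random.seed(1)
alpha_true, x_min, n = 2.5, 1.0, 5000

# Inverse-CDF sampling: P(X > x) = (x / x_min)^-(alpha - 1) for x >= x_min
sizes = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
         for _ in range(n)]

# Maximum-likelihood estimate of the exponent:
#   alpha_hat = 1 + n / sum(ln(x_i / x_min))
alpha_hat = 1.0 + n / sum(math.log(x / x_min) for x in sizes)
print(f"true alpha = {alpha_true}, estimated alpha = {alpha_hat:.2f}")
```

With a few thousand observations the estimate lands close to the true exponent; in practice one also has to choose x_min and test the power-law fit against alternatives, which is where analyses like the one above concentrate their effort.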

  18. The Influence of Coral Reef Benthic Condition on Associated Fish Assemblages

    PubMed Central

    Chong-Seng, Karen M.; Mannering, Thomas D.; Pratchett, Morgan S.; Bellwood, David R.; Graham, Nicholas A. J.

    2012-01-01

    Accumulative disturbances can erode a coral reef’s resilience, often leading to replacement of scleractinian corals by macroalgae or other non-coral organisms. These degraded reef systems have been mostly described based on changes in the composition of the reef benthos, and there is little understanding of how such changes are influenced by, and in turn influence, other components of the reef ecosystem. This study investigated the spatial variation in benthic communities on fringing reefs around the inner Seychelles islands. Specifically, relationships between benthic composition and the underlying substrata, as well as the associated fish assemblages were assessed. High variability in benthic composition was found among reefs, with a gradient from high coral cover (up to 58%) and high structural complexity to high macroalgae cover (up to 95%) and low structural complexity at the extremes. This gradient was associated with declining species richness of fishes, reduced diversity of fish functional groups, and lower abundance of corallivorous fishes. There were no reciprocal increases in herbivorous fish abundances, and relationships with other fish functional groups and total fish abundance were weak. Reefs grouping at the extremes of complex coral habitats or low-complexity macroalgal habitats displayed markedly different fish communities, with only two species of benthic invertebrate feeding fishes in greater abundance in the macroalgal habitat. These results have negative implications for the continuation of many coral reef ecosystem processes and services if more reefs shift to extreme degraded conditions dominated by macroalgae. PMID:22870294

  19. The influence of coral reef benthic condition on associated fish assemblages.

    PubMed

    Chong-Seng, Karen M; Mannering, Thomas D; Pratchett, Morgan S; Bellwood, David R; Graham, Nicholas A J

    2012-01-01

    Accumulative disturbances can erode a coral reef's resilience, often leading to replacement of scleractinian corals by macroalgae or other non-coral organisms. These degraded reef systems have been mostly described based on changes in the composition of the reef benthos, and there is little understanding of how such changes are influenced by, and in turn influence, other components of the reef ecosystem. This study investigated the spatial variation in benthic communities on fringing reefs around the inner Seychelles islands. Specifically, relationships between benthic composition and the underlying substrata, as well as the associated fish assemblages were assessed. High variability in benthic composition was found among reefs, with a gradient from high coral cover (up to 58%) and high structural complexity to high macroalgae cover (up to 95%) and low structural complexity at the extremes. This gradient was associated with declining species richness of fishes, reduced diversity of fish functional groups, and lower abundance of corallivorous fishes. There were no reciprocal increases in herbivorous fish abundances, and relationships with other fish functional groups and total fish abundance were weak. Reefs grouping at the extremes of complex coral habitats or low-complexity macroalgal habitats displayed markedly different fish communities, with only two species of benthic invertebrate feeding fishes in greater abundance in the macroalgal habitat. These results have negative implications for the continuation of many coral reef ecosystem processes and services if more reefs shift to extreme degraded conditions dominated by macroalgae.

  20. Interpersonal Problems and Developmental Trajectories of Binge Eating Disorder

    PubMed Central

    Blomquist, Kerstin K.; Ansell, Emily B.; White, Marney A.; Masheb, Robin M.; Grilo, Carlos M.

    2012-01-01

    Objective To explore associations between specific interpersonal constructs and the developmental progression of behaviors leading to binge eating disorder (BED). Method Eighty-four consecutively evaluated, treatment-seeking obese (BMI ≥ 30) men and women with BED were assessed with structured diagnostic and clinical interviews and completed a battery of established measures assessing current and developmental eating- and weight-related variables as well as interpersonal functioning. Results Using the interpersonal circumplex structural summary method, amplitude, elevation, the affiliation dimension, and the quadratic coefficient for the dominance dimension were associated with eating- and weight-related developmental variables. The amplitude coefficient and more extreme interpersonal problems on the dominance dimension (quadratic), i.e., problems with being extremely high (domineering) or extremely low (submissive) in dominance, were significantly associated with a younger age at onset of binge eating, BED, and overweight, and accounted for significant variance in age at onset of binge eating, BED, and overweight. Greater interpersonal problems involving an overly affiliative interpersonal style were significantly associated with, and accounted for significant variance in, a younger age at diet onset. Discussion Findings provide further support for the importance of interpersonal problems among adults with BED and converge with recent work highlighting the importance of specific types of interpersonal problems for understanding the heterogeneity and different developmental trajectories of individuals with BED. PMID:22727087
