Sample records for problems involving complex

  1. Multigrid Methods for Aerodynamic Problems in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Caughey, David A.

    1995-01-01

    Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.
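
    The record above describes diagonalized implicit multigrid methods on multi-block finite-volume grids; reproducing that scheme is beyond a short example, but the core multigrid idea (smooth, restrict the residual, correct from a coarser grid, smooth again) can be sketched on a 1-D model problem. The following is a minimal two-grid sketch for a 1-D Poisson equation, with the grid size and smoother chosen for illustration; it is not the authors' method.

```python
# Minimal two-grid correction sketch for -u'' = f on (0, 1), illustrating the
# smooth / restrict / coarse-solve / prolong / correct cycle that multigrid
# methods build on.  This is a generic model problem, not the diagonalized
# implicit finite-volume scheme described in the abstract.
import numpy as np

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def jacobi(u, f, h, sweeps=3, omega=2.0 / 3.0):
    for _ in range(sweeps):
        u_new = u.copy()
        u_new[1:-1] = 0.5 * (u[:-2] + u[2:] + h**2 * f[1:-1])
        u = (1 - omega) * u + omega * u_new
    return u

def two_grid(u, f, h):
    u = jacobi(u, f, h)                       # pre-smooth
    r = residual(u, f, h)
    r_c = r[::2].copy()                       # restrict (injection)
    n_c = r_c.size
    # exact coarse solve of -e'' = r_c with a tridiagonal matrix (spacing 2h)
    A_c = (np.diag(2.0 * np.ones(n_c - 2)) -
           np.diag(np.ones(n_c - 3), 1) -
           np.diag(np.ones(n_c - 3), -1)) / (2 * h)**2
    e_c = np.zeros(n_c)
    e_c[1:-1] = np.linalg.solve(A_c, r_c[1:-1])
    e = np.interp(np.linspace(0, 1, u.size),  # prolong (linear interpolation)
                  np.linspace(0, 1, n_c), e_c)
    u = u + e                                 # coarse-grid correction
    return jacobi(u, f, h)                    # post-smooth

n = 65                                        # fine grid points (2^k + 1)
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)              # exact solution is sin(pi x)
u = np.zeros(n)
for cycle in range(20):
    u = two_grid(u, f, h)
print("max error:", np.max(np.abs(u - np.sin(np.pi * x))))
```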

  2. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    ERIC Educational Resources Information Center

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  3. The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems

    ERIC Educational Resources Information Center

    Andrews, Paul W.; Thomson, J. Anderson, Jr.

    2009-01-01

    Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis, and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…

  4. The Roles of Internal Representation and Processing in Problem Solving Involving Insight: A Computational Complexity Perspective

    ERIC Educational Resources Information Center

    Wareham, Todd

    2017-01-01

    In human problem solving, there is a wide variation between individuals in problem solution time and success rate, regardless of whether or not this problem solving involves insight. In this paper, we apply computational and parameterized analysis to a plausible formalization of extended representation change theory (eRCT), an integration of…

  5. Proportional Reasoning in the Learning of Chemistry: Levels of Complexity

    ERIC Educational Resources Information Center

    Ramful, Ajay; Narod, Fawzia Bibi

    2014-01-01

    This interdisciplinary study sketches the ways in which proportional reasoning is involved in the solution of chemistry problems, more specifically, problems involving quantities in chemical reactions (commonly referred to as stoichiometry problems). By building on the expertise of both mathematics and chemistry education research, the present…
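
    As an illustration of the proportional (mole-ratio) reasoning the abstract refers to, a small worked stoichiometry calculation is sketched below. The reaction and quantities are assumed example values, not taken from the study.

```python
# Illustrative mole-ratio calculation of the kind the abstract calls a
# stoichiometry problem.  The reaction and quantities are example values:
# 2 H2 + O2 -> 2 H2O.
M_H2, M_H2O = 2.016, 18.015           # molar masses in g/mol

def grams_water_from_hydrogen(grams_h2: float) -> float:
    moles_h2 = grams_h2 / M_H2        # convert mass to moles
    moles_h2o = moles_h2 * (2 / 2)    # mole ratio from the balanced equation
    return moles_h2o * M_H2O          # convert moles back to mass

print(grams_water_from_hydrogen(4.0))  # about 35.7 g of water from 4.0 g of H2
```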

  6. Diverse knowledges and competing interests: an essay on socio-technical problem-solving.

    PubMed

    di Norcia, Vincent

    2002-01-01

    Solving complex socio-technical problems, this paper claims, involves diverse knowledges (cognitive diversity), competing interests (social diversity), and pragmatism. To explain this view, this paper first explores two different cases: Canadian pulp and paper mill pollution and siting nuclear reactors in seismically sensitive areas of California. Solving such socio-technically complex problems involves cognitive diversity as well as social diversity and pragmatism. Cognitive diversity requires one to not only recognize relevant knowledges but also to assess their validity. Finally, it is suggested, integrating the resultant set of diverse relevant and valid knowledges determines the parameters of the solution space for the problem.

  7. Problem solving using soft systems methodology.

    PubMed

    Land, L

    This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.

  8. Student Learning of Complex Earth Systems: A Model to Guide Development of Student Expertise in Problem-Solving

    ERIC Educational Resources Information Center

    Holder, Lauren N.; Scherer, Hannah H.; Herbert, Bruce E.

    2017-01-01

    Engaging students in problem-solving concerning environmental issues in near-surface complex Earth systems involves developing student conceptualization of the Earth as a system and applying that scientific knowledge to the problems using practices that model those used by professionals. In this article, we review geoscience education research…

  9. Seeing around a Ball: Complex, Technology-Based Problems in Calculus with Applications in Science and Engineering-Redux

    ERIC Educational Resources Information Center

    Winkel, Brian

    2008-01-01

    A complex technology-based problem in visualization and computation for students in calculus is presented. Strategies are shown for its solution and the opportunities for students to put together sequences of concepts and skills to build for success are highlighted. The problem itself involves placing an object under water in order to actually see…

  10. From problem solving to problem definition: scrutinizing the complex nature of clinical practice.

    PubMed

    Cristancho, Sayra; Lingard, Lorelei; Regehr, Glenn

    2017-02-01

    In medical education, we have tended to present problems as being singular, stable, and solvable. Problem solving has, therefore, drawn much of medical education researchers' attention. This focus has been important but it is limited in terms of preparing clinicians to deal with the complexity of the 21st century healthcare system in which they will provide team-based care for patients with complex medical illness. In this paper, we use the Soft Systems Engineering principles to introduce the idea that in complex, team-based situations, problems usually involve divergent views and evolve with multiple solution iterations. As such we need to shift the conversation from (1) problem solving to problem definition, and (2) from a problem definition derived exclusively at the level of the individual to a definition derived at the level of the situation in which the problem is manifested. Embracing such a focus on problem definition will enable us to advocate for novel educational practices that will equip trainees to effectively manage the problems they will encounter in complex, team-based healthcare.

  11. Computer-Based Assessment of Complex Problem Solving: Concept, Implementation, and Application

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Holt, Daniel V.; Goldhammer, Frank; Funke, Joachim

    2013-01-01

    Complex Problem Solving (CPS) skills are essential to successfully deal with environments that change dynamically and involve a large number of interconnected and partially unknown causal influences. The increasing importance of such skills in the 21st century requires appropriate assessment and intervention methods, which in turn rely on adequate…

  12. Making transboundary risks governable: reducing complexity, constructing spatial identity, and ascribing capabilities.

    PubMed

    Lidskog, Rolf; Uggla, Ylva; Soneryd, Linda

    2011-03-01

    Environmental problems that cross national borders are attracting increasing public and political attention; regulating them involves coordinating the goals and activities of various governments, which often presupposes simplifying and standardizing complex knowledge, and finding ways to manage uncertainty. This article explores how transboundary environmental problems are dealt with to render complex issues governable. By discussing oil pollution in the Baltic Sea and the gas pipeline between Russia and Germany, we elucidate how boundaries are negotiated to make issues governable. Three processes are found to be particularly relevant to how involved actors render complex issues governable: complexity reduction, construction of a spatial identity for an issue, and ascription of capabilities to new or old actor constellations. We conclude that such regulation is always provisional, implying that existing regulation is always open for negotiation and criticism.

  13. Designing and Developing Assessments of Complex Thinking in Mathematics for the Middle Grades

    ERIC Educational Resources Information Center

    Graf, Edith Aurora; Arieli-Attali, Meirav

    2015-01-01

    Designing an assessment system for complex thinking in mathematics involves decisions at every stage, from how to represent the target competencies to how to interpret evidence from student performances. Beyond learning to solve particular problems in a particular area, learning mathematics with understanding involves comprehending connections…

  14. Guidelines for Teaching the Holocaust: Avoiding Common Pedagogical Errors

    ERIC Educational Resources Information Center

    Lindquist, David H.

    2006-01-01

    Teaching the Holocaust is a complex undertaking involving twists and turns that can frustrate and even intimidate educators who teach the Holocaust. This complexity involves both the event's history and its pedagogy. In this article, the author considers eight pedagogical approaches that often cause problems in teaching the event. He states each…

  15. Solving complex band structure problems with the FEAST eigenvalue algorithm

    NASA Astrophysics Data System (ADS)

    Laux, S. E.

    2012-08-01

    With straightforward extension, the FEAST eigenvalue algorithm [Polizzi, Phys. Rev. B 79, 115112 (2009)] is capable of solving the generalized eigenvalue problems representing traveling-wave problems—as exemplified by the complex band-structure problem—even though the matrices involved are complex, non-Hermitian, and singular, and hence outside the originally stated range of applicability of the algorithm. The obtained eigenvalues/eigenvectors, however, contain spurious solutions which must be detected and removed. The efficiency and parallel structure of the original algorithm are unaltered. The complex band structures of Si layers of varying thicknesses and InAs nanowires of varying radii are computed as test problems.
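
    A hedged sketch of the workflow the abstract describes, solving a complex, non-Hermitian, singular generalized eigenproblem and then filtering spurious eigenpairs by residual, is given below. It uses SciPy's dense generalized eigensolver on a small random pencil as a stand-in; it is not an implementation of the FEAST contour-integration algorithm, and the matrices and tolerance are assumptions.

```python
# Sketch of solving a small generalized eigenproblem A x = lambda B x with
# complex, non-Hermitian matrices and filtering out numerically spurious
# eigenpairs by their residual norm.  A dense solver is used as a stand-in;
# this is not the FEAST algorithm itself.
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B[:, 0] = 0.0                      # make B singular, as in the band-structure case

vals, vecs = eig(A, B)             # generalized eigenpairs; some may be infinite

kept = []
for lam, v in zip(vals, vecs.T):
    if not np.isfinite(lam):       # infinite eigenvalues from the singular pencil
        continue
    res = np.linalg.norm(A @ v - lam * (B @ v)) / max(np.linalg.norm(v), 1e-300)
    if res < 1e-8:                 # keep only well-converged, physical pairs
        kept.append(lam)

print(len(kept), "eigenvalues kept out of", n)
```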

  16. Symmetry, Contingency, Complexity: Accommodating Uncertainty in Public Relations Theory.

    ERIC Educational Resources Information Center

    Murphy, Priscilla

    2000-01-01

    Explores the potential of complexity theory as a unifying theory in public relations, where scholars have recently raised problems involving flux, uncertainty, adaptiveness, and loss of control. Describes specific complexity-based methodologies and their potential for public relations studies. Offers an account of complexity theory, its…

  17. Discovering Steiner Triple Systems through Problem Solving

    ERIC Educational Resources Information Center

    Sriraman, Bharath

    2004-01-01

    An attempt to implement problem solving as a teacher of ninth grade algebra is described. The problems selected were not general ones; they involved combinations, represented various situations, and were more complex, which led to the discovery of Steiner triple systems.

  18. Ordinal optimization and its application to complex deterministic problems

    NASA Astrophysics Data System (ADS)

    Yang, Mike Shang-Yu

    1998-10-01

    We present in this thesis a new perspective to approach a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve a high level of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexities remains to tax the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
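
    The goal-softening and ordinal-comparison idea summarized above can be sketched in a few lines: rank many candidate designs with a cheap, noisy estimate, keep a softened set of top-ranked candidates, and spend exact evaluation only on that subset. The objective function and noise level below are illustrative assumptions, not the thesis's turbine-blade model.

```python
# Toy sketch of the ordinal-optimization idea: screen many designs with a cheap
# but noisy estimate, keep a softened set of top-ranked candidates, and spend
# the expensive (here: exact) evaluation only on that subset.
import numpy as np

rng = np.random.default_rng(1)

def true_cost(x):                      # expensive "simulation" (exact here)
    return (x - 0.3) ** 2

def noisy_cost(x, sigma=0.05):         # crude model = truth + white noise
    return true_cost(x) + rng.normal(0.0, sigma, size=np.shape(x))

designs = rng.uniform(0.0, 1.0, size=1000)       # large design-space sample
ranked = designs[np.argsort(noisy_cost(designs))]

s = 20                                 # goal softening: accept any of the top s
selected = ranked[:s]
best = min(selected, key=true_cost)    # expensive evaluation on the small set only

print("best design found:", best, "true cost:", true_cost(best))
```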

  19. Tracking problem solving by multivariate pattern analysis and Hidden Markov Model algorithms.

    PubMed

    Anderson, John R

    2012-03-01

    Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy at determining both what problem-solving step the students are taking and whether they are performing that step correctly. The second "model discovery" application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and these substates are associated with different fluency in algebra problem solving. Copyright © 2011 Elsevier Ltd. All rights reserved.
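
    A minimal sketch of the hidden-state half of this methodology is given below, fitting a Gaussian hidden Markov model to synthetic multivariate "activity" sequences and decoding the most likely state at each time point. The hmmlearn package, the number of states, and the synthetic features are assumptions for illustration, not the author's pipeline.

```python
# Minimal sketch of decoding hidden problem-solving states from multivariate
# activity patterns with a Gaussian HMM, using the hmmlearn package on
# synthetic data.  Feature construction, state count, and package choice are
# illustrative assumptions only.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Synthetic "scan" sequence: three latent steps, each with its own mean pattern
# over 5 features, visited in order for 40 time points each.
means = np.array([[0, 0, 0, 0, 0], [2, 2, 0, 0, 0], [0, 0, 2, 2, 2]], float)
true_states = np.repeat([0, 1, 2], 40)
X = means[true_states] + 0.5 * rng.standard_normal((120, 5))

model = GaussianHMM(n_components=3, covariance_type="diag",
                    n_iter=200, random_state=0)
model.fit(X)                      # unsupervised fit of the state model
decoded = model.predict(X)        # most likely state at every time point

print("decoded state counts:", np.bincount(decoded))
```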

  20. Using Programmable Calculators to Solve Electrostatics Problems.

    ERIC Educational Resources Information Center

    Yerian, Stephen C.; Denker, Dennis A.

    1985-01-01

    Provides a simple routine which allows first-year physics students to use programmable calculators to solve otherwise complex electrostatic problems. These problems involve finding electrostatic potential and electric field on the axis of a uniformly charged ring. Modest programing skills are required of students. (DH)
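
    The on-axis results used in such an exercise are standard: V(z) = kQ/sqrt(z^2 + R^2) and E_z = kQz/(z^2 + R^2)^(3/2) for a ring of charge Q and radius R. A short routine of the kind students might program is sketched below, with arbitrary example values.

```python
# On-axis potential and field of a uniformly charged ring (standard results):
#   V(z)  = kQ / sqrt(z^2 + R^2)
#   E_z(z) = kQ z / (z^2 + R^2)^(3/2)
# The sample charge, radius, and distance are arbitrary illustrative values.
import math

K = 8.9875517923e9        # Coulomb constant, N m^2 / C^2

def ring_on_axis(q, radius, z):
    """Potential (V) and axial field (V/m) of a ring of charge q at distance z."""
    d2 = z**2 + radius**2
    potential = K * q / math.sqrt(d2)
    field_z = K * q * z / d2**1.5
    return potential, field_z

print(ring_on_axis(q=1e-9, radius=0.05, z=0.10))   # 1 nC ring, R = 5 cm, z = 10 cm
```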

  1. Moving Material into Space Without Rockets.

    ERIC Educational Resources Information Center

    Cheng, R. S.; Trefil, J. S.

    1985-01-01

    In response to conventional rocket demands on fuel supplies, electromagnetic launches were developed to give payloads high velocity using a stationary energy source. Several orbital mechanics problems are solved including a simple problem (radial launch with no rotation) and a complex problem involving air resistance and gravity. (DH)

  2. The Role of Problem Solving in Complex Intraverbal Repertoires

    ERIC Educational Resources Information Center

    Sautter, Rachael A.; LeBlanc, Linda A.; Jay, Allison A.; Goldsmith, Tina R.; Carr, James E.

    2011-01-01

    We examined whether typically developing preschoolers could learn to use a problem-solving strategy that involved self-prompting with intraverbal chains to provide multiple responses to intraverbal categorization questions. Teaching the children to use the problem-solving strategy did not produce significant increases in target responses until…

  3. Problem Solving and Comprehension. Third Edition.

    ERIC Educational Resources Information Center

    Whimbey, Arthur; Lochhead, Jack

    This book is directed toward increasing students' ability to analyze problems and comprehend what they read and hear. It outlines and illustrates the methods that good problem solvers use in attacking complex ideas, and provides practice in applying these methods to a variety of questions involving comprehension and reasoning. Chapter I includes a…

  4. Atwood's Machine as a Tool to Introduce Variable Mass Systems

    ERIC Educational Resources Information Center

    de Sousa, Celia A.

    2012-01-01

    This article discusses an instructional strategy which explores eventual similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the…

  5. Warning: Parental Involvement May Be Hazardous.

    ERIC Educational Resources Information Center

    Cooper, Mark J.; Mosley, Mary H.

    1999-01-01

    Principals should not presume that all parental involvement is good while ignoring adverse home conditions (such as divorce, abuse and neglect, coercive family interactions, mental-health problems, poverty, and unemployment) that may interfere with quality involvement. School-parent alliances are vital but will grow more complex as society…

  6. Students' conceptual performance on synthesis physics problems with varying mathematical complexity

    NASA Astrophysics Data System (ADS)

    Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan

    2017-06-01

    A body of research on physics problem solving has focused on single-concept problems. In this study we use "synthesis problems" that involve multiple concepts typically taught in different chapters. We use two types of synthesis problems, sequential and simultaneous synthesis tasks. Sequential problems require a consecutive application of fundamental principles, and simultaneous problems require a concurrent application of pertinent concepts. We explore students' conceptual performance when they solve quantitative synthesis problems with varying mathematical complexity. Conceptual performance refers to the identification, follow-up, and correct application of the pertinent concepts. Mathematical complexity is determined by the type and the number of equations to be manipulated concurrently due to the number of unknowns in each equation. Data were collected from written tasks and individual interviews administered to physics major students (N =179 ) enrolled in a second year mechanics course. The results indicate that mathematical complexity does not impact students' conceptual performance on the sequential tasks. In contrast, for the simultaneous problems, mathematical complexity negatively influences the students' conceptual performance. This difference may be explained by the students' familiarity with and confidence in particular concepts coupled with cognitive load associated with manipulating complex quantitative equations. Another explanation pertains to the type of synthesis problems, either sequential or simultaneous task. The students split the situation presented in the sequential synthesis tasks into segments but treated the situation in the simultaneous synthesis tasks as a single event.

  7. Tourette Syndrome: Overview and Classroom Interventions. A Complex Neurobehavioral Disorder Which May Involve Learning Problems, Attention Deficit Hyperactivity Disorder, Obsessive Compulsive Symptoms, and Stereotypical Behaviors.

    ERIC Educational Resources Information Center

    Fisher, Ramona A.; Collins, Edward C.

    Tourette Syndrome is conceptualized as a neurobehavioral disorder, with behavioral aspects that are sometimes difficult for teachers to understand and deal with. The disorder has five layers of complexity: (1) observable multiple motor, vocal, and cognitive tics and sensory involvement; (2) Attention Deficit Hyperactivity Disorder; (3)…

  8. Numerical Leak Detection in a Pipeline Network of Complex Structure with Unsteady Flow

    NASA Astrophysics Data System (ADS)

    Aida-zade, K. R.; Ashrafova, E. R.

    2017-12-01

    An inverse problem for a pipeline network of complex loopback structure is solved numerically. The problem is to determine the locations and amounts of leaks from unsteady flow characteristics measured at some pipeline points. The features of the problem include impulse functions involved in a system of hyperbolic differential equations, the absence of classical initial conditions, and boundary conditions specified as nonseparated relations between the states at the endpoints of adjacent pipeline segments. The problem is reduced to a parametric optimal control problem without initial conditions, but with nonseparated boundary conditions. The latter problem is solved by applying first-order optimization methods. Results of numerical experiments are presented.

  9. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1982-01-01

    The use of superminicomputers for solving a series of increasingly complex thermal analysis problems is investigated. The approach involved (1) installation and verification of the SPAR thermal analyzer software on superminicomputers at Langley Research Center and Goddard Space Flight Center, (2) solution of six increasingly complex thermal problems on this equipment, and (3) comparison of solution (accuracy, CPU time, turnaround time, and cost) with solutions on large mainframe computers.

  10. On Complex Water Conflicts: Role of Enabling Conditions for Pragmatic Resolution

    NASA Astrophysics Data System (ADS)

    Islam, S.; Choudhury, E.

    2016-12-01

    Many of our current and emerging water problems are interconnected and cross boundaries, domains, scales, and sectors. These boundary crossing water problems are neither static nor linear, but often are interconnected nonlinearly with other problems and feedback. The solution space for these complex problems - involving interdependent variables, processes, actors, and institutions - can't be pre-stated. We need to recognize the disconnect among values, interests, and tools as well as problems, policies, and politics. Scientific and technological solutions are desired for efficiency and reliability, but need to be politically feasible and actionable. Governing and managing complex water problems require difficult tradeoffs in exploring and sharing benefits and burdens through carefully crafted negotiation processes. The crafting of such a negotiation process, we argue, constitutes a pragmatic approach to negotiation - one that is based on the identification of enabling conditions - as opposed to mechanistic causal explanations, and rooted in contextual conditions to specify and ensure the principles of equity and sustainability. We will use two case studies to demonstrate the efficacy of the proposed principled pragmatic approach to address complex water problems.

  11. Use of a Computer Language in Teaching Dynamic Programming. Final Report.

    ERIC Educational Resources Information Center

    Trimble, C. J.; And Others

    Most optimization problems of any degree of complexity must be solved using a computer. In the teaching of dynamic programming courses, it is often desirable to use a computer in problem solution. The solution process involves conceptual formulation and computational solution. Generalized computer codes for dynamic programming problem solution…

  12. 77 FR 32183 - Transmission Planning and Cost Allocation by Transmission Owning and Operating Public Utilities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-31

    ... it would not wait for systemic problems to undermine transmission planning before action is taken... that the development of transmission facilities can involve long lead times and complex problems... rather than allowing the problems in transmission planning and cost allocation to continue or to increase...

  13. Differentiating Developmental Trajectories for Conduct, Emotion, and Peer Problems Following Early Deprivation

    ERIC Educational Resources Information Center

    Sonuga-Barke, Edmund J.; Schlotz, Wolff; Kreppner, Jana

    2010-01-01

    The development of conduct and emotional problems involves a complex interplay between environmental and genetic factors. The child-rearing environment contributes to this process. Gross deviations, such as those seen in abusive or neglectful homes, or where the parent has serious mental health problems, have been shown to contribute to the…

  14. DUII control system performance measures for Oregon counties 1991-2001

    DOT National Transportation Integrated Search

    2002-06-01

    Driving Under the Influence of Intoxicants (DUII) is a complex social problem that has origins in both internal and external system factors. Due to its complexity, Oregon communities and involved agencies must concentrate on addressing the negative r...

  15. Geometric and Algebraic Approaches in the Concept of Complex Numbers

    ERIC Educational Resources Information Center

    Panaoura, A.; Elia, I.; Gagatsis, A.; Giatilis, G.-P.

    2006-01-01

    This study explores pupils' performance and processes in tasks involving equations and inequalities of complex numbers requiring conversions from a geometric representation to an algebraic representation and conversions in the reverse direction, and also in complex numbers problem solving. Data were collected from 95 pupils of the final grade from…

  16. Division in a Binary Representation for Complex Numbers

    ERIC Educational Resources Information Center

    Blest, David C.; Jamil, Tariq

    2003-01-01

    Computer operations involving complex numbers, essential in such applications as Fourier transforms or image processing, are normally performed in a "divide-and-conquer" approach dealing separately with real and imaginary parts. A number of proposals have treated complex numbers as a single unit but all have foundered on the problem of the…
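
    The conventional real/imaginary-part ("divide-and-conquer") treatment of complex division mentioned above is the textbook identity (a + bi)/(c + di) = (ac + bd)/(c^2 + d^2) + i(bc - ad)/(c^2 + d^2). A brief sketch implementing and checking it follows; the paper's own single-unit binary representation is not reproduced here.

```python
# Conventional real/imaginary-part complex division:
#   (a + bi) / (c + di) = (ac + bd)/(c^2 + d^2) + i (bc - ad)/(c^2 + d^2)
# Checked against Python's built-in complex type; this is the standard approach
# the abstract contrasts with the paper's single-unit binary representation.
def complex_divide(a: float, b: float, c: float, d: float) -> complex:
    denom = c * c + d * d
    if denom == 0.0:
        raise ZeroDivisionError("division by 0 + 0i")
    return complex((a * c + b * d) / denom, (b * c - a * d) / denom)

z = complex_divide(3.0, 2.0, 1.0, -4.0)
print(z, (3 + 2j) / (1 - 4j))     # both print approximately (-0.294 + 0.824j)
```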

  17. Involving Hispanic Parents in Their Children's Education: Strategies that Work

    ERIC Educational Resources Information Center

    Murray, John Christopher

    2012-01-01

    The number of Hispanic children entering public schools continues to increase at a staggering pace. With such a change in diversity, educators are struggling with the absence of Hispanic parent involvement in schools. Many teachers consider this lack of parent involvement as uncaring about their children. The problem is much more complex in…

  18. Estimating occupancy rates with imperfect detection under complex survey designs

    EPA Science Inventory

    Monitoring the occurrence of specific amphibian species is of interest. Typically, the monitoring design is a complex design that involves stratification and unequal probability of selection. When conducting field visits to selected sites, a common problem is that during a singl...
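
    A sketch of the standard single-season occupancy model that underlies such designs, with occupancy probability psi and per-visit detection probability p fit by maximum likelihood, is given below on synthetic detection histories. The stratification and unequal selection probabilities of the actual survey design are not modeled, and all parameter values are illustrative.

```python
# Standard single-season occupancy model: each site is occupied with
# probability psi, and an occupied site is detected on each of K visits with
# probability p.  Fit here by maximum likelihood to synthetic detection
# histories; the real survey's stratification and unequal selection
# probabilities are not modeled.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # inverse-logit

rng = np.random.default_rng(0)
n_sites, K, psi_true, p_true = 200, 4, 0.6, 0.4
occupied = rng.random(n_sites) < psi_true
detections = (rng.random((n_sites, K)) < p_true) & occupied[:, None]

def neg_log_lik(theta):
    psi, p = expit(theta)                       # keep both in (0, 1)
    y = detections.sum(axis=1)
    # sites with at least one detection vs. all-zero histories
    ll_detected = np.log(psi) + y * np.log(p) + (K - y) * np.log(1 - p)
    ll_empty = np.log(psi * (1 - p) ** K + (1 - psi))
    return -np.sum(np.where(y > 0, ll_detected, ll_empty))

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
print("psi, p estimates:", expit(fit.x))
```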

  19. Providing Formative Assessment to Students Solving Multipath Engineering Problems with Complex Arrangements of Interacting Parts: An Intelligent Tutor Approach

    ERIC Educational Resources Information Center

    Steif, Paul S.; Fu, Luoting; Kara, Levent Burak

    2016-01-01

    Problems faced by engineering students involve multiple pathways to solution. Students rarely receive effective formative feedback on handwritten homework. This paper examines the potential for computer-based formative assessment of student solutions to multipath engineering problems. In particular, an intelligent tutor approach is adopted and…

  20. Geometric Series: A New Solution to the Dog Problem

    ERIC Educational Resources Information Center

    Dion, Peter; Ho, Anthony

    2013-01-01

    This article describes what is often referred to as the dog, beetle, mice, ant, or turtle problem. Solutions to this problem exist, some being variations of each other, which involve mathematics of a wide range of complexity. Herein, the authors describe the intuitive solution and the calculus solution and then offer a completely new solution…

  1. Sustainable aggregate production planning in the chemical process industry - A benchmark problem and dataset.

    PubMed

    Brandenburg, Marcus; Hahn, Gerd J

    2018-06-01

    Process industries typically involve complex manufacturing operations and thus require adequate decision support for aggregate production planning (APP). The need for powerful and efficient approaches to solve complex APP problems persists. Problem-specific solution approaches are advantageous compared to standardized approaches that are designed to provide basic decision support for a broad range of planning problems but inadequate to optimize under consideration of specific settings. This in turn calls for methods to compare different approaches regarding their computational performance and solution quality. In this paper, we present a benchmarking problem for APP in the chemical process industry. The presented problem focuses on (i) sustainable operations planning involving multiple alternative production modes/routings with specific production-related carbon emission and the social dimension of varying operating rates and (ii) integrated campaign planning with production mix/volume on the operational level. The mutual trade-offs between economic, environmental and social factors can be considered as externalized factors (production-related carbon emission and overtime working hours) as well as internalized ones (resulting costs). We provide data for all problem parameters in addition to a detailed verbal problem statement. We refer to Hahn and Brandenburg [1] for a first numerical analysis based on and for future research perspectives arising from this benchmarking problem.

  2. Medical Problem-Solving: A Critique of the Literature.

    ERIC Educational Resources Information Center

    McGuire, Christine H.

    1985-01-01

    Prescriptive, decision-analysis of medical problem-solving has been based on decision theory that involves calculation and manipulation of complex probability and utility values to arrive at optimal decisions that will maximize patient benefits. The studies offer a methodology for improving clinical judgment. (Author/MLW)

  3. Inviting Uncertainty into the Classroom

    ERIC Educational Resources Information Center

    Beghetto, Ronald A.

    2017-01-01

    Most teachers try to avoid having students experience uncertainty in their schoolwork. But if we want to prepare students to tackle complex problems (and the uncertainty that accompanies such problems), we must give them learning experiences that involve feeling unsure and sometimes even confused. Beghetto presents five strategies that help…

  4. Examining the Effects of Principals' Transformational Leadership on Teachers' Creative Practices and Students' Performance in Problem-Solving

    ERIC Educational Resources Information Center

    Owoh, Jeremy Strickland

    2015-01-01

    In today's technology enriched schools and workforces, creative problem-solving is involved in many aspects of a person's life. The educational systems of developed nations are designed to raise students who are creative and skillful in solving complex problems. Technology and the age of information require nations to develop generations of…

  5. An Optimization Model for Scheduling Problems with Two-Dimensional Spatial Resource Constraint

    NASA Technical Reports Server (NTRS)

    Garcia, Christopher; Rabadi, Ghaith

    2010-01-01

    Traditional scheduling problems involve determining temporal assignments for a set of jobs in order to optimize some objective. Some scheduling problems also require the use of limited resources, which adds another dimension of complexity. In this paper we introduce a spatial resource-constrained scheduling problem that can arise in assembly, warehousing, cross-docking, inventory management, and other areas of logistics and supply chain management. This scheduling problem involves a two-dimensional rectangular area as a limited resource. Each job, in addition to having temporal requirements, has a width and a height and utilizes a certain amount of space inside the area. We propose an optimization model for scheduling the jobs while respecting all temporal and spatial constraints.
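
    To make the problem data concrete, the sketch below defines jobs with a start, a duration, and a width-by-height footprint, and checks a necessary condition for feasibility (the total footprint of concurrently running jobs cannot exceed the area). This area-sum check is only a relaxation of the true 2-D packing constraint that the paper's optimization model enforces; the job values are assumptions.

```python
# Data sketch for the spatial resource-constrained scheduling problem: each job
# has a start time, duration, and a width x height footprint inside a shared
# rectangular area.  The check below enforces only a necessary condition
# (summed footprint of concurrent jobs <= area); it ignores the actual 2-D
# packing that the paper's optimization model resolves.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    start: float
    duration: float
    width: float
    height: float

    @property
    def end(self) -> float:
        return self.start + self.duration

def area_feasible(jobs, area_w: float, area_h: float) -> bool:
    if any(j.width > area_w or j.height > area_h for j in jobs):
        return False                        # a single job must fit by itself
    capacity = area_w * area_h
    for t in sorted({j.start for j in jobs}):   # usage only rises at start times
        used = sum(j.width * j.height for j in jobs if j.start <= t < j.end)
        if used > capacity:                 # too much total footprint at time t
            return False
    return True

jobs = [Job("A", 0, 5, 4, 3), Job("B", 2, 4, 3, 3), Job("C", 6, 2, 5, 5)]
print(area_feasible(jobs, area_w=6, area_h=5))
```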

  6. Learning to See the (W)holes

    ERIC Educational Resources Information Center

    Burns, Barbara A.; Jordan, Thomas M.

    2006-01-01

    Business managers are faced with complex decisions involving a wide range of issues--technical, social, environmental, and financial--and their interaction. Our education system focuses heavily on presenting structured problems and teaching students to apply a set of tools or methods to solve these problems. Yet the most difficult thing to teach…

  7. Design and Implementation of the Game-Design and Learning Program

    ERIC Educational Resources Information Center

    Akcaoglu, Mete

    2016-01-01

    Design involves solving complex, ill-structured problems. Design tasks are consequently, appropriate contexts for children to exercise higher-order thinking and problem-solving skills. Although creating engaging and authentic design contexts for young children is difficult within the confines of traditional schooling, recently, game-design has…

  8. Outdoor Recreation Management

    ERIC Educational Resources Information Center

    Jubenville, Alan

    The complex problems facing the manager of an outdoor recreation area are outlined and discussed. Eighteen chapters cover the following primary concerns of the manager of such a facility: (1) an overview of the management process; (2) the basic outdoor recreation management model; (3) the problem-solving process; (4) involvement of the public in…

  9. Early Career Teacher Attrition: Intentions of Teachers Beginning

    ERIC Educational Resources Information Center

    Clandinin, D. Jean; Long, Julie; Schaefer, Lee; Downey, C. Aiden; Steeves, Pam; Pinnegar, Eliza; McKenzie Robblee, Sue; Wnuk, Sheri

    2015-01-01

    Early career teacher attrition has most often been conceptualized as either a problem associated with individual factors (e.g. burnout) or a problem associated with contextual factors (e.g. support and salary). This study considered early career teacher attrition as an identity making process that involves a complex negotiation between individual…

  10. Examining diversity inequities in fisheries science: a call to action

    Treesearch

    Ivan Arismendi; Brooke E. Penaluna

    2016-01-01

    A diverse workforce in science can bring about competitive advantages, innovation, and new knowledge, skills, and experiences for understanding complex problems involving the science and management of natural resources. In particular, fisheries sciences confronts exceptional challenges because of complicated societal-level problems from the overexploitation and...

  11. Unsteady, one-dimensional gas dynamics computations using a TVD type sequential solver

    NASA Technical Reports Server (NTRS)

    Thakur, Siddharth; Shyy, Wei

    1992-01-01

    The efficacy of high resolution convection schemes to resolve sharp gradients in unsteady, 1D flows is examined using the TVD concept based on a sequential solution algorithm. Two unsteady flow problems are considered which include the problem involving the interaction of the various waves in a shock tube with closed reflecting ends and the problem involving the unsteady gas dynamics in a tube with closed ends subject to an initial pressure perturbation. It is concluded that high accuracy convection schemes in a sequential solution framework are capable of resolving discontinuities in unsteady flows involving complex gas dynamics. However, a sufficient amount of dissipation is required to suppress oscillations near discontinuities in the sequential approach, which leads to smearing of the solution profiles.
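
    A minimal TVD-type example in the spirit of the schemes examined, minmod-limited MUSCL reconstruction with an upwind flux and a two-stage SSP Runge-Kutta step for linear advection of a square wave, is sketched below. It is a generic illustration, not the authors' sequential gas-dynamics solver.

```python
# Minimal TVD-type scheme for linear advection u_t + a u_x = 0 on a periodic
# domain: minmod-limited MUSCL reconstruction, upwind flux, two-stage SSP
# Runge-Kutta time step.  A generic illustration only.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def rhs(u, a, dx):
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)   # limited slopes
    u_left = u + 0.5 * slope                                 # value at i+1/2 from cell i
    flux = a * u_left                                        # upwind flux for a > 0
    return -(flux - np.roll(flux, 1)) / dx

n, a, cfl = 200, 1.0, 0.4
dx = 1.0 / n
dt = cfl * dx / a
x = (np.arange(n) + 0.5) * dx
u = np.where((x > 0.3) & (x < 0.6), 1.0, 0.0)                # square wave

for _ in range(int(0.25 / dt)):          # advect a quarter of the domain
    u1 = u + dt * rhs(u, a, dx)          # SSP RK2 (Heun) keeps the update TVD
    u = 0.5 * u + 0.5 * (u1 + dt * rhs(u1, a, dx))

print("min/max after advection:", u.min(), u.max())  # stays essentially in [0, 1]
```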

  12. [Questions concerning humanitarian action].

    PubMed

    Simonnot, C

    2002-01-01

    Although the development of humanitarian action is rooted in historical events, the dynamics behind today's international relief organizations can only be understood within the context of the modern world. Relief organizations are currently confronted with major challenges and paradoxes. The challenges include the need to enhance professionalization and standardization of assistance operations and exposure to greater risks. The paradoxes involve the need to implement complex, highly publicized programs in a simplistic manner and problems involved in managing the complex relationship between relief workers and victims, tainted with the almighty powers of the actors.

  13. HIA, the next step: Defining models and roles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Putters, Kim

    If HIA is to be an effective instrument for optimising health interests in the policy making process it has to recognise the different contexts in which policy is made and the relevance of both technical rationality and political rationality. Policy making may adopt a rational perspective in which there is a systematic and orderly progression from problem formulation to solution or a network perspective in which there are multiple interdependencies, extensive negotiation and compromise, and the steps from problem to formulation are not followed sequentially or in any particular order. Policy problems may be simple with clear causal pathways and responsibilities or complex with unclear causal pathways and disputed responsibilities. Network analysis is required to show which stakeholders are involved, their support for health issues and the degree of consensus. From this analysis three models of HIA emerge. The first is the phases model which is fitted to simple problems and a rational perspective of policymaking. This model involves following structured steps. The second model is the rounds (Echternach) model that is fitted to complex problems and a network perspective of policymaking. This model is dynamic and concentrates on network solutions taking these steps in no particular order. The final model is the 'garbage can' model fitted to contexts which combine simple and complex problems. In this model HIA functions as a problem solver and signpost keeping all possible solutions and stakeholders in play and allowing solutions to emerge over time. HIA models should be the beginning rather than the conclusion of discussion between the worlds of HIA and policymaking.

  14. COED Transactions, Vol. IX, No. 3, March 1977. Evaluation of a Complex Variable Using Analog/Hybrid Computation Techniques.

    ERIC Educational Resources Information Center

    Marcovitz, Alan B., Ed.

    Described is the use of an analog/hybrid computer installation to study those physical phenomena that can be described through the evaluation of an algebraic function of a complex variable. This is an alternative way to study such phenomena on an interactive graphics terminal. The typical problem used, involving complex variables, is that of…

  15. Challenges to interdisciplinary discourse

    Treesearch

    David N. Wear

    1999-01-01

    Many of the world's critical problems involve human interactions with nature and their long-term implications for environmental quality and the sustainability of resource/ecological systems. These problems are complex, defined by the collective behaviors of people as well as by the structure and function of ecosystems, suggesting that both the social and the...

  16. Exploring Essential Conditions: A Commentary on Bull et al. (2008)

    ERIC Educational Resources Information Center

    Borthwick, Arlene; Hansen, Randall; Gray, Lucy; Ziemann, Irina

    2008-01-01

    The editorial by Bull et al. (2008) on connections between informal and formal learning made explicit one element of solving what Koehler and Mishra (2008) termed a "wicked problem." This wicked (complex, ill-structured) problem involves working with teachers for effective integration of technology in support of student learning. The…

  17. Research on the Caretaking of Children of Incarcerated Parents: Findings and Their Service Delivery Implications

    PubMed Central

    Hanlon, Thomas E.; Carswell, Steven B.; Rose, Marc

    2007-01-01

    This paper reviews research findings on caretaking-related problems associated with the absence of parents from the home following incarceration. It focuses on the impact of incarceration on the welfare and adjustment of urban African American children and on the assumption of caretaking responsibilities by other caretakers, principally maternal grandmothers. Noting the complex situational difficulties involved and the potential burdens associated with surrogate parenting in general, and with this population in particular, the service-provider implications of this parenting arrangement are considered in this review. Findings indicate that problems associated with incarceration of parents tend to be intergenerational and vary considerably in complexity and severity. To the extent that they impact the children involved, these issues should be addressed in coordinated service delivery focusing on prevention. PMID:18311320

  18. Potential Uses of Bayesian Networks as Tools for Synthesis of Systematic Reviews of Complex Interventions

    ERIC Educational Resources Information Center

    Stewart, G. B.; Mengersen, K.; Meader, N.

    2014-01-01

    Bayesian networks (BNs) are tools for representing expert knowledge or evidence. They are especially useful for synthesising evidence or belief concerning a complex intervention, assessing the sensitivity of outcomes to different situations or contextual frameworks and framing decision problems that involve alternative types of intervention.…

  19. Derived heuristics-based consistent optimization of material flow in a gold processing plant

    NASA Astrophysics Data System (ADS)

    Myburgh, Christie; Deb, Kalyanmoy

    2018-01-01

    Material flow in a chemical processing plant often follows complicated control laws and involves plant capacity constraints. Importantly, the process involves discrete scenarios which, when modelled in a programming format, involve if-then-else statements. Therefore, a formulation of an optimization problem of such processes becomes complicated with nonlinear and non-differentiable objective and constraint functions. In handling such problems using classical point-based approaches, users often have to resort to modifications and indirect ways of representing the problem to suit the restrictions associated with classical methods. In a particular gold processing plant optimization problem, these facts are demonstrated by showing results from MATLAB®'s well-known fmincon routine. Thereafter, a customized evolutionary optimization procedure which is capable of handling all complexities offered by the problem is developed. Although the evolutionary approach produced results with comparatively less variance over multiple runs, the performance has been enhanced by introducing derived heuristics associated with the problem. In this article, the development and usage of derived heuristics in a practical problem are presented and their importance in a quick convergence of the overall algorithm is demonstrated.

  20. High order solution of Poisson problems with piecewise constant coefficients and interface jumps

    NASA Astrophysics Data System (ADS)

    Marques, Alexandre Noll; Nave, Jean-Christophe; Rosales, Rodolfo Ruben

    2017-04-01

    We present a fast and accurate algorithm to solve Poisson problems in complex geometries, using regular Cartesian grids. We consider a variety of configurations, including Poisson problems with interfaces across which the solution is discontinuous (of the type arising in multi-fluid flows). The algorithm is based on a combination of the Correction Function Method (CFM) and Boundary Integral Methods (BIM). Interface and boundary conditions can be treated in a fast and accurate manner using boundary integral equations, and the associated BIM. Unfortunately, BIM can be costly when the solution is needed everywhere in a grid, e.g. fluid flow problems. We use the CFM to circumvent this issue. The solution from the BIM is used to rewrite the problem as a series of Poisson problems in rectangular domains, which requires the BIM solution at interfaces/boundaries only. These Poisson problems involve discontinuities at interfaces, of the type that the CFM can handle. Hence we use the CFM to solve them (to high order of accuracy) with finite differences and a Fast Fourier Transform based fast Poisson solver. We present 2-D examples of the algorithm applied to Poisson problems involving complex geometries, including cases in which the solution is discontinuous. We show that the algorithm produces solutions that converge with either 3rd or 4th order of accuracy, depending on the type of boundary condition and solution discontinuity.
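
    The FFT-based fast Poisson solve that serves as the building block on rectangular domains can be illustrated on a doubly periodic box, as sketched below; the Correction Function and Boundary Integral machinery for interfaces is not reproduced. The grid size and manufactured solution are assumptions.

```python
# FFT-based fast Poisson solve on a doubly periodic box, the kind of building
# block the method uses on rectangular domains (the paper's correction-function
# and boundary-integral machinery is not reproduced).  Solves laplacian(u) = f
# for zero-mean data against a manufactured solution.
import numpy as np

n = 128
L = 2 * np.pi
x = np.linspace(0.0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")

u_exact = np.sin(X) * np.cos(2 * Y)
f = -(1 + 4) * u_exact                       # laplacian of the exact solution

k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi   # angular wavenumbers
KX, KY = np.meshgrid(k, k, indexing="ij")
k2 = KX**2 + KY**2
k2[0, 0] = 1.0                               # avoid division by zero for the mean mode

u_hat = -np.fft.fft2(f) / k2
u_hat[0, 0] = 0.0                            # fix the (arbitrary) mean of u
u = np.real(np.fft.ifft2(u_hat))

print("max error:", np.max(np.abs(u - u_exact)))
```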

  1. An Algorithm for Integrated Subsystem Embodiment and System Synthesis

    NASA Technical Reports Server (NTRS)

    Lewis, Kemper

    1997-01-01

    Consider the statement,'A system has two coupled subsystems, one of which dominates the design process. Each subsystem consists of discrete and continuous variables, and is solved using sequential analysis and solution.' To address this type of statement in the design of complex systems, three steps are required, namely, the embodiment of the statement in terms of entities on a computer, the mathematical formulation of subsystem models, and the resulting solution and system synthesis. In complex system decomposition, the subsystems are not isolated, self-supporting entities. Information such as constraints, goals, and design variables may be shared between entities. But many times in engineering problems, full communication and cooperation does not exist, information is incomplete, or one subsystem may dominate the design. Additionally, these engineering problems give rise to mathematical models involving nonlinear functions of both discrete and continuous design variables. In this dissertation an algorithm is developed to handle these types of scenarios for the domain-independent integration of subsystem embodiment, coordination, and system synthesis using constructs from Decision-Based Design, Game Theory, and Multidisciplinary Design Optimization. Implementation of the concept in this dissertation involves testing of the hypotheses using example problems and a motivating case study involving the design of a subsonic passenger aircraft.

  2. Expanding the Reach of Physics - Engaging Students in Interdisciplinary Research Involving Complex, Real-World Situations

    NASA Astrophysics Data System (ADS)

    Bililign, Solomon

    2014-03-01

    Physics plays a very important role in most interdisciplinary efforts and can provide a solid foundation for students. Retention of students in STEM areas can be facilitated by enhanced interdisciplinary education and research since students are strongly attracted to research with societal relevance and show increasing enthusiasm about problems that have practical consequences. One such area of research is a collaborative Earth System Science. The Earth System is dynamic and complex. It is comprised of diverse components that interact. By providing students the opportunities to work in interdisciplinary groups on a problem that reflects a complex, real-world situation they can see the linkages between components of the Earth system that encompass climate and all its components (weather precipitation, temperature, etc.) and technology development and deployment of sensors and sensor networks and social impacts. By involving students in the creation of their own personalized professional development plan, students are more focused and engaged and are more likely to remain in the program.

  3. Using three-dimensional silicone "boots" to solve complex remedial design problems in curtain walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y.J.

    1998-12-31

    Stick system curtain wall leak problems are frequently caused by water entry at the splice joints of the curtain wall frame and failure of the internal metal joinery seals. Remedial solutions involving occupied buildings inevitably face the multiple constraints of existing construction and business operations not present during the original curtain wall construction. In most cases, even partial disassembly of the curtain wall for internal seal repairs is not feasible. Remedial solutions which must be executed from the exterior of the curtain wall often involve wet-applied or preformed sealant tape bridge joints. However, some of the more complex joints cannot be repaired effectively or economically with the conventional bridge joint. Fortunately, custom fabricated three-dimensional preformed sealant boots are becoming available to address these situations. This paper discusses the design considerations and the selective use of three-dimensional preformed boots in sealing complex joint geometry that would not be effective with the conventional two-dimensional bridge joint.

  4. Prebiotic coordination chemistry: The potential role of transition-metal complexes in the chemical evolution

    NASA Technical Reports Server (NTRS)

    Beck, M.

    1979-01-01

    In approaching the extremely involved and complex problem of the origin of life, consideration of the coordination chemistry appeared not only as a possibility but as a necessity. The first model experiments appear to be promising because of prebiotic-type synthesis by means of transition-metal complexes. It is especially significant that in some instances various types of vitally important substances (nucleic bases, amino acids) are formed simultaneously. There is ground to hope that systematic studies in this field will clarify the role of transition-metal complexes in the organizational phase of chemical evolution. It is obvious that researchers working in the fields of the chemistry of cyano and carbonyl complexes, and of the catalytic effect of transition-metal complexes are best suited to study these aspects of the attractive and interesting problem of the origin of life.

  5. Problems in Calculating and Comparing Dropout Rates. ERS Research Digest.

    ERIC Educational Resources Information Center

    Ligon, Glynn; And Others

    1990-01-01

    This paper dramatizes the complexity and the problems involved in calculating the rates of student dropouts from school. To compare the dropout formulas used by various agencies, states, and local school systems, responses from a national survey are presented and used to calculate a range of dropout rates for the Austin (Texas) public schools. By…

  6. Biological system interactions.

    PubMed Central

    Adomian, G; Adomian, G E; Bellman, R E

    1984-01-01

    Mathematical modeling of cellular population growth, interconnected subsystems of the body, blood flow, and numerous other complex biological systems problems involves nonlinearities and generally randomness as well. Such problems have been dealt with by mathematical methods often changing the actual model to make it tractable. The method presented in this paper (and referenced works) allows much more physically realistic solutions. PMID:6585837

  7. The PLATO PPTK System: An Alternative Keyboard Using the PLATO Computer-Based Education System for the Orthopedically Handicapped.

    ERIC Educational Resources Information Center

    Goodman, William J.

    Developed in response to the complex problems involved in providing equal educational opportunities for the intellectually alert orthopedically handicapped, the PLATO Programmable Terminal Keyset (PPTK) system makes the resources of PLATO compatible to the functional problems of a wide range of orthopedic conditions. This report describes the…

  8. The Representation of Anatomical Structures through Computer Animation for Scientific, Educational and Artistic Applications.

    ERIC Educational Resources Information Center

    Stredney, Donald Larry

    An overview of computer animation and the techniques involved in its creation is provided in the introduction to this masters thesis, which focuses on the problems encountered by students in learning the forms and functions of complex anatomical structures and ways in which computer animation can address these problems. The objectives for,…

  9. Getting Along: Negotiating Authority in High Schools. Final Report.

    ERIC Educational Resources Information Center

    Farrar, Eleanor; Neufeld, Barbara

    Appropriate responses to the authority problem in schools can be informed by a more complex understanding of the issue. Also of importance is knowledge of the ways in which schools and society at large are involved with both the creation of and the solution to the problem of student/teacher authority relations. School people are referring…

  10. Student Learning of Complex Earth Systems: Conceptual Frameworks of Earth Systems and Instructional Design

    ERIC Educational Resources Information Center

    Scherer, Hannah H.; Holder, Lauren; Herbert, Bruce

    2017-01-01

    Engaging students in authentic problem solving concerning environmental issues in near-surface complex Earth systems involves both developing student conceptualization of Earth as a system and applying that scientific knowledge using techniques that model those used by professionals. In this first paper of a two-part series, we review the state of…

  11. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    PubMed

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results. First, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT. Second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  12. Stalking the IQ Quark.

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    1979-01-01

    An information-processing framework is presented for understanding intelligence. Two levels of processing are discussed: the steps involved in solving a complex intellectual task, and higher-order processes used to decide how to solve the problem. (MH)

  13. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers and Automation Technology, Number 35

    DTIC Science & Technology

    1978-09-12

    the population. Only a socialist, planned economy can cope with such problems. However, the increasing complexity of the tasks faced by...the development of systems allowing man-machine dialogue does not decrease, but rather increases, the complexity of the systems involved, simply...shifting the complexity to another sphere, where it is invisible to the human utilizing the system. Figures 5; references 3: 2 Russian, 1 Western

  14. Critical education in resource and environmental management: Learning and empowerment for a sustainable future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diduck, A.

    1999-10-01

    Changing from current patterns of resource use to a sustainable and equitable economy is a complex and intractable problem. This paper suggests that critical education may form part of the solution. Critical environmental assessment (EA) education, the model explored in this paper, offers a tool for resource and environmental managers to use in managing public involvement processes. This model challenges current patterns of resource use and addresses criticisms of public involvement processes. Critical EA education, involving both cognitive development and personal empowerment, focuses on critical intelligence, problem solving and social action. The concept is offered as a means to facilitate and improve public involvement and, thereby, empower local communities to take greater control of resource use decisions affecting their lives. Positive implications of critical EA education for change, complexity, uncertainty and conflict, which are four enduring themes in resource and environmental management, are discussed in the paper. The implications include: cognitive development and personal empowerment at the level of local resource communities; simplification of the often complex discourse encountered in resource management; reduction in feelings of powerlessness often experienced by members of the public in environmental assessment scenarios; a reduction of ignorance and indeterminacy regarding resource management issues; conflict resolution at the cognitive level; and, clarification of the opposing values, interests or actions at the heart of a conflict.

  15. An approach to complex acid-base problems

    PubMed Central

    Herd, Anthony M.

    2005-01-01

    OBJECTIVE To review rules and formulas for solving even the most complex acid-base problems. SOURCES OF INFORMATION MEDLINE was searched from January 1966 to December 2003. The search was limited to English-language review articles involving human subjects. Nine relevant review papers were found and provide the background. As this information is well established and widely accepted, it is not judged for strength of evidence, as is standard practice. MAIN MESSAGE An understanding of the body’s responses to acidemia or alkalemia can be gained through a set of four rules and two formulas that can be used to interpret almost any acid-base problem. Physicians should, however, remember the “golden rule” of acid-base interpretation: always look at a patient’s clinical condition. CONCLUSION Physicians practising in acute care settings commonly encounter acid-base disturbances. While some of these are relatively simple and easy to interpret, some are more complex. Even complex cases can be resolved using the four rules and two formulas. PMID:15751566

  16. Advancing efforts to address youth violence involvement.

    PubMed

    Weist, M D; Cooley-Quille, M

    2001-06-01

    Discusses the increased public attention on violence-related problems among youth and the concomitant increased diversity in research. Youth violence involvement is a complex construct that includes violence experienced in multiple settings (home, school, neighborhood) and in multiple forms (as victims, witnesses, perpetrators, and through family members, friends, and the media). Potential impacts of such violence involvement are considerable, including increased internalizing and externalizing behaviors among youth and future problems in school adjustment and life-course development. This introductory article reviews key dimensions of youth-related violence, describes an American Psychological Association Task Force (Division 12) developed to advance relevant research, and presents examples of national resources and efforts that attempt to address this critical public health issue.

  17. Aeropropulsion 1987. Session 2: Aeropropulsion Structures Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Aeropropulsion systems present unique problems to the structural engineer. The extremes in operating temperatures, rotational effects, and behaviors of advanced material systems combine into complexities that require advances in many scientific disciplines involved in structural analysis and design procedures. This session provides an overview of the complexities of aeropropulsion structures and the theoretical, computational, and experimental research conducted to achieve the needed advances.

  18. The Complexities of Participatory Action Research and the Problems of Power, Identity and Influence

    ERIC Educational Resources Information Center

    Hawkins, Karen A.

    2015-01-01

    This article highlights the complexity of participatory action research (PAR) in that the study outlined was carried out with and by, as opposed to on, participants. The project was contextualised in two prior-to-school settings in Australia, with the early childhood professionals and, to some extent, the preschoolers involved in this PAR project…

  19. An Integrated Constraint Programming Approach to Scheduling Sports Leagues with Divisional and Round-robin Tournaments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey

    Previous approaches for scheduling a league with round-robin and divisional tournaments involved decomposing the problem into easier subproblems. This approach, used to schedule the top Swedish handball league Elitserien, reduces the problem complexity but can result in suboptimal schedules. This paper presents an integrated constraint programming model that allows the scheduling to be performed in a single step. Particular attention is given to identifying implied and symmetry-breaking constraints that reduce the computational complexity significantly. Experimental evaluation shows that the integrated approach requires considerably less computational effort than the previous approach.
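
    To make the constraint programming idea concrete, the following sketch models a plain single round-robin tournament with OR-Tools CP-SAT. It is only a toy stand-in for the integrated Elitserien model: the divisional structure, implied constraints and symmetry breaking discussed in the record are omitted, and the solver choice is an assumption.

```python
# Minimal single round-robin constraint model (illustrative only; the paper's
# Elitserien model with divisional tournaments is far richer).
from itertools import combinations
from ortools.sat.python import cp_model

n_teams = 4
rounds = range(n_teams - 1)          # a single round robin needs n-1 rounds
pairs = list(combinations(range(n_teams), 2))

model = cp_model.CpModel()
# x[i, j, r] = 1 if teams i and j meet in round r.
x = {(i, j, r): model.NewBoolVar(f"x_{i}_{j}_{r}") for (i, j) in pairs for r in rounds}

# Every pair of teams meets exactly once over the season.
for (i, j) in pairs:
    model.Add(sum(x[i, j, r] for r in rounds) == 1)

# Every team plays exactly one match per round.
for t in range(n_teams):
    for r in rounds:
        model.Add(sum(x[i, j, r] for (i, j) in pairs if t in (i, j)) == 1)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for r in rounds:
        games = [(i, j) for (i, j) in pairs if solver.Value(x[i, j, r])]
        print(f"round {r}: {games}")
```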

  20. The bright side of being blue: Depression as an adaptation for analyzing complex problems

    PubMed Central

    Andrews, Paul W.; Thomson, J. Anderson

    2009-01-01

    Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression. PMID:19618990

  1. Scaffolding Students' Skill Development by First Introducing Advanced Techniques through the Synthesis and [superscript 15]N NMR Analysis of Cinnamamides

    ERIC Educational Resources Information Center

    Shuldburg, Sara; Carroll, Jennifer

    2017-01-01

    An advanced undergraduate experiment involving the synthesis and characterization of a series of six unique cinnamamides is described. This experiment allows for a progressive mastery of skills students need to tackle more complex NMR structure elucidation problems. Characterization of the products involves IR spectroscopy, GCMS, and proton,…

  2. Landscape change in the southern Piedmont: challenges, solutions, and uncertainty across scales

    USGS Publications Warehouse

    Conroy, M.J.; Allen, Craig R.; Peterson, J.T.; Pritchard, L.J.; Moore, C.T.

    2003-01-01

    The southern Piedmont of the southeastern United States epitomizes the complex and seemingly intractable problems and hard decisions that result from uncontrolled urban and suburban sprawl. Here we consider three recurrent themes in complicated problems involving complex systems: (1) scale dependencies and cross-scale, often nonlinear relationships; (2) resilience, in particular the potential for complex systems to move to alternate stable states with decreased ecological and/or economic value; and (3) uncertainty in the ability to understand and predict outcomes, perhaps particularly those that occur as a result of human impacts. We consider these issues in the context of landscape-level decision making, using as an example water resources and lotic systems in the Piedmont region of the southeastern United States.

  3. Equilibrium expert: an add-in to Microsoft Excel for multiple binding equilibrium simulations and parameter estimations.

    PubMed

    Raguin, Olivier; Gruaz-Guyon, Anne; Barbet, Jacques

    2002-11-01

    An add-in to Microsoft Excel was developed to simulate multiple binding equilibriums. A partition function, readily written even when the equilibrium is complex, describes the experimental system. It involves the concentrations of the different free molecular species and of the different complexes present in the experiment. As a result, the software is not restricted to a series of predefined experimental setups but can handle a large variety of problems involving up to nine independent molecular species. Binding parameters are estimated by nonlinear least-square fitting of experimental measurements as supplied by the user. The fitting process allows user-defined weighting of the experimental data. The flexibility of the software and the way it may be used to describe common experimental situations and to deal with usual problems such as tracer reactivity or nonspecific binding is demonstrated by a few examples. The software is available free of charge upon request.
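
    The record describes estimating binding parameters by nonlinear least-squares fitting against a user-supplied partition function. A minimal stand-in for the simplest case, a 1:1 binding isotherm fitted with SciPy rather than the Excel add-in, is sketched below; the Bmax, Kd and synthetic data are assumptions.

```python
# Minimal nonlinear least-squares fit of a 1:1 binding isotherm; a stand-in
# for the much more general partition-function approach of the Excel add-in.
# The Bmax, Kd and synthetic data below are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def bound(free_ligand, bmax, kd):
    # Amount bound for simple 1:1 binding: B = Bmax * L / (Kd + L)
    return bmax * free_ligand / (kd + free_ligand)

ligand = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])               # e.g. nM
signal = bound(ligand, bmax=100.0, kd=2.0)
signal += np.random.default_rng(0).normal(0.0, 2.0, ligand.size)   # noise

params, cov = curve_fit(bound, ligand, signal, p0=[80.0, 1.0])
perr = np.sqrt(np.diag(cov))
print(f"Bmax = {params[0]:.1f} +/- {perr[0]:.1f}")
print(f"Kd   = {params[1]:.2f} +/- {perr[1]:.2f}")
```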

  4. Multicriteria decision analysis: Overview and implications for environmental decision making

    USGS Publications Warehouse

    Hermans, Caroline M.; Erickson, Jon D.; Erickson, Jon D.; Messner, Frank; Ring, Irene

    2007-01-01

    Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
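
    A toy version of the PROMETHEE outranking calculation mentioned above is sketched below: pairwise preference degrees are aggregated with criterion weights and turned into net flows that rank the alternatives. The alternatives, weights and linear preference function are illustrative assumptions, not the watershed study's data.

```python
import numpy as np

# Toy PROMETHEE II ranking (illustrative; weights, criteria and the linear
# preference function with threshold p are assumptions, not the study's data).
# Rows = alternatives, columns = criteria; all criteria are "maximize".
scores = np.array([
    [7.0, 3.0, 5.0],   # alternative A
    [5.0, 6.0, 4.0],   # alternative B
    [6.0, 4.0, 7.0],   # alternative C
])
weights = np.array([0.5, 0.3, 0.2])
p = 2.0  # preference threshold for the linear preference function

def preference(d):
    # Linear preference: 0 for no advantage, rises linearly, saturates at 1 beyond p.
    return np.clip(d / p, 0.0, 1.0)

n = scores.shape[0]
pi = np.zeros((n, n))  # aggregated preference of alternative i over j
for i in range(n):
    for j in range(n):
        if i != j:
            pi[i, j] = np.sum(weights * preference(scores[i] - scores[j]))

phi_plus = pi.sum(axis=1) / (n - 1)    # how strongly i outranks the others
phi_minus = pi.sum(axis=0) / (n - 1)   # how strongly i is outranked
net_flow = phi_plus - phi_minus        # PROMETHEE II net flow used for ranking
for name, phi in zip("ABC", net_flow):
    print(name, round(float(phi), 3))
```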

  5. An Optimization of Manufacturing Systems using a Feedback Control Scheduling Model

    NASA Astrophysics Data System (ADS)

    Ikome, John M.; Kanakana, Grace M.

    2018-03-01

    In complex production systems that involve multiple processes, unplanned disruptions often make the entire production system vulnerable to a range of problems, leading to customer dissatisfaction. This ongoing problem calls for research and methods to streamline the entire process, or for a model that addresses it directly. In response, we have developed a feedback scheduling model that can minimize some of these problems; a series of experiments shows that several of them can be eliminated if the correct remedial actions are implemented on time.

  6. Dependency visualization for complex system understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smart, J. Allison Cory

    1994-09-01

    With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now being sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed graph based concepts in software tool design has been demonstrated in literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.

  7. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach

    PubMed Central

    2016-01-01

    Background Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. Purpose It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. Method We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be an effective way to model complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach. PMID:26812235

  8. A survey of automated methods for sensemaking support

    NASA Astrophysics Data System (ADS)

    Llinas, James

    2014-05-01

    Complex, dynamic problems in general present a challenge for the design of analysis support systems and tools largely because there is limited reliable a priori procedural knowledge descriptive of the dynamic processes in the environment. Problem domains that are non-cooperative or adversarial impute added difficulties involving suboptimal observational data and/or data containing the effects of deception or covertness. The fundamental nature of analysis in these environments is based on composite approaches involving mining or foraging over the evidence, discovery and learning processes, and the synthesis of fragmented hypotheses; together, these can be labeled as sensemaking procedures. This paper reviews and analyzes the features, benefits, and limitations of a variety of automated techniques that offer possible support to sensemaking processes in these problem domains.

  9. Potential-splitting approach applied to the Temkin-Poet model for electron scattering off the hydrogen atom and the helium ion

    NASA Astrophysics Data System (ADS)

    Yarevsky, E.; Yakovlev, S. L.; Larson, Å; Elander, N.

    2015-06-01

    The study of scattering processes in few body systems is a difficult problem especially if long range interactions are involved. In order to solve such problems, we develop here a potential-splitting approach for three-body systems. This approach is based on splitting the reaction potential into a finite range core part and a long range tail part. The solution to the Schrödinger equation for the long range tail Hamiltonian is found analytically, and used as an incoming wave in the three body scattering problem. This reformulation of the scattering problem makes it suitable for treatment by the exterior complex scaling technique in the sense that the problem after the complex dilation is reduced to a boundary value problem with zero boundary conditions. We illustrate the method with calculations on the electron scattering off the hydrogen atom and the positive helium ion in the frame of the Temkin-Poet model.

  10. Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases

    DTIC Science & Technology

    1992-09-29

    Design problem solving is a complex activity involving a number of subtasks, and a number of alternative methods potentially available...

  11. On the Integration of Logic Programming and Functional Programming.

    DTIC Science & Technology

    1985-06-01

    be performed with simple handtools and devices. However, if the problem is more complex, say involving the cylinders, camshaft, or drive train, then...f(x,x) with f(y, g(y)), and would bind x to g(x) [Ref. 7]. The problem, of course, is that the attempt to prune the search tree allows circularity...combinatorial explosion, since the search trees generated can grow very unpredictably [Ref. 19: p. 229]. Somewhat akin to the halting problem, it means that a

  12. Learning to Leave: Problems of Graduating--Clinical Observations and Strategies.

    ERIC Educational Resources Information Center

    Margolis, Gary

    1980-01-01

    Graduating from college involves a series of complex psychological feelings and behaviors. Students experiencing stress as a result of impending graduation can be helped by counseling that takes into consideration the specific process of leaving college. (JN)

  13. Adaptively-refined overlapping grids for the numerical solution of systems of hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Brislawn, Kristi D.; Brown, David L.; Chesshire, Geoffrey S.; Saltzman, Jeffrey S.

    1995-01-01

    Adaptive mesh refinement (AMR) in conjunction with higher-order upwind finite-difference methods has been used effectively on a variety of problems in two and three dimensions. In this paper we introduce an approach for resolving problems that involve complex geometries in which resolution of boundary geometry is important. The complex geometry is represented by using the method of overlapping grids, while local resolution is obtained by refining each component grid with the AMR algorithm, appropriately generalized for this situation. The CMPGRD algorithm introduced by Chesshire and Henshaw is used to automatically generate the overlapping grid structure for the underlying mesh.
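
    The overlapping-grid machinery of CMPGRD cannot be reproduced here, but the AMR flagging step the record relies on can be illustrated: cells where an undivided difference of the solution exceeds a tolerance are marked for refinement. The test function and tolerance below are assumptions.

```python
import numpy as np

# Toy illustration of the AMR flagging step only (the overlapping-grid
# CMPGRD machinery is far beyond this sketch). Cells whose undivided
# difference exceeds a tolerance are flagged for refinement.
x = np.linspace(0.0, 1.0, 101)
u = np.tanh((x - 0.5) / 0.02)          # a sharp front that needs resolution

tol = 0.05
diff = np.abs(np.diff(u))              # undivided difference between neighbours
flagged = np.where(diff > tol)[0]      # indices of cells to refine

print(f"{flagged.size} of {diff.size} cells flagged for refinement")
print("refine near x =", np.round(x[flagged[[0, -1]]], 3))
```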

  14. Modelling the contribution of changes in family life to time trends in adolescent conduct problems.

    PubMed

    Collishaw, Stephan; Goodman, Robert; Pickles, Andrew; Maughan, Barbara

    2007-12-01

    The past half-century has seen significant changes in family life, including an increase in parental divorce, increases in the numbers of lone parent and stepfamilies, changes in socioeconomic well being, and a decrease in family size. Evidence also shows substantial time trends in adolescent mental health, including a marked increase in conduct problems over the last 25 years of the 20th Century in the UK. The aim of this study was to examine how these two sets of trends may be related. To illustrate the complexity of the issues involved, we focused on three well-established family risks for conduct problems: family type, income and family size. Three community samples of adolescents from England, Scotland and Wales were compared: 10,348 16-year olds assessed in 1974 as part of the National Child Development Study, 7234 16-year olds assessed in 1986 as part of the British Cohort Study, and 860 15-year olds assessed in the 1999 British Child and Adolescent Mental Health Survey. Parents completed comparable ratings of conduct problems in each survey and provided information on family type, income and size. Findings highlight important variations in both the prevalence of these family variables and their associations with conduct problems over time, underscoring the complex conceptual issues involved in testing causes of trends in mental health.

  15. Relational complexity modulates activity in the prefrontal cortex during numerical inductive reasoning: an fMRI study.

    PubMed

    Feng, Xiao; Peng, Li; Chang-Quan, Long; Yi, Lei; Hong, Li

    2014-09-01

    Most previous studies investigating relational reasoning have used visuo-spatial materials. This fMRI study aimed to determine how relational complexity affects brain activity during inductive reasoning, using numerical materials. Three numerical relational levels of the number series completion task were adopted for use: 0-relational (e.g., "23 23 23"), 1-relational ("32 30 28") and 2-relational ("12 13 15") problems. The fMRI results revealed that the bilateral dorsolateral prefrontal cortex (DLPFC) showed enhanced activity associated with relational complexity. Bilateral inferior parietal lobule (IPL) activity was greater during the 1- and 2-relational level problems than during the 0-relational level problems. In addition, the left fronto-polar cortex (FPC) showed selective activity during the 2-relational level problems. The bilateral DLPFC may be involved in the process of hypothesis generation, whereas the bilateral IPL may be sensitive to calculation demands. Moreover, the sensitivity of the left FPC to the multiple relational problems may be related to the integration of numerical relations. The present study extends our knowledge of the prefrontal activity pattern underlying numerical relational processing. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
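
    For the fault tree side of the comparison, the basic computation is simple when basic events are independent: AND gates multiply failure probabilities and OR gates combine them via complements. The sketch below is a generic illustration with assumed probabilities, not the aircraft computing systems analysed in the report.

```python
# Minimal fault-tree evaluation for independent basic events (illustrative;
# the component failure probabilities and gate structure are assumptions,
# not the systems analysed in the report).
def p_and(*probs):
    # AND gate: all inputs must fail (independence assumed).
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    # OR gate: at least one input fails (independence assumed).
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Top event: loss of computing function = both redundant channels fail,
# where each channel fails if its processor OR its bus fails.
p_channel = p_or(1e-4, 5e-5)           # processor or bus failure per phase
p_top = p_and(p_channel, p_channel)    # both channels fail
print(f"top event probability per phase: {p_top:.2e}")
```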

  17. Identification and addressing reduction-related misconceptions

    NASA Astrophysics Data System (ADS)

    Gal-Ezer, Judith; Trakhtenbrot, Mark

    2016-07-01

    Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract technique that involves revealing close non-trivial connections between problems that often seem to have nothing in common. As a result, proper understanding and application of reduction is a serious challenge for students and a source of numerous misconceptions. The main contribution of this paper is detection of such misconceptions, analysis of their roots, and proposing a way to address them in an undergraduate TCC course. Our observations suggest that the main source of the misconceptions is the false intuitive rule "the bigger is a set/problem, the harder it is to solve". Accordingly, we developed a series of exercises for proactive prevention of these misconceptions.
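
    A concrete reduction helps ground the discussion; the classic textbook mapping between Vertex Cover and Independent Set (not taken from the paper's exercise set) is sketched below.

```python
from itertools import combinations

# Classic polynomial-time reduction used in many TCC courses (not taken from
# the paper's exercises): G has a vertex cover of size k iff it has an
# independent set of size |V| - k, because the complement of a cover is
# independent. The reduction simply rewrites the instance (G, k) -> (G, |V| - k).
def reduce_vc_to_is(vertices, edges, k):
    return vertices, edges, len(vertices) - k

def has_independent_set(vertices, edges, size):
    # Brute-force check, fine for tiny instances only.
    for subset in combinations(vertices, size):
        s = set(subset)
        if all(not (u in s and v in s) for (u, v) in edges):
            return True
    return False

V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
k = 2                                   # does a vertex cover of size 2 exist?
print(has_independent_set(*reduce_vc_to_is(V, E, k)))   # True for the 4-cycle
```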

  18. Multiple crack detection in 3D using a stable XFEM and global optimization

    NASA Astrophysics Data System (ADS)

    Agathos, Konstantinos; Chatzi, Eleni; Bordas, Stéphane P. A.

    2018-02-01

    A numerical scheme is proposed for the detection of multiple cracks in three dimensional (3D) structures. The scheme is based on a variant of the extended finite element method (XFEM) and a hybrid optimizer solution. The proposed XFEM variant is particularly well-suited for the simulation of 3D fracture problems, and as such serves as an efficient solution to the so-called forward problem. A set of heuristic optimization algorithms are recombined into a multiscale optimization scheme. The introduced approach proves effective in tackling the complex inverse problem involved, where identification of multiple flaws is sought on the basis of sparse measurements collected near the structural boundary. The potential of the scheme is demonstrated through a set of numerical case studies of varying complexity.

  19. Science of the science, drug discovery and artificial neural networks.

    PubMed

    Patel, Jigneshkumar

    2013-03-01

    The drug discovery process often encounters complex problems that may be difficult to solve by human intelligence alone. Artificial Neural Networks (ANNs) are one of the Artificial Intelligence (AI) technologies used for solving such complex problems. ANNs are widely used for primary virtual screening of compounds, quantitative structure-activity relationship studies, receptor modeling, formulation development, pharmacokinetics, and all other processes involving complex mathematical modeling. Despite having such advanced technologies and sufficient understanding of biological systems, drug discovery is still a lengthy, expensive, difficult and inefficient process with a low rate of new successful therapeutic discovery. In this paper, the author discusses drug discovery science and ANNs from a very basic angle, which may be helpful for understanding the application of ANNs to improve the efficiency of drug discovery.
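
    As a minimal illustration of the kind of ANN mentioned above, the sketch below trains a tiny feedforward network by gradient descent on synthetic descriptor-to-activity data. The architecture and data are assumptions and bear no relation to a real virtual-screening or QSAR model.

```python
import numpy as np

# Tiny feedforward ANN for a synthetic QSAR-style regression (illustrative;
# the descriptors, targets and network size are assumptions, not a real
# drug-discovery dataset or the models discussed in the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # 5 "molecular descriptors"
y = np.tanh(X @ np.array([0.7, -0.4, 0.2, 0.0, 0.5]))[:, None]  # "activity"

W1 = rng.normal(scale=0.5, size=(5, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))
lr = 0.05

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden layer
    pred = h @ W2 + b2                # linear output
    err = pred - y
    # Backpropagation for a mean-squared-error loss.
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0, keepdims=True)
    dh = (err @ W2.T) * (1.0 - h**2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0, keepdims=True)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", round(float(np.mean(err**2)), 4))
```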

  20. Environmental Sensing of Expert Knowledge in a Computational Evolution System for Complex Problem Solving in Human Genetics

    NASA Astrophysics Data System (ADS)

    Greene, Casey S.; Hill, Douglas P.; Moore, Jason H.

    The relationship between interindividual variation in our genomes and variation in our susceptibility to common diseases is expected to be complex with multiple interacting genetic factors. A central goal of human genetics is to identify which DNA sequence variations predict disease risk in human populations. Our success in this endeavour will depend critically on the development and implementation of computational intelligence methods that are able to embrace, rather than ignore, the complexity of the genotype to phenotype relationship. To this end, we have developed a computational evolution system (CES) to discover genetic models of disease susceptibility involving complex relationships between DNA sequence variations. The CES approach is hierarchically organized and is capable of evolving operators of any arbitrary complexity. The ability to evolve operators distinguishes this approach from artificial evolution approaches using fixed operators such as mutation and recombination. Our previous studies have shown that a CES that can utilize expert knowledge about the problem in evolved operators significantly outperforms a CES unable to use this knowledge. This environmental sensing of external sources of biological or statistical knowledge is important when the search space is both rugged and large as in the genetic analysis of complex diseases. We show here that the CES is also capable of evolving operators which exploit one of several sources of expert knowledge to solve the problem. This is important for both the discovery of highly fit genetic models and because the particular source of expert knowledge used by evolved operators may provide additional information about the problem itself. This study brings us a step closer to a CES that can solve complex problems in human genetics in addition to discovering genetic models of disease.

  1. Detection of Oil in Water Column, Final Report: Detection Prototype Tests

    DTIC Science & Technology

    2014-07-01

    first phase of the project involved initial development and testing of three technologies to address the detection problem. This second phase...important oceanic phenomena such as density stratification and naturally occurring particulate matter, which will affect the performance of sensors in the ...spills of submerged oil is far more complex due to the problems

  2. Outcomes-Based Authentic Learning, Portfolio Assessment, and a Systems Approach to "Complex Problem-Solving": Related Pillars for Enhancing the Innovative Role of PBL in Future Higher Education

    ERIC Educational Resources Information Center

    Richards, Cameron

    2015-01-01

    The challenge of better reconciling individual and collective aspects of innovative problem-solving can be productively addressed to enhance the role of PBL as a key focus of the creative process in future higher education. This should involve "active learning" approaches supported by related processes of teaching, assessment and…

  3. A numerical approach for simulating fluid structure interaction of flexible thin shells undergoing arbitrarily large deformations in complex domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilmanov, Anvar, E-mail: agilmano@umn.edu; Le, Trung Bao, E-mail: lebao002@umn.edu; Sotiropoulos, Fotis, E-mail: fotis@umn.edu

    We present a new numerical methodology for simulating fluid–structure interaction (FSI) problems involving thin flexible bodies in an incompressible fluid. The FSI algorithm uses the Dirichlet–Neumann partitioning technique. The curvilinear immersed boundary method (CURVIB) is coupled with a rotation-free finite element (FE) model for thin shells, enabling the efficient simulation of FSI problems with arbitrarily large deformation. Turbulent flow problems are handled using large-eddy simulation with the dynamic Smagorinsky model in conjunction with a wall model to reconstruct boundary conditions near immersed boundaries. The CURVIB and FE solvers are coupled together on the flexible solid–fluid interfaces where the structural nodal positions, displacements, velocities and loads are calculated and exchanged between the two solvers. Loose and strong coupling FSI schemes are employed, enhanced by the Aitken acceleration technique to ensure robust coupling and fast convergence, especially for low mass ratio problems. The coupled CURVIB-FE-FSI method is validated by applying it to simulate two FSI problems involving thin flexible structures: 1) vortex-induced vibrations of a cantilever mounted in the wake of a square cylinder at different mass ratios and at low Reynolds number; and 2) the more challenging high Reynolds number problem involving the oscillation of an inverted elastic flag. For both cases the computed results are in excellent agreement with previous numerical simulations and/or experimental measurements. Grid convergence studies are carried out for both the cantilever and inverted flag problems and demonstrate the convergence of the CURVIB-FE-FSI method. Finally, the capability of the new methodology in simulations of complex cardiovascular flows is demonstrated by applying it to simulate the FSI of a tri-leaflet, prosthetic heart valve in an anatomic aorta and under physiologic pulsatile conditions.

  4. New Developments of Computational Fluid Dynamics and Their Applications to Practical Engineering Problems

    NASA Astrophysics Data System (ADS)

    Chen, Hudong

    2001-06-01

    There have been considerable advances in Lattice Boltzmann (LB) based methods in the last decade. By now, the fundamental concept of using the approach as an alternative tool for computational fluid dynamics (CFD) has been substantially appreciated and validated in mainstream scientific research and in industrial engineering communities. Lattice Boltzmann based methods possess several major advantages: a) less numerical dissipation due to the linear Lagrange type advection operator in the Boltzmann equation; b) local dynamic interactions suitable for highly parallel processing; c) physical handling of boundary conditions for complicated geometries and accurate control of fluxes; d) microscopically consistent modeling of thermodynamics and of interface properties in complex multiphase flows. It provides a great opportunity to apply the method to practical engineering problems encountered in a wide range of industries from automotive, aerospace to chemical, biomedical, petroleum, nuclear, and others. One of the key challenges is to extend the applicability of this alternative approach to regimes of highly turbulent flows commonly encountered in practical engineering situations involving high Reynolds numbers. Over the past ten years, significant efforts have been made on this front at Exa Corporation in developing a lattice Boltzmann based commercial CFD software, PowerFLOW. It has become a useful computational tool for the simulation of turbulent aerodynamics in practical engineering problems involving extremely complex geometries and flow situations, such as in new automotive vehicle designs world wide. In this talk, we present an overall LB based algorithm concept along with certain key extensions in order to accurately handle turbulent flows involving extremely complex geometries. To demonstrate the accuracy of turbulent flow simulations, we provide a set of validation results for some well known academic benchmarks. These include straight channels, backward-facing steps, flows over a curved hill and typical NACA airfoils at various angles of attack including prediction of stall angle. We further provide numerous engineering cases, ranging from external aerodynamics around various car bodies to internal flows involved in various industrial devices. We conclude with a discussion of certain future extensions for complex fluids.
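
    A minimal single-relaxation-time D2Q9 lattice Boltzmann step is sketched below for a fully periodic box with a decaying shear wave. It shows only the collide-and-stream idea behind LB methods; the turbulence modelling, boundary treatment and solver engineering of a production code such as PowerFLOW are not represented, and all parameters are illustrative.

```python
import numpy as np

# Minimal D2Q9 BGK lattice Boltzmann sketch on a fully periodic box: a
# decaying sinusoidal shear wave. Collide-and-stream only; no boundaries,
# no turbulence modelling, nothing of a production solver's machinery.
nx, ny, tau, steps = 64, 64, 0.8, 200
ex = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])
ey = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    # f_eq_i = w_i * rho * (1 + 3 e_i.u + 4.5 (e_i.u)^2 - 1.5 u.u)
    cu = 3.0 * (ex[:, None, None]*ux + ey[:, None, None]*uy)
    usq = 1.5 * (ux**2 + uy**2)
    return w[:, None, None] * rho * (1.0 + cu + 0.5*cu**2 - usq)

# Initial state: uniform density, small sinusoidal x-velocity varying in y.
rho = np.ones((nx, ny))
ux = 0.05 * np.sin(2*np.pi*np.arange(ny)/ny)[None, :] * np.ones((nx, ny))
uy = np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)

for _ in range(steps):
    rho = f.sum(axis=0)
    ux = (ex[:, None, None]*f).sum(axis=0) / rho
    uy = (ey[:, None, None]*f).sum(axis=0) / rho
    f += -(f - equilibrium(rho, ux, uy)) / tau             # BGK collision
    for i in range(9):                                      # periodic streaming
        f[i] = np.roll(np.roll(f[i], ex[i], axis=0), ey[i], axis=1)

print("max |ux| after viscous decay:", round(float(np.abs(ux).max()), 4))
```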

  5. On Chaotic and Hyperchaotic Complex Nonlinear Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Mahmoud, Gamal M.

    Dynamical systems described by real and complex variables are currently one of the most popular areas of scientific research. These systems play an important role in several fields of physics, engineering, and computer sciences, for example, laser systems, control (or chaos suppression), secure communications, and information science. Dynamical basic properties, chaos (hyperchaos) synchronization, chaos control, and generating hyperchaotic behavior of these systems are briefly summarized. The main advantage of introducing complex variables is the reduction of phase space dimensions by a half. They are also used to describe and simulate the physics of detuned laser and thermal convection of liquid flows, where the electric field and the atomic polarization amplitudes are both complex. Clearly, if the variables of the system are complex the equations involve twice as many variables and control parameters, thus making it that much harder for a hostile agent to intercept and decipher the coded message. Chaotic and hyperchaotic complex systems are stated as examples. Finally there are many open problems in the study of chaotic and hyperchaotic complex nonlinear dynamical systems, which need further investigations. Some of these open problems are given.

  6. Perceptual learning modules in mathematics: enhancing students' pattern recognition, structure extraction, and fluency.

    PubMed

    Kellman, Philip J; Massey, Christine M; Son, Ji Y

    2010-04-01

    Learning in educational settings emphasizes declarative and procedural knowledge. Studies of expertise, however, point to other crucial components of learning, especially improvements produced by experience in the extraction of information: perceptual learning (PL). We suggest that such improvements characterize both simple sensory and complex cognitive, even symbolic, tasks through common processes of discovery and selection. We apply these ideas in the form of perceptual learning modules (PLMs) to mathematics learning. We tested three PLMs, each emphasizing different aspects of complex task performance, in middle and high school mathematics. In the MultiRep PLM, practice in matching function information across multiple representations improved students' abilities to generate correct graphs and equations from word problems. In the Algebraic Transformations PLM, practice in seeing equation structure across transformations (but not solving equations) led to dramatic improvements in the speed of equation solving. In the Linear Measurement PLM, interactive trials involving extraction of information about units and lengths produced successful transfer to novel measurement problems and fraction problem solving. Taken together, these results suggest (a) that PL techniques have the potential to address crucial, neglected dimensions of learning, including discovery and fluent processing of relations; (b) PL effects apply even to complex tasks that involve symbolic processing; and (c) appropriately designed PL technology can produce rapid and enduring advances in learning. Copyright © 2009 Cognitive Science Society, Inc.

  7. Towards a conceptual multi-agent-based framework to simulate the spatial group decision-making process

    NASA Astrophysics Data System (ADS)

    Ghavami, Seyed Morsal; Taleai, Mohammad

    2017-04-01

    Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve or reduce the complexity of a given problem. Hence, it is important to study and evaluate what different aspects influence the spatial problem resolution process. Recently, multi-agent systems consisting of groups of separate agent entities all interacting with each other have been put forward as appropriate tools to use to study and resolve such problems. In this study, then in order to generate a better level of understanding regarding the spatial problem group decision-making process, a conceptual multi-agent-based framework is used that represents and specifies all the necessary concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships that exist among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationship among different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan—a provincial capital in Iran—is simulated using MAS, the results highlighting the effectiveness of applying an MAS-based framework when wishing to study the group decision-making process used to resolve spatial problems.

  8. Biologically-inspired approaches for self-organization, adaptation, and collaboration of heterogeneous autonomous systems

    NASA Astrophysics Data System (ADS)

    Steinberg, Marc

    2011-06-01

    This paper presents a selective survey of theoretical and experimental progress in the development of biologicallyinspired approaches for complex surveillance and reconnaissance problems with multiple, heterogeneous autonomous systems. The focus is on approaches that may address ISR problems that can quickly become mathematically intractable or otherwise impractical to implement using traditional optimization techniques as the size and complexity of the problem is increased. These problems require dealing with complex spatiotemporal objectives and constraints at a variety of levels from motion planning to task allocation. There is also a need to ensure solutions are reliable and robust to uncertainty and communications limitations. First, the paper will provide a short introduction to the current state of relevant biological research as relates to collective animal behavior. Second, the paper will describe research on largely decentralized, reactive, or swarm approaches that have been inspired by biological phenomena such as schools of fish, flocks of birds, ant colonies, and insect swarms. Next, the paper will discuss approaches towards more complex organizational and cooperative mechanisms in team and coalition behaviors in order to provide mission coverage of large, complex areas. Relevant team behavior may be derived from recent advances in understanding of the social and cooperative behaviors used for collaboration by tens of animals with higher-level cognitive abilities such as mammals and birds. Finally, the paper will briefly discuss challenges involved in user interaction with these types of systems.

  9. The effects of maternal working conditions and mastery on child behavior problems: studying the intergenerational transmission of social control.

    PubMed

    Rogers, S J; Parcel, T L; Menaghan, E G

    1991-06-01

    We assess the impact of maternal sense of mastery and maternal working conditions on maternal perceptions of children's behavior problems as a means to study the transmission of social control across generations. We use a sample of 521 employed mothers and their four-to six-year-old children from the National Longitudinal Survey's Youth Cohort in 1986. Regarding working conditions, we consider mother's hourly wage, work hours, and job content including involvement with things (vs. people), the requisite level of physical activity, and occupational complexity. We also consider maternal and child background and current family characteristics, including marital status, family size, and home environment. Maternal mastery was related to fewer reported behavior problems among children. Lower involvement with people and higher involvement with things, as well as low physical activity, were related significantly to higher levels of perceived problems. In addition, recent changes in maternal marital status, including maternal marriage or remarriage, increased reports of problems; stronger home environments had the opposite effect. We interpret these findings as suggesting how maternal experiences of control in the workplace and personal resources of control can influence the internalization of control in children.

  10. WE-D-303-00: Computational Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John; Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: understand the need and requirements of computational phantoms in medical physics research; discuss the developments and applications of computational phantoms; know the promises and limitations of computational phantoms in solving complex problems.

  11. Digitized adiabatic quantum computing with a superconducting circuit.

    PubMed

    Barends, R; Shabani, A; Lamata, L; Kelly, J; Mezzacapo, A; Las Heras, U; Babbush, R; Fowler, A G; Campbell, B; Chen, Yu; Chen, Z; Chiaro, B; Dunsworth, A; Jeffrey, E; Lucero, E; Megrant, A; Mutus, J Y; Neeley, M; Neill, C; O'Malley, P J J; Quintana, C; Roushan, P; Sank, D; Vainsencher, A; Wenner, J; White, T C; Solano, E; Neven, H; Martinis, John M

    2016-06-09

    Quantum mechanics can help to solve complex problems in physics and chemistry, provided they can be programmed in a physical device. In adiabatic quantum computing, a system is slowly evolved from the ground state of a simple initial Hamiltonian to a final Hamiltonian that encodes a computational problem. The appeal of this approach lies in the combination of simplicity and generality; in principle, any problem can be encoded. In practice, applications are restricted by limited connectivity, available interactions and noise. A complementary approach is digital quantum computing, which enables the construction of arbitrary interactions and is compatible with error correction, but uses quantum circuit algorithms that are problem-specific. Here we combine the advantages of both approaches by implementing digitized adiabatic quantum computing in a superconducting system. We tomographically probe the system during the digitized evolution and explore the scaling of errors with system size. We then let the full system find the solution to random instances of the one-dimensional Ising problem as well as problem Hamiltonians that involve more complex interactions. This digital quantum simulation of the adiabatic algorithm consists of up to nine qubits and up to 1,000 quantum logic gates. The demonstration of digitized adiabatic quantum computing in the solid state opens a path to synthesizing long-range correlations and solving complex computational problems. When combined with fault-tolerance, our approach becomes a general-purpose algorithm that is scalable.
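
    The digitized adiabatic idea can be mimicked classically for two qubits: the Hamiltonian is swept from a transverse-field driver to a small Ising problem Hamiltonian in discrete steps, each applied as a short unitary. The sketch below is such a classical simulation with assumed couplings and step counts; it illustrates the algorithmic idea only and has nothing to do with the superconducting hardware itself.

```python
import numpy as np
from scipy.linalg import expm

# Classical two-qubit simulation of a digitized (Trotterized) adiabatic sweep
# from a transverse-field Hamiltonian to a small Ising problem Hamiltonian.
# Step count, couplings and fields are illustrative assumptions.
I2 = np.eye(2); sx = np.array([[0.0, 1.0], [1.0, 0.0]]); sz = np.diag([1.0, -1.0])

H0 = -(np.kron(sx, I2) + np.kron(I2, sx))                       # driver: transverse field
HP = -1.0*np.kron(sz, sz) - 0.5*(np.kron(sz, I2) + np.kron(I2, sz))  # Ising problem

steps, dt = 100, 0.2
psi = np.ones(4) / 2.0                              # ground state of H0: |++>
for k in range(steps):
    s = (k + 1) / steps                             # adiabatic schedule 0 -> 1
    H = (1 - s) * H0 + s * HP
    psi = expm(-1j * H * dt) @ psi                  # one digital evolution step

# Compare with the true ground state of the problem Hamiltonian.
evals, evecs = np.linalg.eigh(HP)
overlap = abs(np.vdot(evecs[:, 0], psi))**2
print("ground-state overlap:", round(float(overlap), 3))
```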

  12. Mutation screening of AURKB and SYCP3 in patients with reproductive problems.

    PubMed

    López-Carrasco, A; Oltra, S; Monfort, S; Mayo, S; Roselló, M; Martínez, F; Orellana, C

    2013-02-01

    Mutations in the spindle checkpoint genes can cause improper chromosome segregations and aneuploidies, which in turn may lead to reproductive problems. Two of the proteins involved in this checkpoint are Aurora kinase B (AURKB), preventing the anaphase whenever microtubule-kinetochore attachments are not the proper ones during metaphase; and synaptonemal complex protein 3 (SYCP3), which is essential for the formation of the complex and for the recombination of the homologous chromosomes. This study has attempted to clarify the possible involvement of both proteins in the reproductive problems of patients with chromosomal instability. In order to do this, we have performed a screening for genetic variants in AURKB and SYCP3 among these patients using Sanger sequencing. Only one apparently non-pathogenic deletion was found in SYCP3. On the other hand, we found six sequence variations in AURKB. The consequences of these changes on the protein were studied in silico using different bioinformatic tools. In addition, the frequency of three of the variations was studied using a high-resolution melting approach. The absence of these three variants in control samples and their position in the AURKB gene suggests their possible involvement in the patients' chromosomal instability. Interestingly, two of the identified changes in AURKB were found in each member of a couple with antecedents of spontaneous pregnancy loss, a fetal anencephaly and a deaf daughter. One of these changes is described here for the first time. Although further studies are necessary, our results are encouraging enough to propose the analysis of AURKB in couples with reproductive problems.

  13. Reflectance Modeling

    NASA Technical Reports Server (NTRS)

    Smith, J. A. (Principal Investigator)

    1985-01-01

    The overall goal of this work has been to develop a set of computational tools and media abstractions for the terrain bidirectional reflectance problem. The modeling of soil and vegetation surfaces has been emphasized with a gradual increase in the complexity of the media geometries treated. Pragmatic problems involved in the combined modeling of soil, vegetation, and atmospheric effects have been of interest and one of the objectives has been to describe the canopy reflectance problem in a classical radiative transfer sense permitting easier inclusion of our work by other workers in the radiative transfer field.

  14. Bayesian approach to inverse statistical mechanics.

    PubMed

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  15. Bayesian approach to inverse statistical mechanics

    NASA Astrophysics Data System (ADS)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  16. The Navier-Stokes computer

    NASA Technical Reports Server (NTRS)

    Nosenchuck, D. M.; Littman, M. G.

    1986-01-01

    The Navier-Stokes computer (NSC) has been developed for solving problems in fluid mechanics involving complex flow simulations that require more speed and capacity than provided by current and proposed Class VI supercomputers. The machine is a parallel processing supercomputer with several new architectural elements which can be programmed to address a wide range of problems meeting the following criteria: (1) the problem is numerically intensive, and (2) the code makes use of long vectors. A simulation of two-dimensional nonsteady viscous flows is presented to illustrate the architecture, programming, and some of the capabilities of the NSC.

  17. Atwood's machine as a tool to introduce variable mass systems

    NASA Astrophysics Data System (ADS)

    de Sousa, Célia A.

    2012-03-01

    This article discusses an instructional strategy which explores possible similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the ability needed to apply relevant concepts in situations not previously encountered. The pedagogical advantages are relevant for both secondary and high school students, showing that, through adequate examples, the question of the validity of Newton's second law may even be introduced to introductory-level students.
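
    The familiar starting point such a strategy builds on is the ideal Atwood's machine (massless, inextensible rope; frictionless, massless pulley), for which a = (m1 - m2) g / (m1 + m2) and T = 2 m1 m2 g / (m1 + m2). A small worked example with assumed masses:

```python
# Ideal Atwood's machine (massless, inextensible rope; frictionless, massless
# pulley). The masses below are just example values.
def atwood(m1, m2, g=9.81):
    a = (m1 - m2) * g / (m1 + m2)            # acceleration (m1 descends if positive)
    tension = 2.0 * m1 * m2 * g / (m1 + m2)  # rope tension
    return a, tension

a, T = atwood(3.0, 2.0)
print(f"a = {a:.2f} m/s^2, T = {T:.2f} N")
```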

  18. Efficient dual approach to distance metric learning.

    PubMed

    Shen, Chunhua; Kim, Junae; Liu, Fayao; Wang, Lei; van den Hengel, Anton

    2014-02-01

    Distance metric learning is of fundamental interest in machine learning because the employed distance metric can significantly affect the performance of many learning methods. Quadratic Mahalanobis metric learning is a popular approach to the problem, but typically requires solving a semidefinite programming (SDP) problem, which is computationally expensive. The worst case complexity of solving an SDP problem involving a matrix variable of size D×D with O(D) linear constraints is about O(D^6.5) using interior-point methods, where D is the dimension of the input data. Thus, the interior-point methods only practically solve problems exhibiting less than a few thousand variables. Because the number of variables is D(D+1)/2, this implies a limit upon the size of problem that can practically be solved around a few hundred dimensions. The complexity of the popular quadratic Mahalanobis metric learning approach thus limits the size of problem to which metric learning can be applied. Here, we propose a significantly more efficient and scalable approach to the metric learning problem based on the Lagrange dual formulation of the problem. The proposed formulation is much simpler to implement, and therefore allows much larger Mahalanobis metric learning problems to be solved. The time complexity of the proposed method is roughly O(D^3), which is significantly lower than that of the SDP approach. Experiments on a variety of data sets demonstrate that the proposed method achieves an accuracy comparable with the state of the art, but is applicable to significantly larger problems. We also show that the proposed method can be applied to solve more general Frobenius norm regularized SDP problems approximately.
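
    Once a Mahalanobis matrix M has been learned, it is used as a distance d_M(x, y) = sqrt((x - y)^T M (x - y)) with M positive semidefinite. The sketch below shows only this usage step with a placeholder M; it is not the paper's Lagrange-dual learning algorithm.

```python
import numpy as np

# How a learned Mahalanobis matrix M is used once metric learning is done:
# d_M(x, y) = sqrt((x - y)^T M (x - y)), with M positive semidefinite.
# The M below is a placeholder, not one produced by the paper's dual solver.
def mahalanobis(x, y, M):
    d = x - y
    return float(np.sqrt(d @ M @ d))

A = np.array([[2.0, 0.3], [0.1, 1.0]])
M = A @ A.T                      # any matrix of the form A A^T is PSD
x, y = np.array([1.0, 2.0]), np.array([0.5, 1.0])
print("Euclidean:  ", round(float(np.linalg.norm(x - y)), 3))
print("Mahalanobis:", round(mahalanobis(x, y, M), 3))
```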

  19. Expert-guided evolutionary algorithm for layout design of complex space stations

    NASA Astrophysics Data System (ADS)

    Qian, Zhiqin; Bi, Zhuming; Cao, Qun; Ju, Weiguo; Teng, Hongfei; Zheng, Yang; Zheng, Siyu

    2017-08-01

    The layout of a space station should be designed in such a way that different equipment and instruments are placed for the station as a whole to achieve the best overall performance. The station layout design is a typical nondeterministic polynomial (NP) problem. In particular, how to manage the design complexity to achieve an acceptable solution within a reasonable timeframe poses a great challenge. In this article, a new evolutionary algorithm is proposed to meet this challenge. It is called the expert-guided evolutionary algorithm with tree-like structure decomposition (EGEA-TSD). Two innovations in EGEA-TSD are: (i) to deal with the design complexity, the entire design space is divided into subspaces with a tree-like structure, which reduces the computation and facilitates experts' involvement in the solving process; and (ii) a human-intervention interface is developed to allow experts to help avoid local optima and accelerate convergence. To validate the proposed algorithm, the layout design of one space station is formulated as a multi-disciplinary design problem, the developed algorithm is programmed and executed, and the results are compared with those from two other algorithms, illustrating the superior performance of the proposed EGEA-TSD.

  20. Computational Issues Associated with Temporally Deforming Geometries Such as Thrust Vectoring Nozzles

    NASA Technical Reports Server (NTRS)

    Boyalakuntla, Kishore; Soni, Bharat K.; Thornburg, Hugh J.; Yu, Robert

    1996-01-01

    During the past decade, computational simulation of fluid flow around complex configurations has progressed significantly and many notable successes have been reported; however, unsteady time-dependent solutions are not easily obtainable. The present effort involves unsteady, time-dependent simulation of temporally deforming geometries. Grid generation for a complex configuration can be a time-consuming process, and temporally varying geometries necessitate the regeneration of such grids at every time step. Traditional grid generation techniques have been tried and demonstrated to be inadequate for such simulations. Non-Uniform Rational B-spline (NURBS) based techniques provide a compact and accurate representation of the geometry, and this definition can be coupled with a distribution mesh for user-defined spacing. The present method greatly reduces CPU requirements for time-dependent remeshing, facilitating the simulation of more complex unsteady problems. A thrust-vectoring nozzle has been chosen to demonstrate the capability, as it is of current interest in the aerospace industry for better maneuverability of fighter aircraft in close combat and in post-stall regimes. This effort is the first step towards multidisciplinary design optimization, which involves coupling the aerodynamic, heat transfer, and structural analysis techniques. Applications include simulation of temporally deforming bodies and aeroelastic problems.

  1. Ethical School Leadership: Problems of an Elusive Role.

    ERIC Educational Resources Information Center

    Campbell, Elizabeth

    1997-01-01

    Educational literature increasingly stresses the importance of ethics in school leadership, the need to recognize professional responsibilities as basic ethical imperatives, and the need for administrator preparation programs to reflect these neglected areas. Within this context, this paper addresses the complexities involved in translating…

  2. Microcomputers, Model Rockets, and Race Cars.

    ERIC Educational Resources Information Center

    Mirus, Edward A., Jr.

    1985-01-01

    The industrial education orientation program at Wisconsin School for the Deaf (WSD) presents problem-solving situations to all seventh- and eighth-grade hearing-impaired students. WSD developed user-friendly microcomputer software to guide students individually through complex computations involving model race cars and rockets while freeing…

  3. Against the Grain: Teaching Historical Complexity

    ERIC Educational Resources Information Center

    Neumann, Dave

    2013-01-01

    Many teachers and scholars have written about the importance of inquiry in effective history instruction. At its core, inquiry involves student investigation of a significant historical problem. Experienced teachers, however, often reveal their skill in purposely teaching against the grain. Skilled teachers help students appreciate historical…

  4. Assessment of wastewater treatment alternatives for small communities: An analytic network process approach.

    PubMed

    Molinos-Senante, María; Gómez, Trinidad; Caballero, Rafael; Hernández-Sancho, Francesc; Sala-Garrido, Ramón

    2015-11-01

    The selection of the most appropriate wastewater treatment (WWT) technology is a complex problem since many alternatives are available and many criteria are involved in the decision-making process. To deal with this challenge, the analytic network process (ANP) is applied for the first time to rank a set of seven WWT technology set-ups for secondary treatment in small communities. A major advantage of ANP is that it incorporates interdependent relationships between elements. Results illustrated that extensive technologies, constructed wetlands and pond systems, are the alternatives most preferred by WWT experts. The sensitivity analysis performed verified that the ranking of WWT alternatives is very stable, since constructed wetlands are almost always placed in the first position. This paper showed that ANP analysis is suitable for dealing with complex decision-making problems, such as the selection of the most appropriate WWT system, and contributes to a better understanding of the multiple interdependencies among the elements involved in the assessment. Copyright © 2015 Elsevier B.V. All rights reserved.
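    The elementary computation underlying ANP (and AHP) rankings is the priority vector of a pairwise comparison matrix, taken as its principal eigenvector. The sketch below shows only that step with an invented three-criterion matrix; the supermatrix machinery that lets ANP capture interdependence is omitted.

    ```python
    # Priority vector of a pairwise comparison matrix (the AHP/ANP building block).
    # The comparison matrix below is purely illustrative.
    import numpy as np

    # Hypothetical comparisons of three criteria (cost, land area, robustness):
    # entry [i, j] says how much more important criterion i is than criterion j.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # normalized priority vector
    print("priorities:", np.round(w, 3))         # roughly [0.65, 0.23, 0.12]
    ```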

  5. Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets

    NASA Technical Reports Server (NTRS)

    Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.

    1978-01-01

    A parameter-estimation method is described for verifying the mathematical model of mixed dynamic systems (systems combining interacting components from various engineering fields) against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to automatically adjust the key parameters of a model so as to minimize the response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
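    A minimal sketch of the general idea, adjusting model parameters until predicted response matches measured response in a least-squares sense, is given below; the single-degree-of-freedom damped oscillator and the parameter names are illustrative stand-ins, not the paper's mixed hydraulic-mechanical model.

    ```python
    # Generic response-matching parameter estimation: tune model parameters so predicted
    # response fits measured response. Toy damped oscillator, invented parameters.
    import numpy as np
    from scipy.optimize import least_squares

    t = np.linspace(0.0, 5.0, 200)

    def response(params, t):
        omega, zeta = params
        wd = omega * np.sqrt(1.0 - zeta**2)
        return np.exp(-zeta * omega * t) * np.cos(wd * t)   # free decay of a damped oscillator

    true = np.array([6.0, 0.05])
    measured = response(true, t) + 0.02 * np.random.default_rng(2).standard_normal(t.size)

    fit = least_squares(lambda p: response(p, t) - measured, x0=[4.0, 0.1],
                        bounds=([0.1, 0.0], [50.0, 0.99]))
    print("estimated omega, zeta:", np.round(fit.x, 3))
    ```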

  6. Complexity seems to open a way towards a new Aristotelian-Thomistic ontology.

    PubMed

    Strumia, Alberto

    2007-01-01

    Today's sciences all seem to converge towards very similar foundational questions. Such claims, both of an epistemological and an ontological nature, seem to rediscover, in a new fashion, some of the most relevant topics of ancient Greek and Mediaeval philosophy of nature, logic and metaphysics, such as the problem of the relationship between the whole and its parts (non-reductionism), the paradoxes arising from the attempt to conceive of the entity as a univocal concept (analogy and analogia entis), the problem of the mind-body relationship and that of an adequate cognitive theory (abstraction and the immaterial nature of the mind), the complexity of some physical, chemical and biological systems and the global properties arising from information (matter-form theory), etc. Medicine too is involved in some of these questions and cannot avoid taking them into special account.

  7. Some unsolved problems in discrete mathematics and mathematical cybernetics

    NASA Astrophysics Data System (ADS)

    Korshunov, Aleksei D.

    2009-10-01

    There are many unsolved problems in discrete mathematics and mathematical cybernetics. Writing a comprehensive survey of such problems involves great difficulties. First, such problems are rather numerous and varied. Second, they greatly differ from each other in degree of completeness of their solution. Therefore, even a comprehensive survey should not attempt to cover the whole variety of such problems; only the most important and significant problems should be reviewed. An impersonal choice of problems to include is quite hard. This paper includes 13 unsolved problems related to combinatorial mathematics and computational complexity theory. The problems selected give an indication of the author's studies for 50 years; for this reason, the choice of the problems reviewed here is, to some extent, subjective. At the same time, these problems are very difficult and quite important for discrete mathematics and mathematical cybernetics. Bibliography: 74 items.

  8. Landscape change in the Southern Piedmont: Challenges, solutions and uncertainty across scales

    USGS Publications Warehouse

    Conroy, M.J.; Allen, Craig R.; Peterson, J.T.; Pritchard, L.; Moore, C.T.

    2003-01-01

    The southern Piedmont of the southeastern United States epitomizes the complex and seemingly intractable problems and hard decisions that result from uncontrolled urban and suburban sprawl. Here we consider three recurrent themes in complicated problems involving complex systems: (1) scale dependencies and cross-scale, often nonlinear relationships; (2) resilience, in particular the potential for complex systems to move to alternate stable states with decreased ecological and/or economic value; and (3) uncertainty in the ability to understand and predict outcomes, perhaps particularly those that occur as a result of human impacts. We consider these issues in the context of landscape-level decision making, using as an example water resources and lotic systems in the Piedmont region of the southeastern United States. Copyright © 2003 by the author(s). Published here under licence by The Resilience Alliance.

  9. The application of CFD to the modelling of fires in complex geometries

    NASA Astrophysics Data System (ADS)

    Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.

    The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.

  10. Dynamic Control of Plans with Temporal Uncertainty

    NASA Technical Reports Server (NTRS)

    Morris, Paul; Muscettola, Nicola; Vidal, Thierry

    2001-01-01

    Certain planning systems that deal with quantitative time constraints have used an underlying Simple Temporal Problem solver to ensure temporal consistency of plans. However, many applications involve processes of uncertain duration whose timing cannot be controlled by the execution agent. These cases require more complex notions of temporal feasibility. In previous work, various "controllability" properties such as Weak, Strong, and Dynamic Controllability have been defined. The most interesting and useful Controllability property, the Dynamic one, has ironically proved to be the most difficult to analyze. In this paper, we resolve the complexity issue for Dynamic Controllability. Unexpectedly, the problem turns out to be tractable. We also show how to efficiently execute networks whose status has been verified.

  11. Interdisciplinarity in Adapted Physical Activity

    ERIC Educational Resources Information Center

    Bouffard, Marcel; Spencer-Cavaliere, Nancy

    2016-01-01

    It is commonly accepted that inquiry in adapted physical activity involves the use of different disciplines to address questions. It is often advanced today that complex problems of the kind frequently encountered in adapted physical activity require a combination of disciplines for their solution. At the present time, individual research…

  12. Institute for Defense Analysis. Annual Report 1995.

    DTIC Science & Technology

    1995-01-01

    staff have been involved in the community-wide development of MPI as well as in its application to specific NSA problems. Parallel Groebner Basis Code — Symbolic Computing on Parallel Machines: The Groebner basis method is a set of algorithms for reformulating very complex algebraic expres…

  13. Applied mathematical problems in modern electromagnetics

    NASA Astrophysics Data System (ADS)

    Kriegsman, Gregory

    1994-05-01

    We have primarily investigated two classes of electromagnetic problems. The first contains the quantitative description of microwave heating of dispersive and conductive materials. Such problems arise, for example, when biological tissues are exposed, accidentally or purposefully, to microwave radiation. Other instances occur in ceramic processing, such as sintering and microwave-assisted chemical vapor infiltration, and in other industrial drying processes, such as the curing of paints and concrete. The second class characterizes the scattering of microwaves by complex targets which possess two or more disparate length and/or time scales. Spatially complex scatterers arise in a variety of applications, such as large gratings and slowly changing guiding structures. The former are useful in developing microstrip energy couplers, while the latter can be used to model anatomical subsystems (e.g., the open guiding structure composed of two legs and the adjoining lower torso). Temporally complex targets occur in applications involving dispersive media whose relaxation times differ by orders of magnitude from thermal and/or electromagnetic time scales. For both cases, the mathematical description of the problems gives rise to complicated, ill-conditioned boundary value problems whose accurate solutions require a blend of both asymptotic techniques, such as multiscale methods and matched asymptotic expansions, and numerical methods incorporating radiation boundary conditions, such as finite differences and finite elements.

  14. Aerodynamics of an airfoil with a jet issuing from its surface

    NASA Technical Reports Server (NTRS)

    Tavella, D. A.; Karamcheti, K.

    1982-01-01

    A simple, two dimensional, incompressible and inviscid model for the problem posed by a two dimensional wing with a jet issuing from its lower surface is considered and a parametric analysis is carried out to observe how the aerodynamic characteristics depend on the different parameters. The mathematical problem constitutes a boundary value problem where the position of part of the boundary is not known a priori. A nonlinear optimization approach was used to solve the problem, and the analysis reveals interesting characteristics that may help to better understand the physics involved in more complex situations in connection with high lift systems.

  15. Inverse problems in complex material design: Applications to non-crystalline solids

    NASA Astrophysics Data System (ADS)

    Biswas, Parthapratim; Drabold, David; Elliott, Stephen

    The design of complex amorphous materials is one of the fundamental problems in disordered condensed-matter science. While impressive developments of ab-initio simulation methods during the past several decades have brought tremendous success in understanding materials properties from micro- to mesoscopic length scales, a major drawback is that they fail to incorporate existing knowledge of the materials in simulation methodologies. Since an essential feature of materials design is the synergy between experiment and theory, a properly developed approach to designing materials should be able to exploit all available knowledge of the materials from measured experimental data. In this talk, we will address the design of complex disordered materials as an inverse problem involving experimental data and available empirical information. We show that the problem can be posed as a multi-objective non-convex optimization program, which can be addressed using a number of recently developed bio-inspired global optimization techniques. In particular, we will discuss how a population-based stochastic search procedure can be used to determine the structure of non-crystalline solids (e.g., a-Si:H, a-SiO2, amorphous graphene, and Fe and Ni clusters). The work is partially supported by NSF under Grant Nos. DMR 1507166 and 1507670.

  16. Planning Following Stroke: A Relational Complexity Approach Using the Tower of London

    PubMed Central

    Andrews, Glenda; Halford, Graeme S.; Chappell, Mark; Maujean, Annick; Shum, David H. K.

    2014-01-01

    Planning on the 4-disk version of the Tower of London (TOL4) was examined in stroke patients and unimpaired controls. Overall TOL4 solution scores indicated impaired planning in the frontal stroke but not non-frontal stroke patients. Consistent with the claim that processing the relations between current states, intermediate states, and goal states is a key process in planning, the domain-general relational complexity metric was a good indicator of the experienced difficulty of TOL4 problems. The relational complexity metric shared variance with task-specific metrics of moves to solution and search depth. Frontal stroke patients showed impaired planning compared to controls on problems at all three complexity levels, but at only two of the three levels of moves to solution, search depth and goal ambiguity. Non-frontal stroke patients showed impaired planning only on the most difficult quaternary-relational and high search depth problems. An independent measure of relational processing (viz., Latin square task) predicted TOL4 solution scores after controlling for stroke status and location, and executive processing (Trail Making Test). The findings suggest that planning involves a domain-general capacity for relational processing that depends on the frontal brain regions. PMID:25566042

  17. Challenge of biomechanics.

    PubMed

    Volokh, K Y

    2013-06-01

    The application of mechanics to biology--biomechanics--bears great challenges due to the intricacy of living things. Their dynamism, along with the complexity of their mechanical response (which in itself involves complex chemical, electrical, and thermal phenomena), makes it very difficult to correlate empirical data with theoretical models. This difficulty elevates the importance of useful biomechanical theories compared to other fields of engineering. Despite the inherent imperfections of all theories, a well-formulated theory is crucial in any field of science because it is the basis for interpreting observations. This is all the more vital, for instance, when diagnosing symptoms or planning treatment for a disease. The notion of interpreting empirical data without theory is unscientific and unsound. This paper attempts to fortify the importance of biomechanics and invigorate research efforts for those engineers and mechanicians who are not yet involved in the field. The aim here, however, is not to give an overview of biomechanics. Instead, three unsolved problems are formulated to challenge the readers. At the micro-scale, the problem of the structural organization and integrity of the living cell is presented. At the meso-scale, the enigma of fingerprint formation is discussed. At the macro-scale, the problem of predicting aneurysm ruptures is reviewed. The aim is to attract the attention of engineers and mechanicians to problems in biomechanics which, in the author's opinion, will dominate the development of engineering and mechanics in forthcoming years.

  18. Strategic control in decision-making under uncertainty.

    PubMed

    Venkatraman, Vinod; Huettel, Scott A

    2012-04-01

    Complex economic decisions - whether investing money for retirement or purchasing some new electronic gadget - often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, evaluate outcomes against a variety of contexts, and flexibly match behavior to changes in the environment. In recent years, substantial research has implicated the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision-making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision-making. This region contains a functional topography such that the posterior dmPFC supports response-related control, whereas the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue for both generalized contributions of the dmPFC to cognitive control, and specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are likely to be critical for decision-making in other domains, including interpersonal interactions in social settings. © 2012 The Authors. European Journal of Neuroscience © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  19. Strategic Control in Decision Making under Uncertainty

    PubMed Central

    Venkatraman, Vinod; Huettel, Scott

    2012-01-01

    Complex economic decisions – whether investing money for retirement or purchasing some new electronic gadget – often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, to evaluate outcomes against a variety of contexts, and to flexibly match behavior to changes in the environment. In recent years, substantial research implicates the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision making. This region contains a functional topography such that the posterior dmPFC supports response-related control while the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue both for generalized contributions of the dmPFC to cognitive control, and for specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are also likely to be critical for decision making in other domains, including interpersonal interactions in social settings. PMID:22487037

  20. Multiple Choice Knapsack Problem: example of planning choice in transportation.

    PubMed

    Zhong, Tao; Young, Rhonda

    2010-05-01

    Transportation programming, a process of selecting projects for funding given budget and other constraints, is becoming more complex as a result of new federal laws, local planning regulations, and increased public involvement. This article describes the use of an integer programming tool, Multiple Choice Knapsack Problem (MCKP), to provide optimal solutions to transportation programming problems in cases where alternative versions of projects are under consideration. In this paper, optimization methods for use in the transportation programming process are compared and then the process of building and solving the optimization problems is discussed. The concepts about the use of MCKP are presented and a real-world transportation programming example at various budget levels is provided. This article illustrates how the use of MCKP addresses the modern complexities and provides timely solutions in transportation programming practice. While the article uses transportation programming as a case study, MCKP can be useful in other fields where a similar decision among a subset of the alternatives is required. Copyright 2009 Elsevier Ltd. All rights reserved.
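    A toy version of the MCKP, choose exactly one version of each project so that total benefit is maximized within a budget, can be solved by dynamic programming over integer costs, as sketched below; the project data are invented, and a production application would more likely feed an integer-programming solver.

    ```python
    # Toy Multiple Choice Knapsack Problem: pick exactly one alternative from each project
    # group, maximizing benefit within the budget. All numbers are illustrative.
    def solve_mckp(groups, budget):
        """groups: list of lists of (cost, benefit); exactly one alternative chosen per group."""
        NEG = float("-inf")
        dp = [0.0] + [NEG] * budget              # dp[b] = best benefit at total cost exactly b
        for group in groups:
            new_dp = [NEG] * (budget + 1)
            for b in range(budget + 1):
                for cost, benefit in group:
                    if cost <= b and dp[b - cost] > NEG:
                        new_dp[b] = max(new_dp[b], dp[b - cost] + benefit)
            dp = new_dp
        return max(dp)

    # Each project has alternative versions (cost in $M, benefit score); (0, 0) = defer project.
    projects = [
        [(0, 0.0), (4, 6.0), (7, 9.0)],
        [(0, 0.0), (3, 4.0), (5, 7.5)],
        [(0, 0.0), (2, 3.0), (6, 8.0)],
    ]
    print("best benefit within $10M:", solve_mckp(projects, budget=10))
    ```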

  1. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence.

    PubMed

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students' CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence.

  2. A general procedure to generate models for urban environmental-noise pollution using feature selection and machine learning methods.

    PubMed

    Torija, Antonio J; Ruiz, Diego P

    2015-02-01

    The prediction of environmental noise in urban environments requires the solution of a complex and non-linear problem, since there are complex relationships among the multitude of variables involved in the characterization and modelling of environmental noise and environmental-noise magnitudes. Moreover, the inclusion of the great spatial heterogeneity characteristic of urban environments seems to be essential in order to achieve an accurate environmental-noise prediction in cities. This problem is addressed in this paper, where a procedure based on feature-selection techniques and machine-learning regression methods is proposed and applied to this environmental problem. Three machine-learning regression methods, which are considered very robust in solving non-linear problems, are used to estimate the energy-equivalent sound-pressure level descriptor (LAeq). These three methods are: (i) multilayer perceptron (MLP), (ii) sequential minimal optimisation (SMO), and (iii) Gaussian processes for regression (GPR). In addition, because of the high number of input variables involved in environmental-noise modelling and estimation in urban environments, which make LAeq prediction models quite complex and costly in terms of time and resources for application to real situations, three different techniques are used to approach feature selection or data reduction. The feature-selection techniques used are: (i) correlation-based feature-subset selection (CFS) and (ii) wrapper for feature-subset selection (WFS); the data-reduction technique is principal-component analysis (PCA). The subsequent analysis leads to a proposal of different schemes, depending on the needs regarding data collection and accuracy. The use of WFS as the feature-selection technique with the implementation of SMO or GPR as regression algorithm provides the best LAeq estimation (R^2 = 0.94 and mean absolute error (MAE) = 1.14-1.16 dB(A)). Copyright © 2014 Elsevier B.V. All rights reserved.
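    The kind of pipeline described, feature selection followed by a nonlinear regressor for LAeq, can be sketched with scikit-learn analogues of the paper's tools (SelectKBest standing in for CFS/WFS, GaussianProcessRegressor for GPR); the synthetic data below are placeholders, not the paper's measurement variables.

    ```python
    # Schematic feature-selection + Gaussian-process regression pipeline for an LAeq-like
    # target, using scikit-learn analogues of the paper's methods and synthetic data.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_regression
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(3)
    X = rng.standard_normal((300, 12))                  # 12 candidate urban/traffic descriptors
    laeq = 65 + 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(0, 1.0, 300)   # synthetic LAeq in dB(A)

    X_tr, X_te, y_tr, y_te = train_test_split(X, laeq, random_state=0)
    model = make_pipeline(
        SelectKBest(f_regression, k=4),                 # keep the most informative variables
        GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True),
    )
    model.fit(X_tr, y_tr)
    print("MAE [dB(A)]:", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
    ```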

  3. WE-D-303-01: Development and Application of Digital Human Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segars, P.

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: (1) Understand the need and requirements of computational phantoms in medical physics research; (2) discuss the developments and applications of computational phantoms; (3) know the promises and limitations of computational phantoms in solving complex problems.

  4. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence

    PubMed Central

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H.

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students’ CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence. PMID:26283992

  5. Enhancing chemistry problem-solving achievement using problem categorization

    NASA Astrophysics Data System (ADS)

    Bunce, Diane M.; Gabel, Dorothy L.; Samuel, John V.

    The enhancement of chemistry students' skill in problem solving through problem categorization is the focus of this study. Twenty-four students in a freshman chemistry course for health professionals are taught how to solve problems using the explicit method of problem solving (EMPS) (Bunce & Heikkinen, 1986). The EMPS is an organized approach to problem analysis which includes encoding the information given in a problem (Given, Asked For), relating this to what is already in long-term memory (Recall), and planning a solution (Overall Plan) before a mathematical solution is attempted. In addition to the EMPS training, treatment students receive three 40-minute sessions following achievement tests in which they are taught how to categorize problems. Control students use this time to review the EMPS solutions of test questions. Although problem categorization is involved in one section of the EMPS (Recall), treatment students who received specific training in problem categorization demonstrate significantly higher achievement on combination problems (those problems requiring the use of more than one chemical topic for their solution) at (p = 0.01) than their counterparts. Significantly higher achievement for treatment students is also measured on an unannounced test (p = 0.02). Analysis of interview transcripts of both treatment and control students illustrates a Rolodex approach to problem solving employed by all students in this study. The Rolodex approach involves organizing equations used to solve problems on mental index cards and flipping through them, matching units given when a new problem is to be solved. A second phenomenon observed during student interviews is the absence of a link in the conceptual understanding of the chemical concepts involved in a problem and the problem-solving skills employed to correctly solve problems. This study shows that explicit training in categorization skills and the EMPS can lead to higher achievement in complex problem-solving situations (combination problems and unannounced test). However, such achievement may be limited by the lack of linkages between students' conceptual understanding and improved problem-solving skill.

  6. Sustainability: Why the Language and Ethics of Sustainability Matter in the Geoscience Classroom

    ERIC Educational Resources Information Center

    Metzger, Ellen P.; Curren, Randall R.

    2017-01-01

    Because challenges to sustainability arise at the intersection of human and biophysical systems they are inescapably embedded in social contexts and involve multiple stakeholders with diverse and often conflicting needs and value systems. Addressing complex and solution-resistant problems such as climate change, biodiversity loss, and…

  7. USE OF REVA'S WEB-BASED ENVIRONMENTAL DECISION TOOLKIT (EDT) TO ASSESS VULNERABILITY TO MERCURY ACROSS THE UNITED STATES

    EPA Science Inventory

    The problem of assessing risk from mercury across the nation is extremely complex involving integration of 1) our understanding of the methylation process in ecosystems, 2) the identification and spatial distribution of sensitive populations, and 3) the spatial pattern of mercury...

  8. ASSESSING THE RISK ASSOCIATED WITH MERCURY: USING REVA'S WEBTOOL TO COMPARE DATA, ASSUMPTIONS, AND MODELS

    EPA Science Inventory

    The problem of assessing risk from mercury across the nation is extremely complex involving integration of 1) our understanding of the methylation process in ecosystems, 2) the identification and spatial distribution of sensitive populations, and 3) the spatial pattern of mercury...

  9. Unravelling Secondary Students' Challenges in Digital Literacy: A Gender Perspective

    ERIC Educational Resources Information Center

    Argelagós, Esther; Pifarré, Manoli

    2017-01-01

    The use of the Internet to learn involves complex cognitive activities. Educational researchers claim more attention in studying the nature of students' challenges when using digital information for learning purposes. Our research investigated in depth the challenges that secondary students face when solving web information-problem tasks. We…

  10. Computer program determines chemical composition of physical system at equilibrium

    NASA Technical Reports Server (NTRS)

    Kwong, S. S.

    1966-01-01

    FORTRAN 4 digital computer program calculates equilibrium composition of complex, multiphase chemical systems. This is a free energy minimization method with solution of the problem reduced to mathematical operations, without concern for the chemistry involved. Also certain thermodynamic properties are determined as byproducts of the main calculations.
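    The underlying free-energy-minimization idea can be illustrated for an ideal-gas mixture: minimize the dimensionless Gibbs energy subject to element-balance constraints. The species, reference potentials, and element abundances in the sketch below are invented placeholders; the sketch is not the NASA program itself.

    ```python
    # Sketch of free-energy minimization for an ideal-gas mixture: minimize G/RT subject to
    # element balance A @ n = b. All species data below are hypothetical placeholders.
    import numpy as np
    from scipy.optimize import minimize

    species = ["H2", "O2", "H2O"]
    mu0 = np.array([-10.0, -12.0, -45.0])          # hypothetical standard-state g_i/RT values
    A = np.array([[2, 0, 2],                       # H atoms per molecule of each species
                  [0, 2, 1]])                      # O atoms per molecule
    b = np.array([4.0, 3.0])                       # total moles of H and O atoms

    def gibbs(n):
        n = np.maximum(n, 1e-12)                   # guard the logarithm
        return np.sum(n * (mu0 + np.log(n / n.sum())))

    res = minimize(gibbs, x0=np.ones(3),
                   constraints=[{"type": "eq", "fun": lambda n: A @ n - b}],
                   bounds=[(1e-10, None)] * 3, method="SLSQP")
    print(dict(zip(species, np.round(res.x, 4))))  # mostly H2O, plus excess O2
    ```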

  11. The Role of Sleep in Childhood Psychiatric Disorders

    ERIC Educational Resources Information Center

    Alfano, Candice A.; Gamble, Amanda L.

    2009-01-01

    Although sleep problems often comprise core features of psychiatric disorders, inadequate attention has been paid to the complex, reciprocal relationships involved in the early regulation of sleep, emotion, and behavior. In this paper, we review the pediatric literature examining sleep in children with primary psychiatric disorders as well as…

  12. Reducing Bullying: Application of Social Cognitive Theory

    ERIC Educational Resources Information Center

    Swearer, Susan M.; Wang, Cixin; Berry, Brandi; Myers, Zachary R.

    2014-01-01

    Social cognitive theory (SCT) is an important heuristic for understanding the complexity of bullying behaviors and the social nature of involvement in bullying. Bullying has been heralded as a social relationship problem, and the interplay between the individual and his or her social environment supports this conceptualization. SCT has been used…

  13. Academic Procrastination: Frequency and Cognitive-Behavioral Correlates.

    ERIC Educational Resources Information Center

    Solomon, Laura J.; Rothblum, Esther D.

    1984-01-01

    Investigated the frequency of and reasons for college students' (N=342) procrastination on academic tasks. A high percentage of students reported problems with procrastination. Results indicated that procrastination is not solely a deficit in study habits or time management but involves a complex interaction of behavioral, cognitive, and affective…

  14. Temporal Constraint Reasoning With Preferences

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Morris, Paul; Morris, Robert; Rossi, Francesca

    2001-01-01

    A number of reasoning problems involving the manipulation of temporal information can naturally be viewed as implicitly inducing an ordering of potential local decisions involving time (specifically, associated with durations or orderings of events) on the basis of preferences. For example, a pair of events might be constrained to occur in a certain order, and, in addition, it might be preferable that the delay between them be as large, or as small, as possible. This paper explores problems in which a set of temporal constraints is specified, where each constraint is associated with preference criteria for making local decisions about the events involved in the constraint, and a reasoner must infer a complete solution to the problem such that, to the extent possible, these local preferences are met in the best way. A constraint framework for reasoning about time is generalized to allow for preferences over event distances and durations, and we study the complexity of solving problems in the resulting formalism. It is shown that while in general such problems are NP-hard, some restrictions on the shape of the preference functions, and on the structure of the preference set, can be enforced to achieve tractability. In these cases, a simple generalization of a single-source shortest path algorithm can be used to compute a globally preferred solution in polynomial time.
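    The preference-free core that this work generalizes, the Simple Temporal Problem, is itself solvable by shortest-path reasoning: the problem is consistent exactly when its distance graph has no negative cycle. A Bellman-Ford style check is sketched below with made-up events and bounds; the preference machinery of the paper is not shown.

    ```python
    # Simple Temporal Problem consistency check via negative-cycle detection.
    # Constraints "lo <= t_j - t_i <= hi" become edges i->j (weight hi) and j->i (weight -lo).
    def stp_consistent(num_events, constraints):
        """constraints: list of (i, j, lower, upper) meaning lower <= t_j - t_i <= upper."""
        edges = []
        for i, j, lo, hi in constraints:
            edges.append((i, j, hi))     # t_j - t_i <= hi
            edges.append((j, i, -lo))    # t_i - t_j <= -lo
        dist = [0.0] * num_events        # zero initial distances (virtual source trick)
        for _ in range(num_events - 1):
            for u, v, w in edges:
                dist[v] = min(dist[v], dist[u] + w)
        # If any edge can still be relaxed, there is a negative cycle (inconsistency).
        return all(dist[u] + w >= dist[v] for u, v, w in edges)

    # Three events: start, A, B, with A 2-5 units after start, B 1-2 after A, B at most 4 after start.
    print(stp_consistent(3, [(0, 1, 2, 5), (1, 2, 1, 2), (0, 2, 0, 4)]))   # True
    print(stp_consistent(3, [(0, 1, 4, 5), (1, 2, 3, 4), (0, 2, 0, 4)]))   # False
    ```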

  15. A modified Wright-Fisher model that incorporates Ne: A variant of the standard model with increased biological realism and reduced computational complexity.

    PubMed

    Zhao, Lei; Gossmann, Toni I; Waxman, David

    2016-03-21

    The Wright-Fisher model is an important model in evolutionary biology and population genetics. It has been applied in numerous analyses of finite populations with discrete generations. It is recognised that real populations can behave, in some key aspects, as though their size is not the census size, N, but rather a smaller size, namely the effective population size, Ne. However, in the Wright-Fisher model, there is no distinction between the effective and census population sizes. Equivalently, we can say that in this model, Ne coincides with N. The Wright-Fisher model therefore lacks an important aspect of biological realism. Here, we present a method that allows Ne to be directly incorporated into the Wright-Fisher model. The modified model involves matrices whose size is determined by Ne. Thus, apart from increased biological realism, the modified model also has reduced computational complexity, particularly so when Ne ≪ N. For complex problems, it may be hard or impossible to numerically analyse the most commonly-used approximation of the Wright-Fisher model that incorporates Ne, namely the diffusion approximation. An alternative approach is simulation. However, the simulations need to be sufficiently detailed that they yield an effective size that is different to the census size. Simulations may also be time consuming and have attendant statistical errors. The method presented in this work may then be the only alternative to simulations when Ne differs from N. We illustrate the straightforward application of the method to some problems involving allele fixation and the determination of the equilibrium site frequency spectrum. We then apply the method to the problem of fixation when three alleles are segregating in a population. This latter problem is significantly more complex than a two-allele problem and, since the diffusion equation cannot be numerically solved, the only other way Ne can be incorporated into the analysis is by simulation. We have achieved good accuracy in all cases considered. In summary, the present work extends the realism and tractability of an important model of evolutionary biology and population genetics. Copyright © 2016 Elsevier Ltd. All rights reserved.
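    The spirit of the modification, transition matrices whose dimension is set by Ne rather than N, can be illustrated with the standard neutral Wright-Fisher matrix and an absorbing-chain calculation of fixation probabilities, as below; the authors' specific construction may differ in detail.

    ```python
    # Schematic: build a neutral Wright-Fisher transition matrix of dimension (Ne+1) and
    # compute fixation probabilities from the absorbing chain. Illustrative only.
    import numpy as np
    from scipy.stats import binom

    def wf_matrix(Ne):
        """P[i, j] = P(j copies next generation | i copies now), matrix of size (Ne+1)^2."""
        i = np.arange(Ne + 1)
        return binom.pmf(np.arange(Ne + 1)[None, :], Ne, (i / Ne)[:, None])

    def fixation_probabilities(Ne):
        """Probability of eventual fixation from each starting copy number (neutral case: i/Ne)."""
        P = wf_matrix(Ne)
        interior = np.arange(1, Ne)                       # transient (non-absorbed) states
        Q = P[np.ix_(interior, interior)]
        r = P[interior, Ne]                               # one-step probability of hitting fixation
        u = np.linalg.solve(np.eye(Ne - 1) - Q, r)        # u = (I - Q)^{-1} r
        return np.concatenate(([0.0], u, [1.0]))

    Ne = 50
    u = fixation_probabilities(Ne)
    print("fixation prob. starting from 5 copies:", round(u[5], 4))   # ~ 5/Ne = 0.1 (neutral)
    ```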

  16. Development of a Composite Tailoring Procedure for Airplane Wings

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi

    2000-01-01

    The quest for finding optimum solutions to engineering problems has existed for a long time. In modern times, the development of optimization as a branch of applied mathematics is regarded to have originated in the works of Newton, Bernoulli and Euler. Venkayya has presented a historical perspective on optimization in [1]. The term 'optimization' is defined by Ashley [2] as a procedure "...which attempts to choose the variables in a design process so as formally to achieve the best value of some performance index while not violating any of the associated conditions or constraints". Ashley presented an extensive review of practical applications of optimization in the aeronautical field till about 1980 [2]. It was noted that there existed an enormous amount of published literature in the field of optimization, but its practical applications in industry were very limited. Over the past 15 years, though, optimization has been widely applied to address practical problems in aerospace design [3-5]. The design of high performance aerospace systems is a complex task. It involves the integration of several disciplines such as aerodynamics, structural analysis, dynamics, and aeroelasticity. The problem involves multiple objectives and constraints pertaining to the design criteria associated with each of these disciplines. Many important trade-offs exist between the parameters involved which are used to define the different disciplines. Therefore, the development of multidisciplinary design optimization (MDO) techniques, in which different disciplines and design parameters are coupled into a closed loop numerical procedure, seems appropriate to address such a complex problem. The importance of MDO in successful design of aerospace systems has been long recognized. Recent developments in this field have been surveyed by Sobieszczanski-Sobieski and Haftka [6].

  17. Program Helps To Determine Chemical-Reaction Mechanisms

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.; Radhakrishnan, K.

    1995-01-01

    General Chemical Kinetics and Sensitivity Analysis (LSENS) computer code developed for use in solving complex, homogeneous, gas-phase, chemical-kinetics problems. Provides for efficient and accurate chemical-kinetics computations and for sensitivity analysis for a variety of problems, including problems involving nonisothermal conditions. Incorporates mathematical models for static system, steady one-dimensional inviscid flow, reaction behind incident shock wave (with boundary-layer correction), and perfectly stirred reactor. Computations of equilibrium properties performed for following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. Written in FORTRAN 77 with exception of NAMELIST extensions used for input.

  18. Finding order in complexity: themes from the career of Dr. Robert F. Wagner

    NASA Astrophysics Data System (ADS)

    Myers, Kyle J.

    2009-02-01

    Over the course of his long and productive career, Dr. Robert F. Wagner built a framework for the evaluation of imaging systems based on a task-based, decision-theoretic approach. His most recent contributions involved the consideration of the random effects associated with multiple readers of medical images and the logical extension of this work to the problem of the evaluation of multiple competing classifiers in statistical pattern recognition. This contemporary work expanded on familiar themes from Bob's many SPIE presentations in earlier years. It was driven by the need for practical solutions to current problems facing FDA's Center for Devices and Radiological Health and the medical imaging community regarding the assessment of new computer-aided diagnosis tools, and by Bob's unique ability to unify concepts across a range of disciplines as he gave order to increasingly complex problems in our field.

  19. Precision time distribution within a deep space communications complex

    NASA Technical Reports Server (NTRS)

    Curtright, J. B.

    1972-01-01

    The Precision Time Distribution System (PTDS) at the Goldstone Deep Space Communications Complex is a practical application of existing technology to the solution of a local problem. The problem was to synchronize four station timing systems to a master source with a relative accuracy consistently and significantly better than 10 microseconds. The solution involved combining a precision timing source, an automatic error detection assembly, and a microwave distribution network into an operational system. Upon activation of the completed PTDS two years ago, synchronization accuracy at Goldstone (two-station relative) was improved by an order of magnitude. It is felt that the validation of the PTDS mechanization is now complete. Other facilities which have site dispersion and synchronization accuracy requirements similar to Goldstone may find the PTDS mechanization useful in solving their problems. At present, the two-station relative synchronization accuracy at Goldstone is better than one microsecond.

  20. Perspective: Quantum mechanical methods in biochemistry and biophysics.

    PubMed

    Cui, Qiang

    2016-10-14

    In this perspective article, I discuss several research topics relevant to quantum mechanical (QM) methods in biophysical and biochemical applications. Due to the immense complexity of biological problems, the key is to develop methods that are able to strike the proper balance of computational efficiency and accuracy for the problem of interest. Therefore, in addition to the development of novel ab initio and density functional theory based QM methods for the study of reactive events that involve complex motifs such as transition metal clusters in metalloenzymes, it is equally important to develop inexpensive QM methods and advanced classical or quantal force fields to describe different physicochemical properties of biomolecules and their behaviors in complex environments. Maintaining a solid connection of these more approximate methods with rigorous QM methods is essential to their transferability and robustness. Comparison to diverse experimental observables helps validate computational models and mechanistic hypotheses as well as driving further development of computational methodologies.

  1. A comparative study of turbulence models for overset grids

    NASA Technical Reports Server (NTRS)

    Renze, Kevin J.; Buning, Pieter G.; Rajagopalan, R. G.

    1992-01-01

    The implementation of two different types of turbulence models for a flow solver using the Chimera overset grid method is examined. Various turbulence model characteristics, such as length scale determination and transition modeling, are found to have a significant impact on the computed pressure distribution for a multielement airfoil case. No inherent problem is found with using either algebraic or one-equation turbulence models with an overset grid scheme, but simulation of turbulence for multiple-body or complex geometry flows is very difficult regardless of the gridding method. For complex geometry flowfields, modification of the Baldwin-Lomax turbulence model is necessary to select the appropriate length scale in wall-bounded regions. The overset grid approach presents no obstacle to use of a one- or two-equation turbulence model. Both Baldwin-Lomax and Baldwin-Barth models have problems providing accurate eddy viscosity levels for complex multiple-body flowfields such as those involving the Space Shuttle.

  2. Explicit solution techniques for impact with contact constraints

    NASA Technical Reports Server (NTRS)

    Mccarty, Robert E.

    1993-01-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  3. Explicit solution techniques for impact with contact constraints

    NASA Astrophysics Data System (ADS)

    McCarty, Robert E.

    1993-08-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  4. Practicality of electronic beam steering for MST/ST radars, part 6.2A

    NASA Technical Reports Server (NTRS)

    Clark, W. L.; Green, J. L.

    1984-01-01

    Electronic beam steering is often described as complex and expensive. The Sunset implementation of electronic steering is described, and it is demonstrated that such systems are cost effective, versatile, and no more complex than fixed-beam alternatives, provided three or more beams are needed. The problem of determining accurate meteorological wind components in the presence of spatial variation is considered. A cost comparison of steerable and fixed systems allowing solution of this problem is given. The concepts and relations involved in phase steering are presented, followed by a description of the Sunset ST radar steering system. The implications are discussed, references to the competing SAD method are provided, and a recommendation concerning the design of future Doppler ST/MST systems is made.

  5. Teaching NMR spectra analysis with nmr.cheminfo.org.

    PubMed

    Patiny, Luc; Bolaños, Alejandro; Castillo, Andrés M; Bernal, Andrés; Wist, Julien

    2018-06-01

    Teaching spectra analysis and structure elucidation requires students to be trained on real problems. This involves solving exercises of increasing complexity and, when necessary, using computational tools. Although desktop software packages exist for this purpose, the nmr.cheminfo.org platform offers students an online alternative. It provides a set of exercises and tools to help solve them. Only a small number of exercises are currently available, but contributors are invited to submit new ones and suggest new types of problems. Copyright © 2018 John Wiley & Sons, Ltd.

  6. Communications network design and costing model technical manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    This computer model provides the capability for analyzing long-haul trunking networks comprising a set of user-defined cities, traffic conditions, and tariff rates. Networks may consist of all terrestrial connectivity, all satellite connectivity, or a combination of terrestrial and satellite connectivity. Network solutions provide the least-cost routes between all cities, the least-cost network routing configuration, and terrestrial and satellite service cost totals. The CNDC model allows analyses involving three specific FCC-approved tariffs, which are uniquely structured and representative of most existing service connectivity and pricing philosophies. User-defined tariffs that can be variations of these three tariffs are accepted as input to the model and allow considerable flexibility in network problem specification. The resulting model extends the domain of network analysis from traditional fixed link cost (distance-sensitive) problems to more complex problems involving combinations of distance and traffic-sensitive tariffs.

  7. Stress Recovery and Error Estimation for Shell Structures

    NASA Technical Reports Server (NTRS)

    Yazdani, A. A.; Riggs, H. R.; Tessler, A.

    2000-01-01

    The Penalized Discrete Least-Squares (PDLS) stress recovery (smoothing) technique developed for two dimensional linear elliptic problems is adapted here to three-dimensional shell structures. The surfaces are restricted to those which have a 2-D parametric representation, or which can be built-up of such surfaces. The proposed strategy involves mapping the finite element results to the 2-D parametric space which describes the geometry, and smoothing is carried out in the parametric space using the PDLS-based Smoothing Element Analysis (SEA). Numerical results for two well-known shell problems are presented to illustrate the performance of SEA/PDLS for these problems. The recovered stresses are used in the Zienkiewicz-Zhu a posteriori error estimator. The estimated errors are used to demonstrate the performance of SEA-recovered stresses in automated adaptive mesh refinement of shell structures. The numerical results are encouraging. Further testing involving more complex, practical structures is necessary.

  8. A solution to the surface intersection problem. [Boolean functions in geometric modeling

    NASA Technical Reports Server (NTRS)

    Timer, H. G.

    1977-01-01

    An application-independent geometric model within a data base framework should support the use of Boolean operators which allow the user to construct a complex model by appropriately combining a series of simple models. The use of these operators leads to the concept of implicitly and explicitly defined surfaces. With an explicitly defined model, the surface area may be computed by simply summing the surface areas of the bounding surfaces. For an implicitly defined model, the surface area computation must deal with active and inactive regions. Because the surface intersection problem involves four unknowns and its solution is a space curve, the parametric coordinates of each surface must be determined as a function of the arc length. Various subproblems involved in the general intersection problem are discussed, and the mathematical basis for their solution is presented along with a program written in FORTRAN IV for implementation on the IBM 370 TSO system.

  9. A Control Variate Method for Probabilistic Performance Assessment. Improved Estimates for Mean Performance Quantities of Interest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacKinnon, Robert J.; Kuhlman, Kristopher L

    2016-05-01

    We present a method of control variates for calculating improved estimates for mean performance quantities of interest, E(PQI), computed from Monte Carlo probabilistic simulations. An example of a PQI is the concentration of a contaminant at a particular location in a problem domain computed from simulations of transport in porous media. To simplify the presentation, the method is described in the setting of a one-dimensional elliptic model problem involving a single uncertain parameter represented by a probability distribution. The approach can be easily implemented for more complex problems involving multiple uncertain parameters, and in particular for application to probabilistic performance assessment of deep geologic nuclear waste repository systems. Numerical results indicate the method can produce estimates of E(PQI) having superior accuracy on coarser meshes and reduce the number of simulations needed to achieve an acceptable estimate.
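
    A minimal, generic sketch of the control-variate idea (not the repository-specific implementation): the Monte Carlo estimate of E[Y] is corrected with a cheaper correlated quantity X whose mean is known exactly, which lowers the estimator variance. The toy quantity of interest below is illustrative only.

      import numpy as np

      def control_variate_mean(y, x, x_mean_exact):
          """Control-variate estimate of E[Y] using samples of a correlated X
          whose exact mean is known.  The optimal coefficient is
          beta = Cov(Y, X) / Var(X), and the corrected estimator is
          mean(Y) - beta * (mean(X) - E[X])."""
          beta = np.cov(y, x)[0, 1] / np.var(x, ddof=1)
          return np.mean(y) - beta * (np.mean(x) - x_mean_exact)

      rng = np.random.default_rng(0)
      u = rng.uniform(size=10_000)     # uncertain parameter samples
      y = np.exp(u)                    # expensive quantity of interest (toy PQI)
      x = u                            # cheap control variate with known mean 0.5
      print(np.mean(y), control_variate_mean(y, x, 0.5))  # both near e - 1 ≈ 1.718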

  10. The evaluative imaging of mental models - Visual representations of complexity

    NASA Technical Reports Server (NTRS)

    Dede, Christopher

    1989-01-01

    The paper deals with some design issues involved in building a system that could visually represent the semantic structures of training materials and their underlying mental models. In particular, hypermedia-based semantic networks that instantiate classification problem solving strategies are thought to be a useful formalism for such representations; the complexity of these web structures can be best managed through visual depictions. It is also noted that a useful approach to implement in these hypermedia models would be some metrics of conceptual distance.

  11. Stock-car racing makes intuitive physicists

    NASA Astrophysics Data System (ADS)

    Gwynne, Peter

    2008-03-01

    Formula One races involve cars festooned with gadgets and complex electronic devices, in which millions of dollars are spent refining a vehicle's aerodynamics and reducing its weight. But in events run by America's National Association of Stock Car Auto Racing (NASCAR), cars hurtle round an oval track at speeds of about 300 km h-1 without the help of the complex sensors that are employed in Formula One cars. To avoid crashing, drivers must make their own adjustments to track conditions, engine problems and the traffic around them.

  12. Transient modeling/analysis of hyperbolic heat conduction problems employing mixed implicit-explicit alpha method

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; D'Costa, Joseph F.

    1991-01-01

    This paper describes the evaluation of mixed implicit-explicit finite element formulations for hyperbolic heat conduction problems involving non-Fourier effects. In particular, mixed implicit-explicit formulations employing the alpha method proposed by Hughes et al. (1987, 1990) are described for the numerical simulation of hyperbolic heat conduction models, which involve time-dependent relaxation effects. Existing analytical approaches for modeling/analysis of such models involve complex mathematical formulations for obtaining closed-form solutions, while in certain numerical formulations the difficulties include severe oscillatory solution behavior (which often disguises the true response) in the vicinity of the thermal disturbances, which propagate with finite velocities. In view of these factors, the alpha method is evaluated to assess the control of the amount of numerical dissipation for predicting the transient propagating thermal disturbances. Numerical test models are presented, and pertinent conclusions are drawn for the mixed-time integration simulation of hyperbolic heat conduction models involving non-Fourier effects.

  13. A scheme to calculate higher-order homogenization as applied to micro-acoustic boundary value problems

    NASA Astrophysics Data System (ADS)

    Vagh, Hardik A.; Baghai-Wadji, Alireza

    2008-12-01

    Current technological challenges in materials science and high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out the sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves a sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of the cases only zero-order terms are constructed due to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher order terms, and thus, guaranteeing higher accuracy and greater robustness of the numerical results. We present

  14. Improved teaching-learning-based and JAYA optimization algorithms for solving flexible flow shop scheduling problems

    NASA Astrophysics Data System (ADS)

    Buddala, Raviteja; Mahapatra, Siba Sankar

    2017-11-01

    The flexible flow shop (or hybrid flow shop) scheduling problem is an extension of the classical flow shop scheduling problem. In a simple flow shop configuration, a job having `g' operations is performed on `g' operation centres (stages), with each stage having only one machine. If any stage contains more than one machine to provide an alternate processing facility, the problem becomes a flexible flow shop problem (FFSP). The FFSP, which contains all the complexities involved in simple flow shop and parallel machine scheduling problems, is a well-known NP-hard (non-deterministic polynomial time) problem. Owing to the high computational complexity involved in solving these problems, it is not always possible to obtain an optimal solution in a reasonable computation time. To obtain near-optimal solutions in a reasonable computation time, a large variety of meta-heuristics have been proposed in the past. However, tuning algorithm-specific parameters for solving the FFSP is rather tricky and time-consuming. To address this limitation, teaching-learning-based optimization (TLBO) and the JAYA algorithm are chosen for the study because they are not only recent meta-heuristics but also require no tuning of algorithm-specific parameters. Although these algorithms seem elegant, they lose solution diversity after a few iterations and get trapped at local optima. To alleviate this drawback, a new local search procedure is proposed in this paper to improve solution quality. Further, a mutation strategy (inspired by the genetic algorithm) is incorporated into the basic algorithms to maintain solution diversity in the population. Computational experiments have been conducted on standard benchmark problems to calculate makespan and computational time. The rate of convergence of TLBO is found to be superior to that of JAYA. From the results, it is found that TLBO and JAYA outperform many algorithms reported in the literature and can be treated as efficient methods for solving the FFSP.
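
    As a hedged illustration of the parameter-free character of JAYA mentioned above (a continuous toy problem, not the FFSP encoding or the improved variants of the paper), the basic update moves each candidate toward the current best solution and away from the worst, using only two random numbers per variable.

      import numpy as np

      def jaya(objective, bounds, pop_size=20, iters=200, seed=0):
          """Basic JAYA: no algorithm-specific tuning parameters.
          Each candidate moves toward the current best and away from the worst:
              x' = x + r1*(best - |x|) - r2*(worst - |x|)
          """
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(bounds, dtype=float).T
          pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
          for _ in range(iters):
              f = np.array([objective(x) for x in pop])
              best, worst = pop[f.argmin()], pop[f.argmax()]
              r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
              cand = np.clip(pop + r1*(best - np.abs(pop)) - r2*(worst - np.abs(pop)), lo, hi)
              fc = np.array([objective(x) for x in cand])
              improved = fc < f
              pop[improved] = cand[improved]       # greedy acceptance
          f = np.array([objective(x) for x in pop])
          return pop[f.argmin()], f.min()

      # Toy example: minimize the sphere function in 3 dimensions.
      x_best, f_best = jaya(lambda x: float(np.sum(x**2)), bounds=[(-5, 5)]*3)
      print(x_best, f_best)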

  15. Quantum speedup in solving the maximal-clique problem

    NASA Astrophysics Data System (ADS)

    Chang, Weng-Long; Yu, Qi; Li, Zhaokai; Chen, Jiahui; Peng, Xinhua; Feng, Mang

    2018-03-01

    The maximal-clique problem, to find the maximally sized clique in a given graph, is classically an NP-complete computational problem, which has potential applications ranging from electrical engineering, computational chemistry, and bioinformatics to social networks. Here we develop a quantum algorithm to solve the maximal-clique problem for any graph G with n vertices with quadratic speedup over its classical counterparts, where the time and spatial complexities are reduced to O(√(2^n)) and O(n^2), respectively. With respect to oracle-related quantum algorithms for the NP-complete problems, we identify our algorithm as optimal. To justify the feasibility of the proposed quantum algorithm, we successfully solve a typical clique problem for a graph G with two vertices and one edge by carrying out a nuclear magnetic resonance experiment involving four qubits.
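
    For contrast with the quantum speedup claimed above, a brute-force classical check (on a hypothetical small graph) scans all 2^n vertex subsets, which is exactly the exponential cost the quantum algorithm reduces to roughly √(2^n) oracle queries.

      from itertools import combinations

      def max_clique_bruteforce(n, edges):
          """Return one maximum clique of a graph with vertices 0..n-1.
          Scans subsets from largest to smallest, so it only works for small n."""
          adjacency = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
          for size in range(n, 0, -1):
              for subset in combinations(range(n), size):
                  if all((u, v) in adjacency for u, v in combinations(subset, 2)):
                      return subset    # first clique found at this size is maximum
          return ()

      # Hypothetical 5-vertex graph: the triangle {0, 1, 2} is the maximum clique.
      print(max_clique_bruteforce(5, [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4)]))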

  16. Planning and Scheduling for Fleets of Earth Observing Satellites

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Jonsson, Ari; Morris, Robert; Smith, David E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    We address the problem of scheduling observations for a collection of earth observing satellites. This scheduling task is a difficult optimization problem, potentially involving many satellites, hundreds of requests, constraints on when and how to service each request, and resources such as instruments, recording devices, transmitters, and ground stations. High-fidelity models are required to ensure the validity of schedules; at the same time, the size and complexity of the problem makes it unlikely that systematic optimization search methods will be able to solve them in a reasonable time. This paper presents a constraint-based approach to solving the Earth Observing Satellites (EOS) scheduling problem, and proposes a stochastic heuristic search method for solving it.
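
    A toy sketch of a stochastic heuristic for this kind of request scheduling (hypothetical requests and priorities, far simpler than the constraint-based EOS model above): repeatedly shuffle the requests, accept greedily subject to non-overlapping time windows, and keep the best-valued schedule found.

      import random

      def stochastic_greedy_schedule(requests, restarts=200, seed=0):
          """Toy stochastic heuristic for observation scheduling.
          Each request is (name, start, end, priority); a schedule is valid
          if no two accepted time windows overlap."""
          rng = random.Random(seed)
          best, best_value = [], 0.0
          for _ in range(restarts):
              order = requests[:]
              rng.shuffle(order)
              schedule = []
              for name, start, end, priority in order:
                  if all(end <= s or start >= e for _, s, e, _ in schedule):
                      schedule.append((name, start, end, priority))
              value = sum(p for _, _, _, p in schedule)
              if value > best_value:
                  best, best_value = schedule, value
          return best, best_value

      # Hypothetical observation requests with overlapping windows and priorities.
      reqs = [("A", 0, 4, 3.0), ("B", 3, 6, 2.0), ("C", 5, 9, 4.0), ("D", 1, 2, 1.5)]
      print(stochastic_greedy_schedule(reqs))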

  17. Study of mutual influence of hydrogen bonds in complicated complexes by low-temperature 1H NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Golubev, N. S.; Denisov, G. S.

    1992-07-01

    1H NMR spectra of various acid-base complexes of different stoichiometry at 100-120 K in freon mixtures have been obtained. The separate signals of non-equivalent OH protons, involved in different H-bonds, have allowed us to consider the problem of the mutual influence of these bonds, using a correlation between the δ(OH) chemical shift and the ΔH H-bond enthalpy. The mutual strengthening of H-bonds in complexes of the AH⋯AH⋯B type and their weakening in AH⋯B⋯HA complexes have been found, the value of the effect being about 10-30%.

  18. Advancing water resource management in agricultural, rural, and urbanizing watersheds: Enhancing University involvement

    USDA-ARS?s Scientific Manuscript database

    In this research editorial we make four points relative to solving water resource issues: (1) they are complex problems and difficult to solve, (2) some progress has been made on solving these issues, (3) external non-stationary drivers such as land use changes, climate change and variability, and s...

  19. Medical Student and Junior Doctors' Tolerance of Ambiguity: Development of a New Scale

    ERIC Educational Resources Information Center

    Hancock, Jason; Roberts, Martin; Monrouxe, Lynn; Mattick, Karen

    2015-01-01

    The practice of medicine involves inherent ambiguity, arising from limitations of knowledge, diagnostic problems, complexities of treatment and outcome and unpredictability of patient response. Research into doctors' tolerance of ambiguity is hampered by poor conceptual clarity and inadequate measurement scales. We aimed to create and pilot a…

  20. Estimation of Complex Generalized Linear Mixed Models for Measurement and Growth

    ERIC Educational Resources Information Center

    Jeon, Minjeong

    2012-01-01

    Maximum likelihood (ML) estimation of generalized linear mixed models (GLMMs) is technically challenging because of the intractable likelihoods that involve high dimensional integrations over random effects. The problem is magnified when the random effects have a crossed design and thus the data cannot be reduced to small independent clusters. A…

  1. Teenaged Pregnancy. Matrix No. 5.

    ERIC Educational Resources Information Center

    Hardy, Janet B.

    The purposes of this paper are (1) to highlight some of the complex issues involved in teenage pregnancy and its consequences; (2) to comment on some of the problems that make solutions difficult to achieve; and (3) to indicate areas in which further research is of critical importance. Among the issues of teenage pregnancy discussed are the…

  2. A Physician/Psychologist Team Approach to Children and Adolescents with Recurrent Somatic Complaints.

    ERIC Educational Resources Information Center

    Greene, John W.; Thompson, Warren

    1984-01-01

    Children and adolescents with recurrent somatic complaints represent some of the most complex school health-related problems encountered by school officials and physicians. These complaints account for missed days and are often the primary reason for prolonged absences. A collaborative approach involving a school psychologist and physician team is…

  3. Stone Soup Partnership: A Grassroots Model of Community Service.

    ERIC Educational Resources Information Center

    Kittredge, Robert E.

    1997-01-01

    Stone Soup Partnership is a collaboration between California State University at Fresno and its surrounding community to address serious problems in a high-crime, impoverished apartment complex near the university. The program involves students in service learning for university credit, and has expanded from a single summer youth program to a…

  4. Crisis of Black Athletes on the Eve of the 21st Century.

    ERIC Educational Resources Information Center

    Edwards, Harry

    2000-01-01

    Asserts that the dynamics of black youth sports involvement and the blind faith of black youths and families in sports as a vehicle for self-realization and socioeconomic advancement generate complex problems for black society. Black families often push their children toward sports careers, neglecting personal and cultural development. Discusses…

  5. Current Practice in Psychopharmacology for Children and Adolescents with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Floyd, Elizabeth Freeman; McIntosh, David E.

    2009-01-01

    Autism spectrum disorders (ASDs) are a complex group of neurodevelopmental conditions that develop in early childhood and involve a range of impairments in core areas of social interaction, communication, and restricted behavior and interests. Associated behavioral problems such as tantrums, aggression, and self-injury frequently compound the core…

  6. Using Immersive Virtual Environments for Certification

    NASA Technical Reports Server (NTRS)

    Lutz, R.; Cruz-Neira, C.

    1998-01-01

    Immersive virtual environments (VEs) technology has matured to the point where it can be utilized as a scientific and engineering problem solving tool. In particular, VEs are starting to be used to design and evaluate safety-critical systems that involve human operators, such as flight and driving simulators, complex machinery training, and emergency rescue strategies.

  7. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    ERIC Educational Resources Information Center

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  8. The Transformation from Multidisciplinarity to Interdisciplinarity: A Case Study of a Course Involving the Status of Arab Citizens of Israel

    ERIC Educational Resources Information Center

    Tayler, Marilyn R.

    2014-01-01

    The author demonstrates that entry-level students can achieve a more comprehensive understanding of complex problems through an explicitly interdisciplinary approach than through a merely multidisciplinary approach, using the process described in Repko's (2014) "Introduction to Interdisciplinary Studies." Repko takes the…

  9. Mental Images and the Modification of Learning Defects.

    ERIC Educational Resources Information Center

    Patten, Bernard M.

    Because human memory and thought involve extremely complex processes, it is possible to employ unusual modalities and specific visual strategies for remembering and problem-solving to assist patients with memory defects. This three-part paper discusses some of the research in the field of human memory and describes practical applications of these…

  10. Child impulsiveness-inattention, early peer experiences, and the development of early onset conduct problems.

    PubMed

    Snyder, James; Prichard, Joy; Schrepferman, Lynn; Patrick, M Renee; Stoolmiller, Mike

    2004-12-01

    The conjoint influence of child impulsiveness-inattention (I/I) and peer relationships on growth trajectories of conduct problems was assessed in a community sample of 267 boys and girls. I/I reliably predicted teacher- and parent-reported conduct problems at kindergarten entry and growth in those problems over the next 2 years for boys and girls. The relation of boys' I/I to conduct problems was mediated, in part, by peer rejection and involvement in coercive exchanges with peers. The relation of girls' I/I to conduct problems was less clearly mediated by peer processes, but peer difficulties had additive effects. The impact of peer relationships on trajectories of conduct problems was apparent to parents as well as to teachers. Although I/I increments risk for early and persisting conduct problems in concert with poor peer relationships, it does so in complex and gender-specific ways.

  11. The Movable Type Method Applied to Protein-Ligand Binding.

    PubMed

    Zheng, Zheng; Ucisik, Melek N; Merz, Kenneth M

    2013-12-10

    Accurately computing the free energy for biological processes like protein folding or protein-ligand association remains a challenging problem. Both describing the complex intermolecular forces involved and sampling the requisite configuration space make understanding these processes innately difficult. Herein, we address the sampling problem using a novel methodology we term "movable type". Conceptually it can be understood by analogy with the evolution of printing and, hence, the name movable type. For example, a common approach to the study of protein-ligand complexation involves taking a database of intact drug-like molecules and exhaustively docking them into a binding pocket. This is reminiscent of early woodblock printing where each page had to be laboriously created prior to printing a book. However, printing evolved to an approach where a database of symbols (letters, numerals, etc.) was created and then assembled using a movable type system, which allowed for the creation of all possible combinations of symbols on a given page, thereby revolutionizing the dissemination of knowledge. Our movable type (MT) method involves the identification of all atom pairs seen in protein-ligand complexes and then creating two databases: one with their associated pairwise distance-dependent energies and another associated with the probability of how these pairs can combine in terms of bonds, angles, dihedrals and non-bonded interactions. Combining these two databases coupled with the principles of statistical mechanics allows us to accurately estimate binding free energies as well as the pose of a ligand in a receptor. This method, by its mathematical construction, samples all of configuration space of a selected region (the protein active site here) in one shot without resorting to brute force sampling schemes involving Monte Carlo, genetic algorithms or molecular dynamics simulations, making the methodology extremely efficient. Importantly, this method explores the free energy surface, eliminating the need to estimate the enthalpy and entropy components individually. Finally, low free energy structures can be obtained via a free energy minimization procedure yielding all low free energy poses on a given free energy surface. Besides revolutionizing the protein-ligand docking and scoring problem, this approach can be utilized in a wide range of applications in computational biology which involve the computation of free energies for systems with extensive phase spaces, including protein folding, protein-protein docking and protein design.
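
    A toy illustration of the statistical-mechanical bookkeeping such exhaustive-enumeration methods rely on (hypothetical pose energies, not the MT pairwise databases): given an enumerated set of configuration energies, the free energy estimate is the Boltzmann-weighted log-sum over all of them rather than the energy of any single pose.

      import numpy as np

      K_B = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

      def free_energy(energies_kcal, temperature=298.15):
          """Free energy of an enumerated set of configurations:
          F = -kT * ln( sum_i exp(-E_i / kT) ), evaluated with a log-sum-exp
          shift for numerical stability."""
          kT = K_B * temperature
          e = np.asarray(energies_kcal)
          e_min = e.min()
          return e_min - kT * np.log(np.sum(np.exp(-(e - e_min) / kT)))

      # Hypothetical pose energies (kcal/mol) for a ligand in a binding pocket.
      poses = [-8.2, -7.9, -7.5, -5.0, -2.1]
      print(free_energy(poses))   # slightly below the lowest pose energy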

  12. Non-Residential Father-Child Involvement, Interparental Conflict and Mental Health of Children Following Divorce: A Person-Focused Approach.

    PubMed

    Elam, Kit K; Sandler, Irwin; Wolchik, Sharlene; Tein, Jenn-Yun

    2016-03-01

    Variable-centered research has found complex relationships between child well-being and two critical aspects of the post-divorce family environment: the level of non-residential father involvement (i.e., contact and supportive relationship) with their children and the level of conflict between the father and mother. However, these analyses fail to capture individual differences based on distinct patterns of interparental conflict, father support and father contact. Using a person-centered latent profile analysis, the present study examined (1) profiles of non-residential father contact, support, and interparental conflict in the 2 years following divorce (N = 240), when children (49 % female) were between 9 and 12 years of age and (2) differences across profiles in concurrent child adjustment outcomes as well as outcomes 6 years later. Four profiles of father involvement were identified: High Contact-Moderate Conflict-Moderate Support, Low Contact-Moderate Conflict-Low Support, High Conflict-Moderate Contact-Moderate Support, and Low Conflict-Moderate Contact-Moderate Support. Concurrently, children with fathers in the group with high conflict were found to have significantly greater internalizing and externalizing problems compared to all other groups. Six years later, children with fathers in the group with low contact and low support were found to have greater internalizing and externalizing problems compared to children with fathers in the high conflict group, and also greater internalizing problems compared to children with fathers in the low conflict group. These results provide insight into the complex relationship among non-residential fathers' conflict, contact, and support in child adjustment within divorcing families.

  13. Crew collaboration in space: a naturalistic decision-making perspective

    NASA Technical Reports Server (NTRS)

    Orasanu, Judith

    2005-01-01

    Successful long-duration space missions will depend on the ability of crewmembers to respond promptly and effectively to unanticipated problems that arise under highly stressful conditions. Naturalistic decision making (NDM) exploits the knowledge and experience of decision makers in meaningful work domains, especially complex sociotechnical systems, including aviation and space. Decision making in these ambiguous, dynamic, high-risk environments is a complex task that involves defining the nature of the problem and crafting a response to achieve one's goals. Goal conflicts, time pressures, and uncertain outcomes may further complicate the process. This paper reviews theory and research pertaining to the NDM model and traces some of the implications for space crews and other groups that perform meaningful work in extreme environments. It concludes with specific recommendations for preparing exploration crews to use NDM effectively.

  14. Sensitivity based coupling strengths in complex engineering systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, C. L.; Sobieszczanski-Sobieski, J.

    1993-01-01

    The iterative design scheme necessary for complex engineering systems is generally time consuming and difficult to implement. Although a decomposition approach results in a more tractable problem, the inherent couplings make establishing the interdependencies of the various subsystems difficult. Another difficulty lies in identifying the most efficient order of execution for the subsystem analyses. The paper describes an approach for determining the dependencies that could be suspended during the system analysis with minimal accuracy losses, thereby reducing the system complexity. A new multidisciplinary testbed is presented, involving the interaction of structures, aerodynamics, and performance disciplines. Results are presented to demonstrate the effectiveness of the system reduction scheme.

  15. Vibronic Boson Sampling: Generalized Gaussian Boson Sampling for Molecular Vibronic Spectra at Finite Temperature.

    PubMed

    Huh, Joonsuk; Yung, Man-Hong

    2017-08-07

    Molecular vibronic spectroscopy, where the transitions involve non-trivial Bosonic correlation due to the Duschinsky Rotation, is strongly believed to be in a similar complexity class as Boson Sampling. At finite temperature, the problem is represented as a Boson Sampling experiment with correlated Gaussian input states. This molecular problem with temperature effect is intimately related to the various versions of Boson Sampling, which share a similar computational complexity. Here we provide a full description of this relation in the context of Gaussian Boson Sampling. We find a hierarchical structure, which illustrates the relationship among various Boson Sampling schemes. Specifically, we show that every instance of Gaussian Boson Sampling with an initial correlation can be simulated by an instance of Gaussian Boson Sampling without initial correlation, with only a polynomial overhead. Since every Gaussian state is associated with a thermal state, our result implies that every sampling problem in molecular vibronic transitions, at any temperature, can be simulated by Gaussian Boson Sampling associated with a product of vacuum modes. We refer to such a generalized Gaussian Boson Sampling, motivated by the molecular sampling problem, as Vibronic Boson Sampling.

  16. Dynamic pathway modeling of signal transduction networks: a domain-oriented approach.

    PubMed

    Conzelmann, Holger; Gilles, Ernst-Dieter

    2008-01-01

    Mathematical models of biological processes become more and more important in biology. The aim is a holistic understanding of how processes such as cellular communication, cell division, regulation, homeostasis, or adaptation work, how they are regulated, and how they react to perturbations. The great complexity of most of these processes necessitates the generation of mathematical models in order to address these questions. In this chapter we provide an introduction to basic principles of dynamic modeling and highlight both the problems and the opportunities of dynamic modeling in biology. The main focus will be on modeling of signal transduction pathways, which requires the application of a special modeling approach. A common pattern, especially in eukaryotic signaling systems, is the formation of multi-protein signaling complexes. Even for a small number of interacting proteins the number of distinguishable molecular species can be extremely high. This combinatorial complexity is due to the great number of distinct binding domains of many receptors and scaffold proteins involved in signal transduction. However, these problems can be overcome using a new domain-oriented modeling approach, which makes it possible to handle complex and branched signaling pathways.

  17. Weak imposition of frictionless contact constraints on automatically recovered high-order, embedded interfaces using the finite cell method

    NASA Astrophysics Data System (ADS)

    Bog, Tino; Zander, Nils; Kollmannsberger, Stefan; Rank, Ernst

    2018-04-01

    The finite cell method (FCM) is a fictitious domain approach that greatly simplifies simulations involving complex structures. Recently, the FCM has been applied to contact problems. The current study continues in this field by extending the concept of weakly enforced boundary conditions to inequality constraints for frictionless contact. Furthermore, it formalizes an approach that automatically recovers high-order contact surfaces of (implicitly defined) embedded geometries by means of an extended Marching Cubes algorithm. To further improve the accuracy of the discretization, irregularities at the boundary of contact zones are treated with multi-level hp-refinements. Numerical results and a systematic study of h-, p- and hp-refinements show that the FCM can efficiently provide accurate results for problems involving contact.

  18. Shooting method for solution of boundary-layer flows with massive blowing

    NASA Technical Reports Server (NTRS)

    Liu, T.-M.; Nachtsheim, P. R.

    1973-01-01

    A modified, bidirectional shooting method is presented for solving boundary-layer equations under conditions of massive blowing. Unlike the conventional shooting method, which is unstable when the blowing rate increases, the proposed method avoids the unstable direction and is capable of solving complex boundary-layer problems involving mass and energy balance on the surface.
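
    A generic shooting-method sketch (not the modified bidirectional scheme of the paper, and a toy problem rather than a boundary-layer flow): the unknown initial slope of a boundary-value problem is treated as a root-finding variable, here solved by bisection for y'' = -y with y(0) = 0, y(1) = 1.

      def integrate(slope0, n=1000):
          """March y'' = -y from x=0 with y(0)=0, y'(0)=slope0 (RK4); return y(1)."""
          h = 1.0 / n
          y, v = 0.0, slope0
          for _ in range(n):
              # RK4 for the first-order system y' = v, v' = -y
              k1y, k1v = v, -y
              k2y, k2v = v + 0.5*h*k1v, -(y + 0.5*h*k1y)
              k3y, k3v = v + 0.5*h*k2v, -(y + 0.5*h*k2y)
              k4y, k4v = v + h*k3v, -(y + h*k3y)
              y += h*(k1y + 2*k2y + 2*k3y + k4y)/6.0
              v += h*(k1v + 2*k2v + 2*k3v + k4v)/6.0
          return y

      def shoot(target=1.0, lo=0.0, hi=5.0, tol=1e-10):
          """Bisect on the initial slope until the far boundary condition is met."""
          f_lo = integrate(lo) - target
          for _ in range(100):
              mid = 0.5*(lo + hi)
              f_mid = integrate(mid) - target
              if f_lo * f_mid <= 0.0:
                  hi = mid
              else:
                  lo, f_lo = mid, f_mid
              if hi - lo < tol:
                  break
          return 0.5*(lo + hi)

      # Exact answer: y = sin(x)/sin(1), so y'(0) = 1/sin(1) ≈ 1.1884
      print(shoot())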

  19. Reconsidering Social Science Theories in Natural Resource Management Continuing Professional Education

    ERIC Educational Resources Information Center

    Stummann, C. B.; Gamborg, C.

    2014-01-01

    Over 25 years ago, the "wicked problems" concept was introduced into forestry to describe the increasingly complex work situations faced by many natural resource management (NRM) professionals and at the same time the demand and frequency of public involvement in NRM issues also grew. Research on the impact of these changes for NRM…

  20. The Effects of a Participatory Action Research Intervention on Middle School Students' Understanding of Schools, Culture and Place

    ERIC Educational Resources Information Center

    Cassie, Jonathan Martin

    2011-01-01

    School violence is a complex cultural problem that affects most schools. This study used a participatory action research model involving mapmaking, photography and intercultural grouping to understand how one school's physical environment and social geography contributed to interethnic tensions on campus. The study found that mapmaking allowed…

  1. The Rules Grid: Helping Children with Social Communication and Interaction Needs Manage Social Complexity

    ERIC Educational Resources Information Center

    Devlin, Niall

    2009-01-01

    This article introduces a new practical visual approach, the Rules Grid, to support children who have social communication and interaction needs. The Rules Grid involves a system whereby behaviours of concern can be broken down into smaller behavioural manifestations which in turn lead not only to problem identification and specification, but…

  2. Interceptive Skills in Children Aged 9-11 Years, Diagnosed with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Whyatt, Caroline; Craig, Cathy M.

    2013-01-01

    Growing evidence suggests that significant motor problems are associated with a diagnosis of Autism Spectrum Disorders (ASD), particularly in catching tasks. Catching is a complex, dynamic skill that involves the ability to synchronise one's own movement to that of a moving target. To successfully complete the task, the participant must pick up…

  3. Working Together Differently: Addressing the Housing Crisis in Oregon

    ERIC Educational Resources Information Center

    Ramaley, Judith A.

    2017-01-01

    Universities are being asked to prepare our students to navigate successfully in a complex and interconnected world and to contribute to the solution of difficult problems at work and in the communities where they live. Our universities must do the same. We must adapt our approaches to education, scholarship and community involvement in order to…

  4. Teaching and Assessing Tag Rugby Made Simple

    ERIC Educational Resources Information Center

    Harvey, Stephen; Hughes, Christopher

    2009-01-01

    The game of rugby is a fast and fluid invasion game, similar to football, that involves scoring with an oval ball into an end zone. The game presents, like other invasion games, a series of highly complex tactical problems so that the ball can be maneuvered into a scoring position. Pugh and Alford (2004) recently indicated that rugby is now…

  5. "If You Brave Enough to Live It, the Least I Can Do Is Listen": Overcoming the Consequences of Complex Trauma

    ERIC Educational Resources Information Center

    Hudson, Lucy; Beilke, Sarah; Many, Michele

    2016-01-01

    Too many parents who find themselves involved with child welfare agencies have had lives threaded with deeply traumatic events. As adults, their childhood histories manifest themselves in substance abuse, domestic violence, relational problems, risk-taking behaviors, emotional lability, self-harming, anxiety, and depression. To successfully…

  6. Photography as a Means of Narrowing the Gap between Physics and Students

    ERIC Educational Resources Information Center

    Bagno, Esther; Eylon, Bat-Sheva; Levy, Smadar

    2007-01-01

    Many teachers would agree that not all their A-level students appreciate the beauty of physics or enjoy solving complex problems. In this article, we describe a photo-contest activity aimed at narrowing the gap between physics and students. The photo contest, involving both students and teachers, is guided by the National Center of Physics…

  7. Privacy and the First Amendment. Freedom of Information Foundation Series No. 5.

    ERIC Educational Resources Information Center

    Clancy, Paul

    Two strong constitutional principles--the right of privacy and freedom of the press--are headed for a major confrontation in the courts. This document explores the complex problems involved in balancing the interests of individuals and of society (the first amendment is a remedy against government, not a weapon against the people). Consideration…

  8. Reflection: A Renewed and Practical Focus for an Existing Problem in Teacher Education

    ERIC Educational Resources Information Center

    Roberts, Pauline

    2016-01-01

    Reflection has been a component of teacher education programs for many years. The introduction of the Early Years Learning Framework (EYLF) and the National Quality Standard (NQS) into Western Australian schools appear to have brought a renewed focus to this. For universities involved in teacher education, reflection remains a complex construct…

  9. Potential uses of Bayesian networks as tools for synthesis of systematic reviews of complex interventions.

    PubMed

    Stewart, G B; Mengersen, K; Meader, N

    2014-03-01

    Bayesian networks (BNs) are tools for representing expert knowledge or evidence. They are especially useful for synthesising evidence or belief concerning a complex intervention, assessing the sensitivity of outcomes to different situations or contextual frameworks and framing decision problems that involve alternative types of intervention. Bayesian networks are useful extensions to logic maps when initiating a review or to facilitate synthesis and bridge the gap between evidence acquisition and decision-making. Formal elicitation techniques allow development of BNs on the basis of expert opinion. Such applications are useful alternatives to 'empty' reviews, which identify knowledge gaps but fail to support decision-making. Where review evidence exists, it can inform the development of a BN. We illustrate the construction of a BN using a motivating example that demonstrates how BNs can ensure coherence, transparently structure the problem addressed by a complex intervention and assess sensitivity to context, all of which are critical components of robust reviews of complex interventions. We suggest that BNs should be utilised to routinely synthesise reviews of complex interventions or empty reviews where decisions must be made despite poor evidence. Copyright © 2013 John Wiley & Sons, Ltd.
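
    As a minimal, hand-rolled illustration of the kind of structure a BN encodes (a hypothetical two-node network, not the elicited networks discussed in the paper), posterior queries reduce to summing the joint probability over the unobserved variables.

      # Hypothetical BN:  Intervention -> Outcome, with prior P(I) and CPT P(O | I).
      p_intervention = {True: 0.6, False: 0.4}
      p_outcome_given = {True: {True: 0.7, False: 0.3},   # P(O | I=True)
                         False: {True: 0.2, False: 0.8}}  # P(O | I=False)

      def joint(i, o):
          return p_intervention[i] * p_outcome_given[i][o]

      def posterior_intervention_given_outcome(o):
          """P(I=True | O=o) by enumerating the joint distribution."""
          numer = joint(True, o)
          denom = sum(joint(i, o) for i in (True, False))
          return numer / denom

      print(posterior_intervention_given_outcome(True))   # 0.42 / 0.50 = 0.84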

  10. Towards large scale multi-target tracking

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong; Reuter, Stephan; Lam, Quang; Dietmayer, Klaus

    2014-06-01

    Multi-target tracking is intrinsically an NP-hard problem, and the complexity of multi-target tracking solutions usually does not scale gracefully with problem size. Multi-target tracking for on-line applications involving a large number of targets is extremely challenging. This article demonstrates the capability of the random finite set approach to provide large-scale multi-target tracking algorithms. In particular, it is shown that an approximate filter known as the labeled multi-Bernoulli filter can simultaneously track one thousand five hundred targets in clutter on a standard laptop computer.

  11. Phenomenological theory of collective decision-making

    NASA Astrophysics Data System (ADS)

    Zafeiris, Anna; Koman, Zsombor; Mones, Enys; Vicsek, Tamás

    2017-08-01

    An essential task of groups is to provide efficient solutions for the complex problems they face. Indeed, considerable efforts have been devoted to the question of collective decision-making related to problems involving a single dominant feature. Here we introduce a quantitative formalism for finding the optimal distribution of the group members' competences in the more typical case when the underlying problem is complex, i.e., multidimensional. Thus, we consider teams that are aiming at obtaining the best possible answer to a problem having a number of independent sub-problems. Our approach is based on a generic scheme for the process of evaluating the proposed solutions (i.e., negotiation). We demonstrate that the best performing groups have at least one specialist for each sub-problem - but a far less intuitive result is that finding the optimal solution by the interacting group members requires that the specialists also have some insight into the sub-problems beyond their unique field(s). We present empirical results obtained by using a large-scale database of citations being in good agreement with the above theory. The framework we have developed can easily be adapted to a variety of realistic situations since taking into account the weights of the sub-problems, the opinions or the relations of the group is straightforward. Consequently, our method can be used in several contexts, especially when the optimal composition of a group of decision-makers is designed.

  12. Computer-based training for improving mental calculation in third- and fifth-graders.

    PubMed

    Caviola, Sara; Gerotto, Giulia; Mammarella, Irene C

    2016-11-01

    The literature on intervention programs to improve arithmetical abilities is fragmentary and few studies have examined training on the symbolic representation of numbers (i.e. Arabic digits). In the present research, three groups of 3rd- and 5th-grade schoolchildren were given training on mental additions: 76 were assigned to a computer-based strategic training (ST) group, 73 to a process-based training (PBT) group, and 71 to a passive control (PC) group. Before and after the training, the children were given a criterion task involving complex addition problems, a nearest transfer task on complex subtraction problems, two near transfer tasks on math fluency, and a far transfer task on numerical reasoning. Our results showed developmental differences: 3rd-graders benefited more from the ST, with transfer effects on subtraction problems and math fluency, while 5th-graders benefited more from the PBT, improving their response times in the criterion task. Developmental, clinical and educational implications of these findings are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. [New approaches in pharmacology: numerical modelling and simulation].

    PubMed

    Boissel, Jean-Pierre; Cucherat, Michel; Nony, Patrice; Dronne, Marie-Aimée; Kassaï, Behrouz; Chabaud, Sylvie

    2005-01-01

    The complexity of pathophysiological mechanisms is beyond the capabilities of traditional approaches. Many of the decision-making problems in public health, such as initiating mass screening, are complex. Progress in genomics and proteomics, and the resulting extraordinary increase in knowledge with regard to interactions between gene expression, the environment and behaviour, the customisation of risk factors and the need to combine therapies that individually have minimal though well documented efficacy, has led doctors to raise new questions: how to optimise choice and the application of therapeutic strategies at the individual rather than the group level, while taking into account all the available evidence? This is essentially a problem of complexity with dimensions similar to the previous ones: multiple parameters with nonlinear relationships between them, varying time scales that cannot be ignored etc. Numerical modelling and simulation (in silico investigations) have the potential to meet these challenges. Such approaches are considered in drug innovation and development. They require a multidisciplinary approach, and this will involve modification of the way research in pharmacology is conducted.

  14. Object oriented development of engineering software using CLIPS

    NASA Technical Reports Server (NTRS)

    Yoon, C. John

    1991-01-01

    Engineering applications involve numeric complexity and manipulations of a large amount of data. Traditionally, numeric computation has been the concern in developing an engineering software. As engineering application software became larger and more complex, management of resources such as data, rather than the numeric complexity, has become the major software design problem. Object oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language's (COOL) object oriented features are more versatile than C++'s. A software design methodology based on object oriented and procedural approaches appropriate for engineering software, and to be implemented in CLIPS was outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.

  15. Dynamics of liquids, molecules, and proteins measured with ultrafast 2D IR vibrational echo chemical exchange spectroscopy.

    PubMed

    Fayer, M D

    2009-01-01

    A wide variety of molecular systems undergo fast structural changes under thermal equilibrium conditions. Such transformations are involved in a vast array of chemical problems. Experimentally measuring equilibrium dynamics is a challenging problem that is at the forefront of chemical research. This review describes ultrafast 2D IR vibrational echo chemical exchange experiments and applies them to several types of molecular systems. The formation and dissociation of organic solute-solvent complexes are directly observed. The dissociation times of 13 complexes, ranging from 4 ps to 140 ps, are shown to obey a relationship that depends on the complex's formation enthalpy. The rate of rotational gauche-trans isomerization around a carbon-carbon single bond is determined for a substituted ethane at room temperature in a low viscosity solvent. The results are used to obtain an approximate isomerization rate for ethane. Finally, the time dependence of a well-defined single structural transformation of a protein is measured.

  16. Observed behaviours of pre-term children in a social play situation with classroom peers.

    PubMed

    Nadeau, Line; Tessier, Réjean; Descôteaux, Amélie

    2009-08-01

    A number of studies have reported social adjustment problems in pre-term children. The aim was to observe pre-term children's behaviour in an experimental situation and to correlate the observed behaviours with the children's peer-rated social behaviours (withdrawal, aggression and sociability/leadership). Of 56 pre-term children, 24 were classified as the sick pre-term (SPT) group and 32 children as the healthy pre-term (HPT) group. The comparison group comprised 56 healthy full-terms. The experimental situation used a game called Rush Hour, a labyrinth-type board game. The play situation was videotaped and behaviours (number of consecutive moves) were coded in real time. At 12 years of age, the sick pre-term (SPT) group exhibited fewer consecutive moves during the game than the other two groups, especially when the task became more complex (involving four consecutive moves). Moreover, the Complex Task Index was correlated with the social withdrawal score rated by peers. The children who had been sick as pre-term infants gradually became less involved in the complex decision-making task, which was interpreted as a lesser ability to make decisions in a complex setting.

  17. The Effect of Normalization in Violence Video Classification Performance

    NASA Astrophysics Data System (ADS)

    Ali, Ashikin; Senan, Norhalina

    2017-08-01

    Data pre-processing is an important part of data mining, and normalization is a pre-processing stage for almost any problem, especially in video classification. Video classification is challenging because of heterogeneous content, large variations in video quality, and the complex semantic meanings of the concepts involved. A thorough pre-processing stage that includes normalization therefore helps the robustness of classification performance. Normalization scales all numeric variables into a given range so that they are more meaningful for the later phases of the data mining techniques used. This paper examines the effect of two normalization techniques, Min-Max normalization and Z-score, on violence video classification, measuring classification rate with a Multi-layer Perceptron (MLP) classifier. With Min-Max normalization to the range [0,1] the accuracy is almost 98%, with Min-Max normalization to the range [-1,1] it is 59%, and with Z-score it is 50%.
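
    A minimal sketch of the two normalization schemes compared above, applied to a generic numeric feature matrix (not the violence-video features themselves).

      import numpy as np

      def min_max(x, lo=0.0, hi=1.0):
          """Rescale each feature (column) linearly into [lo, hi]."""
          x_min, x_max = x.min(axis=0), x.max(axis=0)
          return lo + (x - x_min) * (hi - lo) / (x_max - x_min)

      def z_score(x):
          """Standardize each feature to zero mean and unit standard deviation."""
          return (x - x.mean(axis=0)) / x.std(axis=0)

      features = np.array([[10.0, 200.0], [12.0, 180.0], [8.0, 260.0]])
      print(min_max(features))             # range [0, 1]
      print(min_max(features, -1.0, 1.0))  # range [-1, 1]
      print(z_score(features))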

  18. Computation and visualization of geometric partial differential equations

    NASA Astrophysics Data System (ADS)

    Tiee, Christopher L.

    The chief goal of this work is to explore a modern framework for the study and approximation of partial differential equations, recast common partial differential equations into this framework, and prove theorems about such equations and their approximations. A central motivation is to recognize and respect the essential geometric nature of such problems, and take it into consideration when approximating. The hope is that this process will lead to the discovery of more refined algorithms and processes and to their application to new problems. In the first part, we introduce our quantities of interest and reformulate traditional boundary value problems in the modern framework. We see how Hilbert complexes capture and abstract the most important properties of such boundary value problems, leading to generalizations of important classical results such as the Hodge decomposition theorem. They also provide the proper setting for numerical approximations. We also provide an abstract framework for evolution problems in these spaces: Bochner spaces. We next turn to approximation. We build layers of abstraction, progressing from functions, to differential forms, and finally, to Hilbert complexes. We explore finite element exterior calculus (FEEC), which allows us to approximate solutions involving differential forms, and analyze the approximation error. In the second part, we prove our central results. We first prove an extension of current error estimates for the elliptic problem in Hilbert complexes. This extension handles solutions with nonzero harmonic part. Next, we consider evolution problems in Hilbert complexes and prove abstract error estimates. We apply these estimates to the problem for Riemannian hypersurfaces in R^{n+1}, generalizing current results for open subsets of R^n. Finally, we apply some of the concepts to a nonlinear problem, the Ricci flow on surfaces, and use tools from nonlinear analysis to help develop and analyze the equations. In the appendices, we detail some additional motivation and a source for further examples: canonical geometries that are realized as steady-state solutions to parabolic equations similar to that of Ricci flow. An eventual goal is to compute such solutions using the methods of the previous chapters.

  19. Siderophore-drug complexes: potential medicinal applications of the 'Trojan horse' strategy.

    PubMed

    Górska, Agnieszka; Sloderbach, Anna; Marszałł, Michał Piotr

    2014-09-01

    The ability of bacteria to develop resistance to antimicrobial agents poses problems in the treatment of numerous bacterial infections. One method to circumvent permeability-mediated drug resistance involves the employment of the 'Trojan horse' strategy. The Trojan horse concept involves the use of bacterial iron uptake systems to enter and kill bacteria. The siderophore-drug complex is recognized by specific siderophore receptors and is then actively transported across the outer membrane. The recently identified benefits of this strategy have led to the synthesis of a series of siderophore-based antibiotics. Several studies have shown that siderophore-drug conjugates make it possible to design antibiotics with improved cell transport and reduce the frequency of resistance mutants. Growing interest in siderophore-drug conjugates for the treatment of human diseases including iron overload, cancer, and malaria has driven the search for new siderophore-drug complexes. This strategy may have special importance for the development of iron oxide nanoparticle-based therapeutics. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. The search for a hippocampal engram.

    PubMed

    Mayford, Mark

    2014-01-05

    Understanding the molecular and cellular changes that underlie memory, the engram, requires the identification, isolation and manipulation of the neurons involved. This presents a major difficulty for complex forms of memory, for example hippocampus-dependent declarative memory, where the participating neurons are likely to be sparse, anatomically distributed and unique to each individual brain and learning event. In this paper, I discuss several new approaches to this problem. In vivo calcium imaging techniques provide a means of assessing the activity patterns of large numbers of neurons over long periods of time with precise anatomical identification. This provides important insight into how the brain represents complex information and how this is altered with learning. The development of techniques for the genetic modification of neural ensembles based on their natural, sensory-evoked, activity along with optogenetics allows direct tests of the coding function of these ensembles. These approaches provide a new methodological framework in which to examine the mechanisms of complex forms of learning at the level of the neurons involved in a specific memory.

  1. The search for a hippocampal engram

    PubMed Central

    Mayford, Mark

    2014-01-01

    Understanding the molecular and cellular changes that underlie memory, the engram, requires the identification, isolation and manipulation of the neurons involved. This presents a major difficulty for complex forms of memory, for example hippocampus-dependent declarative memory, where the participating neurons are likely to be sparse, anatomically distributed and unique to each individual brain and learning event. In this paper, I discuss several new approaches to this problem. In vivo calcium imaging techniques provide a means of assessing the activity patterns of large numbers of neurons over long periods of time with precise anatomical identification. This provides important insight into how the brain represents complex information and how this is altered with learning. The development of techniques for the genetic modification of neural ensembles based on their natural, sensory-evoked, activity along with optogenetics allows direct tests of the coding function of these ensembles. These approaches provide a new methodological framework in which to examine the mechanisms of complex forms of learning at the level of the neurons involved in a specific memory. PMID:24298162

  2. Environmental problem-solving: Psychosocial factors

    NASA Astrophysics Data System (ADS)

    Miller, Alan

    1982-11-01

    This is a study of individual differences in environmental problem-solving, the probable roots of these differences, and their implications for the education of resource professionals. A group of student Resource Managers were required to elaborate their conception of a complex resource issue (Spruce Budworm management) and to generate some ideas on management policy. Of particular interest was the way in which subjects dealt with the psychosocial aspects of the problem. A structural and content analysis of responses indicated a predominance of relatively compartmentalized styles, a technological orientation, and a tendency to ignore psychosocial issues. A relationship between problem-solving behavior and personal (psychosocial) style was established which, in the context of other evidence, suggests that problem-solving behavior is influenced by more deep seated personality factors. The educational implication drawn was that problem-solving cannot be viewed simply as an intellectual-technical activity but one that involves, and requires the education of, the whole person.

  3. Solving a class of generalized fractional programming problems using the feasibility of linear programs.

    PubMed

    Shen, Peiping; Zhang, Tongli; Wang, Chunfeng

    2017-01-01

    This article presents a new approximation algorithm for globally solving a class of generalized fractional programming problems (P) whose objective functions are defined as an appropriate composition of ratios of affine functions. To solve this problem, the algorithm solves an equivalent optimization problem (Q) via an exploration of a suitably defined nonuniform grid. The main work of the algorithm involves checking the feasibility of linear programs associated with the grid points of interest. Based on a computational complexity analysis, it is proved that the proposed algorithm is a fully polynomial time approximation scheme when the number of ratio terms in the objective function of problem (P) is fixed. In contrast to existing results in the literature, the algorithm does not require assumptions of quasi-concavity or low rank on the objective function of problem (P). Numerical results are given to illustrate the feasibility and effectiveness of the proposed algorithm.
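
    As a rough illustration of the feasibility subroutine mentioned above, the sketch below checks whether a small linear program is feasible using SciPy's linprog; the constraint data are placeholders, and the paper's grid construction and actual LPs are not reproduced here.

```python
# Hedged sketch: LP feasibility check of the kind repeated at each grid point.
# The constraint matrices below are placeholders for illustration only.
import numpy as np
from scipy.optimize import linprog

def lp_is_feasible(A_ub, b_ub, bounds=None):
    """Feasibility test: minimize the zero objective subject to A_ub x <= b_ub."""
    c = np.zeros(A_ub.shape[1])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.status == 0  # 0 = solved (feasible), 2 = infeasible

A = np.array([[1.0, 1.0], [-1.0, 0.0]])
b = np.array([1.0, 0.0])
print(lp_is_feasible(A, b))  # True: e.g. x = (0.5, 0.5) satisfies both constraints
```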

  4. Disentangling Complexity in Bayesian Automatic Adaptive Quadrature

    NASA Astrophysics Data System (ADS)

    Adam, Gheorghe; Adam, Sanda

    2018-02-01

    The paper describes a Bayesian automatic adaptive quadrature (BAAQ) solution for numerical integration which is simultaneously robust, reliable, and efficient. Detailed discussion is provided of three main factors which contribute to the enhancement of these features: (1) refinement of the m-panel automatic adaptive scheme through the use of integration-domain-length-scale-adapted quadrature sums; (2) fast early problem complexity assessment - enables the non-transitive choice among three execution paths: (i) immediate termination (exceptional cases); (ii) pessimistic - involves time and resource consuming Bayesian inference resulting in radical reformulation of the problem to be solved; (iii) optimistic - asks exclusively for subrange subdivision by bisection; (3) use of the weaker accuracy target from the two possible ones (the input accuracy specifications and the intrinsic integrand properties respectively) - results in maximum possible solution accuracy under minimum possible computing time.
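
    The "optimistic" execution path above reduces to subrange subdivision by bisection. The following sketch shows plain adaptive quadrature by interval bisection (adaptive Simpson); it is an illustration of that path only, under the stated simplifications, and does not implement the Bayesian inference or complexity-assessment stages of BAAQ.

```python
# Illustrative sketch only: adaptive quadrature by interval bisection.
# It does not implement the Bayesian machinery of BAAQ.
def adaptive_simpson(f, a, b, tol=1e-8, depth=30):
    """Recursively bisect [a, b] until the local Simpson error estimate
    falls below the requested tolerance."""
    def simpson(lo, hi):
        mid = 0.5 * (lo + hi)
        return (hi - lo) / 6.0 * (f(lo) + 4.0 * f(mid) + f(hi)), mid

    whole, m = simpson(a, b)
    left, _ = simpson(a, m)
    right, _ = simpson(m, b)
    # Standard error estimate comparing one Simpson panel against two half panels.
    if depth == 0 or abs(left + right - whole) < 15.0 * tol:
        return left + right + (left + right - whole) / 15.0
    return (adaptive_simpson(f, a, m, tol / 2.0, depth - 1) +
            adaptive_simpson(f, m, b, tol / 2.0, depth - 1))

if __name__ == "__main__":
    import math
    print(adaptive_simpson(math.sin, 0.0, math.pi))  # ~2.0
```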

  5. An Ensemble Framework Coping with Instability in the Gene Selection Process.

    PubMed

    Castellanos-Garzón, José A; Ramos, Juan; López-Sánchez, Daniel; de Paz, Juan F; Corchado, Juan M

    2018-03-01

    This paper proposes an ensemble framework for gene selection, which is aimed at addressing instability problems presented in the gene filtering task. The complex process of gene selection from gene expression data faces different instability problems from the informative gene subsets found by different filter methods. This makes the identification of significant genes by the experts difficult. The instability of results can come from filter methods, gene classifier methods, different datasets of the same disease and multiple valid groups of biomarkers. Even though there is a large number of proposals, the complexity imposed by this problem remains a challenge today. This work proposes a framework involving five stages of gene filtering to discover biomarkers for diagnosis and classification tasks. This framework performs a process of stable feature selection, facing the problems above and, thus, providing a more suitable and reliable solution for clinical and research purposes. Our proposal involves a process of multistage gene filtering, in which several ensemble strategies for gene selection were added in such a way that different classifiers simultaneously assess gene subsets to face instability. Firstly, we apply an ensemble of recent gene selection methods to obtain diversity in the genes found (stability according to filter methods). Next, we apply an ensemble of known classifiers to filter genes relevant to all classifiers at a time (stability according to classification methods). The results were evaluated on two different datasets of the same disease (pancreatic ductal adenocarcinoma), in search of stability according to the disease, and promising results were achieved.

  6. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  7. The Efficacy and Development of Students' Problem-Solving Strategies During Compulsory Schooling: Logfile Analyses

    PubMed Central

    Molnár, Gyöngyvér; Csapó, Benő

    2018-01-01

    The purpose of this study was to examine the role of exploration strategies students used in the first phase of problem solving. The sample for the study was drawn from 3rd- to 12th-grade students (aged 9–18) in Hungarian schools (n = 4,371). Problems designed in the MicroDYN approach with different levels of complexity were administered to the students via the eDia online platform. Logfile analyses were performed to ascertain the impact of strategy use on the efficacy of problem solving. Students' exploration behavior was coded and clustered through Latent Class Analyses. Several theoretically effective strategies were identified, including the vary-one-thing-at-a-time (VOTAT) strategy and its sub-strategies. The results of the analyses indicate that the use of a theoretically effective strategy, which extracts all the information required to solve the problem, did not always lead to high performance. Conscious VOTAT strategy users proved to be the best problem solvers followed by non-conscious VOTAT strategy users and non-VOTAT strategy users. In the primary school sub-sample, six qualitatively different strategy class profiles were distinguished. The results shed new light on and provide a new interpretation of previous analyses of the processes involved in complex problem solving. They also highlight the importance of explicit enhancement of problem-solving skills and problem-solving strategies as a tool for knowledge acquisition in new contexts during and beyond school lessons. PMID:29593606

  8. The Efficacy and Development of Students' Problem-Solving Strategies During Compulsory Schooling: Logfile Analyses.

    PubMed

    Molnár, Gyöngyvér; Csapó, Benő

    2018-01-01

    The purpose of this study was to examine the role of exploration strategies students used in the first phase of problem solving. The sample for the study was drawn from 3rd- to 12th-grade students (aged 9-18) in Hungarian schools (n = 4,371). Problems designed in the MicroDYN approach with different levels of complexity were administered to the students via the eDia online platform. Logfile analyses were performed to ascertain the impact of strategy use on the efficacy of problem solving. Students' exploration behavior was coded and clustered through Latent Class Analyses. Several theoretically effective strategies were identified, including the vary-one-thing-at-a-time (VOTAT) strategy and its sub-strategies. The results of the analyses indicate that the use of a theoretically effective strategy, which extracts all the information required to solve the problem, did not always lead to high performance. Conscious VOTAT strategy users proved to be the best problem solvers followed by non-conscious VOTAT strategy users and non-VOTAT strategy users. In the primary school sub-sample, six qualitatively different strategy class profiles were distinguished. The results shed new light on and provide a new interpretation of previous analyses of the processes involved in complex problem solving. They also highlight the importance of explicit enhancement of problem-solving skills and problem-solving strategies as a tool for knowledge acquisition in new contexts during and beyond school lessons.
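
    To make the vary-one-thing-at-a-time idea concrete, the sketch below codes a hypothetical exploration log as VOTAT when every round changes at most one input relative to the previous one; the log format and criterion are illustrative assumptions, not the coding scheme used in the study.

```python
# Hedged illustration: one simple way to flag a VOTAT-style exploration pattern.
# The log format and the "at most one change per round" rule are assumptions.
def is_votat(trials):
    """trials: list of input-setting dicts, one per exploration round.
    Returns True if every round changes at most one input relative to the
    previous round (the classic VOTAT pattern)."""
    for prev, curr in zip(trials, trials[1:]):
        changed = sum(1 for k in curr if curr[k] != prev.get(k))
        if changed > 1:
            return False
    return True

log = [
    {"A": 0, "B": 0, "C": 0},
    {"A": 1, "B": 0, "C": 0},  # varies only A
    {"A": 1, "B": 2, "C": 0},  # varies only B
]
print(is_votat(log))  # True
```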

  9. Assessment of PIV-based unsteady load determination of an airfoil with actuated flap

    NASA Astrophysics Data System (ADS)

    Sterenborg, J. J. H. M.; Lindeboom, R. C. J.; Simão Ferreira, C. J.; van Zuijlen, A. H.; Bijl, H.

    2014-02-01

    For complex experimental setups involving movable structures it is not trivial to directly measure unsteady loads. An alternative is to deduce unsteady loads indirectly from measured velocity fields using Noca's method. The ultimate aim is to use this method in future work to determine unsteady loads for fluid-structure interaction problems. The focus in this paper is first on the application and assessment of Noca's method for an airfoil with an oscillating trailing edge flap. To the best of our knowledge, Noca's method has not yet been applied to airfoils with moving control surfaces or fluid-structure interaction problems. In addition, wind tunnel corrections for this type of unsteady flow problem are considered.

  10. Human factors in air traffic control: problems at the interfaces.

    PubMed

    Shouksmith, George

    2003-10-01

    The triangular ISIS model for describing the operation of human factors in complex sociotechnical organisations or systems is applied in this research to a large international air traffic control system. A large sample of senior Air Traffic Controllers were randomly assigned to small focus discussion groups, whose task was to identify problems occurring at the interfaces of the three major human factor components: individual, system impacts, and social. From these discussions, a number of significant interface problems, which could adversely affect the functioning of the Air Traffic Control System, emerged. The majority of these occurred at the Individual-System Impact and Individual-Social interfaces and involved a perceived need for further interface centered training.

  11. Already at the Table: Patterns of Play and Gambling Involvement Prior to Gambling Expansion.

    PubMed

    Nelson, Sarah E; LaPlante, Debi A; Gray, Heather M; Tom, Matthew A; Kleschinsky, John H; Shaffer, Howard J

    2018-03-01

    During 2011, the Governor of Massachusetts signed a bill to allow casino gambling in the state (Commonwealth of Massachusetts 2011). As a result, two resort casinos will begin operations during 2018 and 2019; a smaller slots parlor began operations during June 2015. Prior to this expansion, gambling was widely available in Massachusetts, through the state lottery, off-track betting, and gambling opportunities available in neighboring states. Within this context, it is important to understand the patterns of gambling involvement in the population prior to gambling expansion. The current study examined gambling involvement, patterns of play, and gambling-related problems prior to gambling expansion among a sample of 511 Massachusetts residents who were members of a statewide Internet panel. To measure patterns of play, we asked questions about past-year games played and frequency of play. To measure breadth of involvement, we assessed the number of different games played. To measure depth of involvement, we measured time spent gambling, amount wagered, and amount won or lost. Principal component analysis revealed four play pattern components accounting for more than 50% of the variance in game play frequency. Multiple regression analyses revealed that component scores composed of casino gambling and skill-based gambling (e.g., poker, sports) variables uniquely contributed to the prediction of gambling-related problems, even when depth of involvement was controlled. However, the addition of breadth of involvement to the model resulted in a model where no set of variables contributed significantly, suggesting a complex relationship among play patterns, depth, and breadth of involvement. The study established discrete and distinguishable gambling play patterns associated with gambling-related problems and identified groups of individuals potentially vulnerable to the effects of gambling expansion.
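
    As a hedged illustration of the analysis pipeline described above (principal component analysis of game-play frequencies followed by regression of a problem score on the component scores), the sketch below uses synthetic data and scikit-learn; the variables and numbers are invented for illustration only.

```python
# Hedged sketch with synthetic data, not the study's dataset or exact models.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
freq = rng.poisson(2.0, size=(511, 12))          # past-year frequency of 12 game types (synthetic)
problems = rng.normal(size=511)                  # gambling-problem score (synthetic)

pca = PCA(n_components=4).fit(freq)              # play-pattern components
scores = pca.transform(freq)
model = LinearRegression().fit(scores, problems) # regress problems on component scores
print(pca.explained_variance_ratio_.sum(), model.coef_)
```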

  12. How are things adding up? Neural differences between arithmetic operations are due to general problem solving strategies.

    PubMed

    Tschentscher, Nadja; Hauk, Olaf

    2014-05-15

    A number of previous studies have interpreted differences in brain activation between arithmetic operation types (e.g. addition and multiplication) as evidence in favor of distinct cortical representations, processes or neural systems. It is still not clear how differences in general task complexity contribute to these neural differences. Here, we used a mental arithmetic paradigm to disentangle brain areas related to general problem solving from those involved in operation type specific processes (addition versus multiplication). We orthogonally varied operation type and complexity. Importantly, complexity was defined not only based on surface criteria (for example number size), but also on the basis of individual participants' strategy ratings, which were validated in a detailed behavioral analysis. We replicated previously reported operation type effects in our analyses based on surface criteria. However, these effects vanished when controlling for individual strategies. Instead, procedural strategies contrasted with memory retrieval reliably activated fronto-parietal and motor regions, while retrieval strategies activated parietal cortices. This challenges views that operation types rely on partially different neural systems, and suggests that previously reported differences between operation types may have emerged due to invalid measures of complexity. We conclude that mental arithmetic is a powerful paradigm to study brain networks of abstract problem solving, as long as individual participants' strategies are taken into account. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. The Open Method of Coordination and the Implementation of the Bologna Process

    ERIC Educational Resources Information Center

    Veiga, Amelia; Amaral, Alberto

    2006-01-01

    In this paper the authors argue that the use of the Open Method of Coordination (OMC) in the implementation of the Bologna process presents coordination problems that do not allow for the full coherence of the results. As the process is quite complex, involving three different levels (European, national and local) and as the final actors in the…

  14. Suicide prevention: increasing education and awareness.

    PubMed

    Grandin, L D; Yan, L J; Gray, S M; Jamison, K R; Sachs, G S

    2001-01-01

    Suicide is a serious and complex public health problem. Health care providers, including both psychiatrists and primary care physicians, are just beginning to understand the intricacies involved in suicide and its prevention. Suicide rates continue to rise, making the education of the public and physicians regarding awareness and prevention, recognition of a wide range of risk factors, and research into suicide prevention strategies very important.

  15. Changes in Transferable Knowledge Resulting from Study in a Graduate Software Engineering Curriculum

    ERIC Educational Resources Information Center

    Bareiss, Ray; Sedano, Todd; Katz, Edward

    2012-01-01

    This paper presents the initial results of a study of the evolution of students' knowledge of software engineering from the beginning to the end of a master's degree curriculum in software engineering. Students were presented with a problem involving the initiation of a complex new project at the beginning of the program and again at the end of…

  16. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives in bio-analytical chemistry involve applying tools to relevant medical and biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields, in order to extract and isolate the analytes of interest from complex biological matrices, is another primary objective of analytical chemistry. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Therefore, in view of the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The review reveals that implementing a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil is effective, as it obviates problems related to tedious separation techniques arising from protein binding and severe interferences from the complex matrices of real samples such as blood plasma and serum.

  17. Non-Residential Father-Child Involvement, Interparental Conflict and Mental Health of Children Following Divorce: A Person-Focused Approach

    PubMed Central

    Elam, Kit K.; Sandler, Irwin; Wolchik, Sharlene; Tein, Jenn-Yun

    2015-01-01

    Variable-centered research has found complex relationships between child well-being and two critical aspects of the post-divorce family environment: the level of non-residential father involvement (i.e., contact and supportive relationship) with their children and the level of conflict between the father and mother. However, these analyses fail to capture individual differences based on distinct patterns of interparental conflict, father support and father contact. Using a person-centered latent profile analysis, the present study examined (1) profiles of non-residential father contact, support, and interparental conflict in the two years following divorce (N = 240), when children (49% female) were between 9 and 12 years of age and (2) differences across profiles in concurrent child adjustment outcomes as well as outcomes six years later. Four profiles of father involvement were identified: High Contact – Moderate Conflict – Moderate Support, Low Contact – Moderate Conflict – Low Support, High Conflict – Moderate Contact –Moderate Support, and Low Conflict – Moderate Contact – Moderate Support. Concurrently, children with fathers in the group with high conflict were found to have significantly greater internalizing and externalizing problems compared to all other groups. Six years later, children with fathers in the group with low contact and low support were found to have greater internalizing and externalizing problems compared to children with fathers in the high conflict group, and also greater internalizing problems compared to children with fathers in the low conflict group. These results provide insight into the complex relationship among non-residential fathers’ conflict, contact, and support in child adjustment within divorcing families. PMID:26692236

  18. Avenues into Food Planning: A Review of Scholarly Food System Research

    PubMed Central

    Brinkley, Catherine

    2014-01-01

    This review summarizes several avenues of planning inquiry into food systems research, revealing gaps in the literature, allied fields of study and mismatches between scholarly disciplines and the food system life cycle. Planners and scholars in associated fields have identified and defined problems in the food system as ‘wicked’ problems, complex environmental issues that require systemic solutions at the community scale. While food justice scholars have contextualized problem areas, planning scholars have made a broad case for planning involvement in solving these wicked problems while ensuring that the functional and beneficial parts of the food system continue to thrive. This review maps the entry points of scholarly interest in food systems and planning’s contributions to its study, charting a research agenda for the future. PMID:24932131

  19. Stock management in hospital pharmacy using chance-constrained model predictive control.

    PubMed

    Jurado, I; Maestre, J M; Velarde, P; Ocampo-Martinez, C; Fernández, I; Tejera, B Isla; Prado, J R Del

    2016-05-01

    One of the most important problems in the pharmacy department of a hospital is stock management. The clinical need for drugs must be satisfied with a limited workforce while minimizing the use of economic resources. The complexity of the problem resides in the random nature of the drug demand and the multiple constraints that must be taken into account in every decision. In this article, chance-constrained model predictive control is proposed to deal with this problem. The flexibility of model predictive control allows taking into account explicitly the different objectives and constraints involved in the problem while the use of chance constraints provides a trade-off between conservativeness and efficiency. The solution proposed is assessed to study its implementation in two Spanish hospitals. Copyright © 2015 Elsevier Ltd. All rights reserved.
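
    As an illustrative sketch of how a single chance constraint on stock-outs can be reduced to a deterministic safety-stock rule under a Gaussian demand assumption, consider the fragment below; the paper's model predictive control formulation is richer, and the names and numbers here are hypothetical.

```python
# Hedged sketch: deterministic equivalent of one chance constraint under an
# assumed Gaussian demand. Not the paper's MPC model; values are placeholders.
from scipy.stats import norm

def order_quantity(stock, mu_demand, sigma_demand, horizon, eps=0.05):
    """Order enough so that P(no stock-out over the horizon) >= 1 - eps,
    i.e. stock + order >= mu*H + z_{1-eps} * sigma * sqrt(H)."""
    z = norm.ppf(1.0 - eps)
    required = horizon * mu_demand + z * sigma_demand * horizon ** 0.5
    return max(0.0, required - stock)

print(order_quantity(stock=120.0, mu_demand=30.0, sigma_demand=8.0, horizon=5))
```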

  20. Processing power limits social group size: computational evidence for the cognitive costs of sociality

    PubMed Central

    Dávid-Barrett, T.; Dunbar, R. I. M.

    2013-01-01

    Sociality is primarily a coordination problem. However, the social (or communication) complexity hypothesis suggests that the kinds of information that can be acquired and processed may limit the size and/or complexity of social groups that a species can maintain. We use an agent-based model to test the hypothesis that the complexity of information processed influences the computational demands involved. We show that successive increases in the kinds of information processed allow organisms to break through the glass ceilings that otherwise limit the size of social groups: larger groups can only be achieved at the cost of more sophisticated kinds of information processing that are disadvantageous when optimal group size is small. These results simultaneously support both the social brain and the social complexity hypotheses. PMID:23804623

  1. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  2. High-resolution numerical approximation of traffic flow problems with variable lanes and free-flow velocities.

    PubMed

    Zhang, Peng; Liu, Ru-Xun; Wong, S C

    2005-05-01

    This paper develops macroscopic traffic flow models for a highway section with variable lanes and free-flow velocities, that involve spatially varying flux functions. To address this complex physical property, we develop a Riemann solver that derives the exact flux values at the interface of the Riemann problem. Based on this solver, we formulate Godunov-type numerical schemes to solve the traffic flow models. Numerical examples that simulate the traffic flow around a bottleneck that arises from a drop in traffic capacity on the highway section are given to illustrate the efficiency of these schemes.
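
    For orientation, the sketch below applies a standard Godunov update to the basic LWR traffic model with a concave Greenshields flux and uniform road parameters; the paper's actual contribution, a Riemann solver for spatially varying lanes and free-flow speeds, is deliberately not reproduced here.

```python
# Minimal sketch, not the paper's solver: Godunov scheme for the uniform LWR model.
import numpy as np

def greenshields_flux(rho, v_free=1.0, rho_max=1.0):
    return v_free * rho * (1.0 - rho / rho_max)

def godunov_flux(rho_l, rho_r, rho_crit=0.5):
    # Exact Godunov flux for a concave flux with its maximum at rho_crit.
    if rho_l <= rho_r:
        return min(greenshields_flux(rho_l), greenshields_flux(rho_r))
    if rho_l > rho_crit > rho_r:
        return greenshields_flux(rho_crit)
    return max(greenshields_flux(rho_l), greenshields_flux(rho_r))

def step(rho, dx, dt):
    f = np.array([godunov_flux(rho[i], rho[i + 1]) for i in range(len(rho) - 1)])
    rho_new = rho.copy()
    rho_new[1:-1] -= dt / dx * (f[1:] - f[:-1])  # boundary cells kept fixed
    return rho_new

rho = np.where(np.linspace(0.0, 1.0, 100) < 0.5, 0.8, 0.2)  # Riemann initial data
for _ in range(50):
    rho = step(rho, dx=0.01, dt=0.004)  # CFL-safe time step
```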

  3. Enacting science

    NASA Astrophysics Data System (ADS)

    MacDonald, Anthony Leo

    My study examines the development of forms of knowing that arise when students engage in open-ended explorations involving self-directed design and building involving simple materials. It is grounded in an enactivist theoretical perspective on cognition which holds that the creation of action-thought processes for engaging the world is interwoven with the meanings that are constructed for these experiences. A dynamic conception of persons-acting-in-a-setting is fundamental to an enactivist view of cognition. How is understanding enacted in building activity? How does the shape of a problem emerge? How do students enact meaning and understanding when they experience a high degree of physical engagement in building things? What are some characteristics of an enactive learning/teaching environment? My research settings comprise a range of individual, group and classroom engagements of varying lengths over a three and one-half year period. The first research episode involved two grade eight students in an investigation of Paper Towels. The second four month engagement was in a grade nine science class that culminated in the building of a Solar House. The third grade ten episode involved a one month project to build a Mousetrap Powered Car. A fourth Invent a Machine project was conducted in two grade eight science classes taught by the teacher who participated in the Solar House project. Two students were present in three of the four projects. I interviewed one of these students upon completion of his high school physics courses. I found that building is a form of thinking which develops competency in managing complex practical tasks. A triadic relationship of exploration, planning and acting is present. Practical and procedural understandings emerge as students enter and re-enter self-directed problem settings. Thinking patterns depend on the kinds of materials chosen, the ways they are used, and on how students contextualize the problem. Classroom assessment procedures gain complexity and incorporate process components as students become involved in establishing criteria for their work. Contemporary science programs emphasize using performance criteria to evaluate student learning in investigative activity. My study seeks to expand the notion of performance by identifying and portraying essential features of student action-thought.

  4. Complex space monofilar approximation of diffraction currents on a conducting half plane

    NASA Technical Reports Server (NTRS)

    Lindell, I. V.

    1987-01-01

    A simple approximation of the diffraction surface currents on a conducting half plane, due to an incoming plane wave, is obtained with a line current (monofilar) in complex space. When compared to an approximating current at the edge, the diffraction pattern is seen to improve by an order of magnitude for a minimal increase in computational effort. Thus, the inconvenient Fresnel integral functions can be avoided for quick calculations of diffracted fields, and the accuracy is good in directions other than along the half plane. The method can be applied to general problems involving planar metal edges.

  5. The Computational Complexity of the Kakuro Puzzle, Revisited

    NASA Astrophysics Data System (ADS)

    Ruepp, Oliver; Holzer, Markus

    We present a new proof of NP-completeness for the problem of solving instances of the Japanese pencil puzzle Kakuro (also known as Cross-Sum). While the NP-completeness of Kakuro puzzles has been shown before [T. Seta. The complexity of CROSS SUM. IPSJ SIG Notes, AL-84:51-58, 2002], there are still two interesting aspects to our proof: we show NP-completeness for a new variant of Kakuro that has not been investigated before, which improves the aforementioned result. Moreover, some parts of the proof have been generated automatically, using an interesting technique involving SAT solvers.

  6. Reflections and meditations upon complex chromosomal exchanges.

    PubMed

    Savage, John R K

    2002-12-01

    The application of FISH chromosome painting techniques, especially the recent mFISH (and its equivalents) where all 23 human chromosome pairs can be distinguished, has demonstrated that many chromosome-type structural exchanges are much more complicated (involving more "break-rejoins" and arms) than has hitherto been assumed. It is clear that we have been greatly under-estimating the damage produced in chromatin by such agents as ionising radiation. This article gives a brief historical summary of observations leading up to this conclusion, and after outlining some of the problems surrounding the formation of complex chromosome exchanges, speculates about possible solutions currently being proposed.

  7. Demonstration of quantum advantage in machine learning

    NASA Astrophysics Data System (ADS)

    Ristè, Diego; da Silva, Marcus P.; Ryan, Colm A.; Cross, Andrew W.; Córcoles, Antonio D.; Smolin, John A.; Gambetta, Jay M.; Chow, Jerry M.; Johnson, Blake R.

    2017-04-01

    The main promise of quantum computing is to efficiently solve certain problems that are prohibitively expensive for a classical computer. Most problems with a proven quantum advantage involve the repeated use of a black box, or oracle, whose structure encodes the solution. One measure of the algorithmic performance is the query complexity, i.e., the scaling of the number of oracle calls needed to find the solution with a given probability. Few-qubit demonstrations of quantum algorithms, such as Deutsch-Jozsa and Grover, have been implemented across diverse physical systems such as nuclear magnetic resonance, trapped ions, optical systems, and superconducting circuits. However, at the small scale, these problems can already be solved classically with a few oracle queries, limiting the obtained advantage. Here we solve an oracle-based problem, known as learning parity with noise, on a five-qubit superconducting processor. Executing classical and quantum algorithms using the same oracle, we observe a large gap in query count in favor of quantum processing. We find that this gap grows by orders of magnitude as a function of the error rates and the problem size. This result demonstrates that, while complex fault-tolerant architectures will be required for universal quantum computing, a significant quantum advantage already emerges in existing noisy systems.

  8. [Influence of mental rotation of objects on psychophysiological functions of women].

    PubMed

    Chikina, L V; Fedorchuk, S V; Trushina, V A; Ianchuk, P I; Makarchuk, M Iu

    2012-01-01

    Working with computer systems is an integral part of modern life and, in turn, produces nervous-emotional tension. Monitoring the psychophysiological state of workers, in order to preserve their health and support successful performance, and applying rehabilitative measures are therefore pressing problems. It is known that the efficiency of rehabilitative procedures increases when a complex of restorative programs is applied. Our earlier investigation showed that mental rotation can compensate for the consequences of nervous-emotional tension. In the present work we therefore investigated how a complex of spatial tasks developed by us influences the psychophysiological performance of women, for whom psycho-emotional tension from the use of computer technologies is more pronounced and for whom mental rotation is a more demanding task than for men. The complex of spatial tasks included mental rotation of simple objects (letters and digits), mental rotation of complex objects (geometrical figures), and mental rotation of complex objects involving short-term memory. Performing the complex of spatial tasks reduced the time of simple and complex sensorimotor responses, improved short-term memory parameters and brain working capacity, and improved nervous processes. Overall, mental rotation of objects can be recommended as a rehabilitative resource to compensate for the consequences of psycho-emotional strain, in both men and women.

  9. Communication: Correct charge transfer in CT complexes from the Becke'05 density functional

    NASA Astrophysics Data System (ADS)

    Becke, Axel D.; Dale, Stephen G.; Johnson, Erin R.

    2018-06-01

    It has been known for over twenty years that density functionals of the generalized-gradient approximation (GGA) type and exact-exchange-GGA hybrids with low exact-exchange mixing fraction yield enormous errors in the properties of charge-transfer (CT) complexes. Manifestations of this error have also plagued computations of CT excitation energies. GGAs transfer far too much charge in CT complexes. This error has therefore come to be called "delocalization" error. It remains, to this day, a vexing unsolved problem in density-functional theory (DFT). Here we report that a 100% exact-exchange-based density functional known as Becke'05 or "B05" [A. D. Becke, J. Chem. Phys. 119, 2972 (2003); 122, 064101 (2005)] predicts excellent charge transfers in classic CT complexes involving the electron donors NH3, C2H4, HCN, and C2H2 and electron acceptors F2 and Cl2. Our approach is variational, as in our recent "B05min" dipole moments paper [Dale et al., J. Chem. Phys. 147, 154103 (2017)]. Therefore B05 is not only an accurate DFT for thermochemistry but is promising as a solution to the delocalization problem as well.

  10. Social and ethical dimension of the natural sciences, complex problems of the age, interdisciplinarity, and the contribution of education

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2008-09-01

    In view of the complex problems of this age, the question of the socio-ethical dimension of science acquires particular importance. We approach this matter from a philosophical and sociological standpoint, looking at such focal concerns as the motivation, purposes and methods of scientific activity, the ambivalence of scientific research and the concomitant risks, and the conflict between research freedom and external socio-political intervention. We then point out the impediments to the effectiveness of cross-disciplinary or broader meetings for addressing these complex problems and managing the associated risks, given the difficulty in communication between experts in different fields and non-experts, difficulties that education is challenged to help resolve. We find that the social necessity of informed decision-making on the basis of cross-disciplinary collaboration is reflected in the newer curricula, such as that of Greece, in aims like the acquisition of cross-subject knowledge and skills, and the ability to make decisions on controversial issues involving value conflicts. The interest and the reflections of the science education community in these matters increase its—traditionally limited—contribution to the theoretical debate on education and, by extension, the value of science education in the education system.

  11. Coding Theory Information Theory and Radar

    DTIC Science & Technology

    2005-01-01

    the design and synthesis of artificial multiagent systems and for the understanding of human decision-making processes. This... altruism that may exist in a complex society. SGT derives its ability to account simultaneously for both group and individual interests from the structure of ...satisficing decision theory as a model of human decision making. 2 Multi-Attribute Decision Making Many decision problems involve the consideration of

  12. Basic investigation of turbine erosion phenomena

    NASA Technical Reports Server (NTRS)

    Pouchot, W. D.; Kothmann, R. E.; Fentress, W. K.; Heymann, F. J.; Varljen, T. C.; Chi, J. W. H.; Milton, J. D.; Glassmire, C. M.; Kyslinger, J. A.; Desai, K. A.

    1971-01-01

    An analytical-empirical model of turbine erosion is presented that fits and explains experience in both steam and metal vapor turbines. Because of the complexities involved in analyzing turbine erosion problems in a purely scientific sense, this goal can obviously be only partially realized. Therefore, emphasis is placed on providing a useful model for preliminary erosion estimates for given configurations, fluids, and flow conditions.

  13. Methods and costs associated with outfitting light aircraft for remote sensing applications

    NASA Technical Reports Server (NTRS)

    Rhodes, O. L.; Zetka, E. F.

    1973-01-01

    This document was designed to provide the potential user of a light aircraft remote sensor platform/data gathering system with general information on aircraft definition, implementation complexity, costs, scheduling and operational factors involved in this type of activity. Most of the subject material was developed from actual situations and problem areas encountered during the build-up cycle and early phases of flight operations.

  14. Experiments in Natural and Synthetic Dental Materials: A Mouthful of Experiments

    NASA Technical Reports Server (NTRS)

    Masi, James V.

    1996-01-01

    The objectives of these experiments are to show that the area of biomaterials, especially dental materials (natural and synthetic), contains all of the elements of good and bad design, with the caveat that a person's health is directly involved. The students learn the process of designing materials for the complex interactions in the oral cavity, analyze those already used, and suggest possible solutions to the problems involved with present technology. The N.I.O.S.H. Handbook is used extensively by the students and judgement calls are made, even without extensive biology education.

  15. Problems in sickness certification of patients: a qualitative study on views of 26 physicians in Sweden.

    PubMed

    von Knorring, Mia; Sundberg, Linda; Löfgren, Anna; Alexanderson, Kristina

    2008-01-01

    Objective: To identify what problems physicians experience in sickness certification of patients. Design: Qualitative analyses of data from six focus-group discussions. Setting: Four counties in different regions of Sweden. Subjects: Twenty-six physicians strategically selected to achieve variation with regard to sex, geographical location, urban/rural area, and type of clinic. Results: The problems involved four areas: society and the social insurance system, the organization of healthcare, the performance of other actors in the system, and the physicians' working situation. In all areas the problems also involved manager issues such as overall leadership, organization of healthcare, and existing incentives and support systems for physicians' handling of patients' sickness certification. Many physicians described feelings of fatigue and a lack of pride in their work with sickness certification tasks, as they believed they contributed to unnecessary sickness absence and to medicalization of patients' non-medical problems. Conclusions: The problems identified have negative consequences both for patients and for the well-being of physicians. Many of the problems seem related to inadequate leadership and management of sickness certification issues. Therefore, they cannot be handled merely by training of physicians, which has so far been the main intervention in this area. They also have to be addressed on manager levels within healthcare. Further research is needed on how physicians cope with the problems identified and on managers' strategies and responsibilities in relation to these problems. If the complexity of the problems is not recognized, there is a risk that inadequate actions will be taken to solve them.

  16. Benchmark Problems Used to Assess Computational Aeroacoustics Codes

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Envia, Edmane

    2005-01-01

    The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront in developing and promoting the development of CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.

  17. Testing of Environmental Satellite Bus-Instrument Interfaces Using Engineering Models

    NASA Technical Reports Server (NTRS)

    Gagnier, Donald; Hayner, Rick; Nosek, Thomas; Roza, Michael; Hendershot, James E.; Razzaghi, Andrea I.

    2004-01-01

    This paper discusses the formulation and execution of a laboratory test of the electrical interfaces between multiple atmospheric scientific instruments and the spacecraft bus that carries them. The testing, performed in 2002, used engineering models of the instruments and the Aura spacecraft bus electronics. Aura is one of NASA s Earth Observatory System missions. The test was designed to evaluate the complex interfaces in the command and data handling subsystems prior to integration of the complete flight instruments on the spacecraft. A problem discovered during the flight integration phase of the observatory can cause significant cost and schedule impacts. The tests successfully revealed problems and led to their resolution before the full-up integration phase, saving significant cost and schedule. This approach could be beneficial for future environmental satellite programs involving the integration of multiple, complex scientific instruments onto a spacecraft bus.

  18. Exponential convergence through linear finite element discretization of stratified subdomains

    NASA Astrophysics Data System (ADS)

    Guddati, Murthy N.; Druskin, Vladimir; Vaziri Astaneh, Ali

    2016-10-01

    Motivated by problems where the response is needed at select localized regions in a large computational domain, we devise a novel finite element discretization that results in exponential convergence at pre-selected points. The key features of the discretization are (a) use of midpoint integration to evaluate the contribution matrices, and (b) an unconventional mapping of the mesh into complex space. Named complex-length finite element method (CFEM), the technique is linked to Padé approximants that provide exponential convergence of the Dirichlet-to-Neumann maps and thus the solution at specified points in the domain. Exponential convergence facilitates drastic reduction in the number of elements. This, combined with sparse computation associated with linear finite elements, results in significant reduction in the computational cost. The paper presents the basic ideas of the method as well as illustration of its effectiveness for a variety of problems involving Laplace, Helmholtz and elastodynamics equations.

  19. Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Diskin, Boris

    2012-01-01

    A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.

  20. A Multiobjective Interval Programming Model for Wind-Hydrothermal Power System Dispatching Using 2-Step Optimization Algorithm

    PubMed Central

    Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop reasonable plans to schedule power generation efficiently. However, future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is nonlinear in nature; achieving an accurate solution to such a complex problem is therefore very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem in order to find feasible, preliminary solutions and construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision. PMID:24895663

  1. A multiobjective interval programming model for wind-hydrothermal power system dispatching using 2-step optimization algorithm.

    PubMed

    Ren, Kun; Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop reasonable plans to schedule power generation efficiently. However, future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is nonlinear in nature; achieving an accurate solution to such a complex problem is therefore very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem in order to find feasible, preliminary solutions and construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision.
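
    The second step above is a simulated-annealing search. The following generic sketch shows the accept/reject structure of such a search with a placeholder objective and neighborhood move; it is a rough illustration under those assumptions, not the dispatching model itself.

```python
# Hedged sketch: generic simulated annealing with placeholder objective and move.
import math, random

def simulated_annealing(objective, x0, neighbor, t0=1.0, cooling=0.95, iters=2000):
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x)
        fy = objective(y)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fy <= fx or random.random() < math.exp((fx - fy) / max(t, 1e-12)):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy usage: minimize a 1-D quadratic with random-walk neighbors.
sol, val = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0,
                               lambda x: x + random.uniform(-0.5, 0.5))
print(sol, val)
```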

  2. Cognition of an expert tackling an unfamiliar conceptual physics problem

    NASA Astrophysics Data System (ADS)

    Schuster, David; Undreiu, Adriana

    2009-11-01

    We have investigated and analyzed the cognition of an expert tackling a qualitative conceptual physics problem of an unfamiliar type. Our goal was to elucidate the detailed cognitive processes and knowledge elements involved, irrespective of final solution form, and consider implications for instruction. The basic but non-trivial problem was to find qualitatively the direction of acceleration of a pendulum bob at various stages of its motion, a problem originally studied by Reif and Allen. Methodology included interviews, introspection, retrospection and self-reported metacognition. Multiple facets of cognition were revealed, with different reasoning strategies used at different stages and for different points on the path. An account is given of the zigzag thinking paths and interplay of reasoning modes and schema elements involved. We interpret the cognitive processes in terms of theoretical concepts that emerged, namely: case-based, principle-based, experiential-intuitive and practical-heuristic reasoning; knowledge elements and schemata; activation; metacognition and epistemic framing. The complexity of cognition revealed in this case study contrasts with the tidy principle-based solutions we present to students. The pervasive role of schemata, case-based reasoning, practical heuristic strategies, and their interplay with physics principles is noteworthy, since these aspects of cognition are generally neither recognized nor taught. The schema/reasoning-mode perspective has direct application in science teaching, learning and problem-solving.

  3. Composition of web services using Markov decision processes and dynamic programming.

    PubMed

    Uc-Cetina, Víctor; Moo-Mena, Francisco; Hernandez-Ucan, Rafael

    2015-01-01

    We propose a Markov decision process model for solving the Web service composition (WSC) problem. Iterative policy evaluation, value iteration, and policy iteration algorithms are used to experimentally validate our approach, with artificial and real data. The experimental results show the reliability of the model and the methods employed, with policy iteration being the best one in terms of the minimum number of iterations needed to estimate an optimal policy, with the highest Quality of Service attributes. Our experimental work shows that the solution of a WSC problem involving a set of 100,000 individual Web services, where a valid composition requires the selection of 1,000 services from the available set, can be computed in the worst case in less than 200 seconds, using an Intel Core i5 computer with 6 GB RAM. Moreover, a real WSC problem involving only 7 individual Web services requires less than 0.08 seconds, using the same computational power. Finally, a comparison with two popular reinforcement learning algorithms, sarsa and Q-learning, shows that these algorithms require one to two orders of magnitude more time than policy iteration, iterative policy evaluation, and value iteration to handle WSC problems of the same complexity.
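
    As a minimal illustration of the dynamic-programming machinery evaluated in the paper, the sketch below runs value iteration on a toy two-state decision process; the states, actions, and rewards are placeholders rather than a Web service composition instance.

```python
# Illustrative sketch: value iteration on a toy MDP, not the paper's WSC model.
def value_iteration(states, actions, transition, reward, gamma=0.95, tol=1e-6):
    """transition(s, a) -> list of (prob, next_state); reward(s, a) -> float."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            q = [reward(s, a) + gamma * sum(p * V[s2] for p, s2 in transition(s, a))
                 for a in actions(s)]
            best = max(q) if q else 0.0
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

# Toy 2-state example: from "start", action "go" reaches the terminal "goal".
states = ["start", "goal"]
actions = lambda s: ["stay", "go"] if s == "start" else []
transition = lambda s, a: [(1.0, "goal")] if a == "go" else [(1.0, "start")]
reward = lambda s, a: 1.0 if a == "go" else 0.0
print(value_iteration(states, actions, transition, reward))
```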

  4. Group Development and Integration in a Cross-Disciplinary and Intercultural Research Team.

    PubMed

    Kirk-Lawlor, Naomi; Allred, Shorna

    2017-04-01

    Cross-disciplinary research is necessary to solve many complex problems that affect society today, including problems involving linked social and environmental systems. Examples include natural resource management or scarcity problems, problematic effects of climate change, and environmental pollution issues. Intercultural research teams are needed to address many complex environmental matters as they often cross geographic and political boundaries, and involve people of different countries and cultures. It follows that disciplinarily and culturally diverse research teams have been organized to investigate and address environmental issues. This case study investigates a team composed of both monolingual and bilingual Chilean and US university researchers who are geoscientists, engineers and economists. The objective of this research team was to study both the natural and human parts of a hydrologic system in a hyper-arid region in northern Chile. Interviews (n = 8) addressed research questions focusing on the interaction of cross-disciplinary diversity and cultural diversity during group integration and development within the team. The case study revealed that the group struggled more with cross-disciplinary challenges than with intercultural ones. Particularly challenging were instances of disciplinary crosstalk, or hidden misunderstandings, where team members thought they understood their cross-disciplinary colleagues when in reality they did not. Results showed that translation served as a facilitator to cross-disciplinary integration of the research team. The use of translation in group meetings as a strategy for effective cross-disciplinary integration can be extended to monolingual cross-disciplinary teams as well.

  5. Group Development and Integration in a Cross-Disciplinary and Intercultural Research Team

    NASA Astrophysics Data System (ADS)

    Kirk-Lawlor, Naomi; Allred, Shorna

    2017-04-01

    Cross-disciplinary research is necessary to solve many complex problems that affect society today, including problems involving linked social and environmental systems. Examples include natural resource management or scarcity problems, problematic effects of climate change, and environmental pollution issues. Intercultural research teams are needed to address many complex environmental matters as they often cross geographic and political boundaries, and involve people of different countries and cultures. It follows that disciplinarily and culturally diverse research teams have been organized to investigate and address environmental issues. This case study investigates a team composed of both monolingual and bilingual Chilean and US university researchers who are geoscientists, engineers and economists. The objective of this research team was to study both the natural and human parts of a hydrologic system in a hyper-arid region in northern Chile. Interviews (n = 8) addressed research questions focusing on the interaction of cross-disciplinary diversity and cultural diversity during group integration and development within the team. The case study revealed that the group struggled more with cross-disciplinary challenges than with intercultural ones. Particularly challenging were instances of disciplinary crosstalk, or hidden misunderstandings, where team members thought they understood their cross-disciplinary colleagues when in reality they did not. Results showed that translation served as a facilitator to cross-disciplinary integration of the research team. The use of translation in group meetings as a strategy for effective cross-disciplinary integration can be extended to monolingual cross-disciplinary teams as well.

  6. Psychopathology in pediatric epilepsy: role of antiepileptic drugs.

    PubMed

    Caplan, Rochelle

    2012-01-01

    Children with epilepsy are usually treated with antiepileptic drugs (AEDS). Some AEDs adversely affect behavior in susceptible children. Since psychiatric comorbidity is prevalent in pediatric epilepsy, this paper attempts to disentangle these AED side effects from the psychopathology associated with this illness. It first outlines the clinical and methodological problems involved in determining if AEDs contribute to the behavior and emotional problems of children with epilepsy. It then presents research evidence for and against the role AEDs play in the psychopathology of children with epilepsy, and outlines how future studies might investigate this problem. A brief description of how to clinically separate out AED effects from the complex illness-related and psychosocial factors that contribute to the behavior difficulties of children with epilepsy concludes the paper.

  7. Summary of the Tandem Cylinder Solutions from the Benchmark Problems for Airframe Noise Computations-I Workshop

    NASA Technical Reports Server (NTRS)

    Lockard, David P.

    2011-01-01

    Fifteen submissions in the tandem cylinders category of the First Workshop on Benchmark Problems for Airframe Noise Computations are summarized. Although the geometry is relatively simple, the problem involves complex physics. Researchers employed various block-structured, overset, unstructured and embedded Cartesian grid techniques and considerable computational resources to simulate the flow. The solutions are compared against each other and against experimental data from two facilities. Overall, the simulations captured the gross features of the flow, but resolving all the details that would be necessary to compute the noise remains challenging. In particular, how best to simulate the effects of the experimental transition strip, and the associated high Reynolds number effects, was unclear. Furthermore, capturing the spanwise variation proved difficult.

  8. The Davey-Stewartson Equation on the Half-Plane

    NASA Astrophysics Data System (ADS)

    Fokas, A. S.

    2009-08-01

    The Davey-Stewartson (DS) equation is a nonlinear integrable evolution equation in two spatial dimensions. It provides a multidimensional generalisation of the celebrated nonlinear Schrödinger (NLS) equation and it appears in several physical situations. The implementation of the Inverse Scattering Transform (IST) for the solution of the initial-value problem of the NLS was presented in 1972, whereas the analogous problem for the DS equation was solved in 1983. These results are based on the formulation and solution of certain classical problems in complex analysis, namely a Riemann-Hilbert (RH) problem and either a d-bar or a non-local RH problem, respectively. A method for solving the mathematically more complicated but physically more relevant case of boundary-value problems for evolution equations in one spatial dimension, like the NLS, was finally presented in 1997, after introducing several novel ideas into the panoply of the IST methodology. Here, this method is further extended so that it can be applied to evolution equations in two spatial dimensions, like the DS equation. This novel extension involves several new steps, including the formulation of a d-bar problem for a sectionally non-analytic function, i.e. a function which has different non-analytic representations in different domains of the complex plane. This, in addition to the computation of a d-bar derivative, also requires the computation of the relevant jumps across the different domains. This latter step has certain similarities with (but is more complicated than) the corresponding step for those initial-value problems in two dimensions which can be solved via a non-local RH problem, like KPI.

  9. Uncertainties in building a strategic defense.

    PubMed

    Zraket, C A

    1987-03-27

    Building a strategic defense against nuclear ballistic missiles involves complex and uncertain functional, spatial, and temporal relations. Such a defensive system would evolve and grow over decades. It is too complex, dynamic, and interactive to be fully understood initially by design, analysis, and experiments. Uncertainties exist in the formulation of requirements and in the research and design of a defense architecture that can be implemented incrementally and be fully tested to operate reliably. The analysis and measurement of system survivability, performance, and cost-effectiveness are critical to this process. Similar complexities exist for an adversary's system that would suppress or use countermeasures against a missile defense. Problems and opportunities posed by these relations are described, with emphasis on the unique characteristics and vulnerabilities of space-based systems.

  10. Immersed boundary methods for simulating fluid-structure interaction

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, Fotis; Yang, Xiaolei

    2014-02-01

    Fluid-structure interaction (FSI) problems commonly encountered in engineering and biological applications involve geometrically complex flexible or rigid bodies undergoing large deformations. Immersed boundary (IB) methods have emerged as a powerful simulation tool for tackling such flows due to their inherent ability to handle arbitrarily complex bodies without the need for expensive and cumbersome dynamic re-meshing strategies. Depending on the approach such methods adopt to satisfy boundary conditions on solid surfaces they can be broadly classified as diffused and sharp interface methods. In this review, we present an overview of the fundamentals of both classes of methods with emphasis on solution algorithms for simulating FSI problems. We summarize and juxtapose different IB approaches for imposing boundary conditions, efficient iterative algorithms for solving the incompressible Navier-Stokes equations in the presence of dynamic immersed boundaries, and strong and loose coupling FSI strategies. We also present recent results from the application of such methods to study a wide range of problems, including vortex-induced vibrations, aquatic swimming, insect flying, human walking and renewable energy. Limitations of such methods and the need for future research to mitigate them are also discussed.

  11. Reasoning about Resources and Hierarchical Tasks Using OWL and SWRL

    NASA Astrophysics Data System (ADS)

    Elenius, Daniel; Martin, David; Ford, Reginald; Denker, Grit

    Military training and testing events are highly complex affairs, potentially involving dozens of legacy systems that need to interoperate in a meaningful way. There are superficial interoperability concerns (such as two systems not sharing the same messaging formats), but also substantive problems such as different systems not sharing the same understanding of the terrain, positions of entities, and so forth. We describe our approach to facilitating such events: describe the systems and requirements in great detail using ontologies, and use automated reasoning to automatically find and help resolve problems. The complexity of our problem took us to the limits of what one can do with OWL, and we needed to introduce some innovative techniques of using and extending it. We describe our novel ways of using SWRL and discuss its limitations as well as extensions to it that we found necessary or desirable. Another innovation is our representation of hierarchical tasks in OWL, and an engine that reasons about them. Our task ontology has proved to be a very flexible and expressive framework to describe requirements on resources and their capabilities in order to achieve some purpose.

  12. Application of heuristic satellite plan synthesis algorithms to requirements of the WARC-88 allotment plan

    NASA Technical Reports Server (NTRS)

    Heyward, Ann O.; Reilly, Charles H.; Walton, Eric K.; Mata, Fernando; Olen, Carl

    1990-01-01

    Creation of an Allotment Plan for the Fixed Satellite Service at the 1988 Space World Administrative Radio Conference (WARC) represented a complex satellite plan synthesis problem, involving a large number of planned and existing systems. Solutions to this problem at WARC-88 required the use of both automated and manual procedures to develop an acceptable set of system positions. Development of an Allotment Plan may also be attempted through solution of an optimization problem, known as the Satellite Location Problem (SLP). Three automated heuristic procedures, developed specifically to solve SLP, are presented. The heuristics are then applied to two specific WARC-88 scenarios. Solutions resulting from the fully automated heuristics are then compared with solutions obtained at WARC-88 through a combination of both automated and manual planning efforts.

  13. A case of Rocky Mountain spotted fever.

    PubMed

    Rubel, Barry S

    2007-01-01

    Rocky Mountain spotted fever is a serious, generalized infection that is spread to humans through the bite of infected ticks. It can be lethal but it is curable. The disease gets its name from the Rocky Mountain region where it was first identified in 1896. The fever is caused by the bacterium Rickettsia rickettsii and is maintained in nature in a complex life cycle involving ticks and mammals. Humans are considered to be accidental hosts and are not involved in the natural transmission cycle of this pathogen. The author examined a 47-year-old woman during a periodic recall appointment. The patient had no dental problems other than the need for routine prophylaxis but mentioned a recent problem with swelling of her extremities with an accompanying rash and general malaise and soreness in her neck region. Tests were conducted and a diagnosis of Rocky Mountain spotted fever was made.

  14. The birth of tragedy in pediatrics: a phronetic conception of bioethics.

    PubMed

    Carnevale, Franco A

    2007-09-01

    Accepted standards of parental decisional autonomy and child best interests do not address adequately the complex moral problems involved in the care of critically ill children. A growing body of moral discourse is calling for the recognition of ;tragedy' in selected human problems. A tragic dilemma is an irresolvable dilemma with forced terrible alternatives, where even the virtuous agent inescapably emerges with ;dirty hands'. The shift in moral framework described here recognizes that the form of conduct called for by tragic dilemmas is the practice of phronesis. The phronetic agent has acquired a capacity to discern good agency in tragic circumstances. This discernment is practiced through the artful creation of moral narratives: stories that convey that which is morally meaningful in a particular situation; that is, stories that are ;meaning making'. The phronetic agent addresses tragic dilemmas involving children as a narrator of contextualized temporal embodied human (counter)stories.

  15. [Reflex epilepsy evoked by decision making: report of a case (author's transl)].

    PubMed

    Mutani, R; Ganga, A; Agnetti, V

    1980-01-01

    A 17-year-old girl with a history of grand mal attacks occurring during mathematics lessons or while solving mathematical problems was investigated with prolonged EEG recordings. During the sessions, relaxation periods were alternated with arithmetical or mathematical testing, with card or checkers games and the solution of puzzles and crossword problems, and with different neuropsychological tests. The EEG recordings were characterized by the appearance, on a normal background, of bilaterally synchronous and symmetrical spike-and-wave and polyspike-and-wave discharges, associated with loss of consciousness. During relaxation their mean frequency was one per 54 min; it doubled during tests involving nonsequential decision making, and was eight times as high (one per 7 min) during tests involving sequential decision making. Tension, challenge and complexity of the performance were also important precipitating factors: their absence deprived sequential tests of their efficacy, whereas their presence sometimes gave nonsequential tests full efficacy.

  16. Disentangling the stochastic behavior of complex time series

    NASA Astrophysics Data System (ADS)

    Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus

    2016-10-01

    Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events - or jumps - with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a main problem for a detailed understanding of the stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of stochastic dynamical jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes.
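
    The following is a minimal, assumed sketch of the kind of non-parametric estimation this abstract describes: state-dependent drift and second-moment (diffusion-plus-jump) terms are approximated from conditional moments of the increments of a measured time series, binned by the current state. The estimators, bin counts and the toy jump-diffusion path are illustrative only and do not reproduce the authors' actual jump/diffusion separation, which relies on higher-order conditional moments.

        import numpy as np

        def conditional_moments(x, dt, nbins=30):
            # Bin the series by its current value and estimate, per bin,
            # the first and second conditional moments of the increments.
            dx = np.diff(x)
            xc = x[:-1]
            edges = np.linspace(xc.min(), xc.max(), nbins + 1)
            centers = 0.5 * (edges[:-1] + edges[1:])
            drift = np.full(nbins, np.nan)
            m2 = np.full(nbins, np.nan)
            for i in range(nbins):
                mask = (xc >= edges[i]) & (xc < edges[i + 1])
                if mask.sum() > 10:                      # require enough samples per bin
                    drift[i] = dx[mask].mean() / dt      # drift-like term
                    m2[i] = (dx[mask] ** 2).mean() / dt  # mixes diffusive and jump contributions
            return centers, drift, m2

        # Toy Ornstein-Uhlenbeck path with rare additive jumps (illustration only).
        rng = np.random.default_rng(0)
        dt, n = 1e-3, 200_000
        x = np.zeros(n)
        for k in range(1, n):
            jump = rng.normal(0.0, 0.5) if rng.random() < 0.001 else 0.0
            x[k] = x[k - 1] - 2.0 * x[k - 1] * dt + 0.3 * np.sqrt(dt) * rng.normal() + jump

        centers, drift, m2 = conditional_moments(x, dt)

    In the jumpy case the second conditional moment mixes diffusive and jump contributions; separating the two requires the higher-order moments exploited in the paper, which are omitted here.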

  17. Qualitative review of usability problems in health information systems for radiology.

    PubMed

    Dias, Camila Rodrigues; Pereira, Marluce Rodrigues; Freire, André Pimenta

    2017-12-01

    Radiology processes are commonly supported by Radiology Information System (RIS), Picture Archiving and Communication System (PACS) and other software for radiology. However, these information technologies can present usability problems that affect the performance of radiologists and physicians, especially considering the complexity of the tasks involved. The purpose of this study was to extract, classify and analyze qualitatively the usability problems in PACS, RIS and other software for radiology. A systematic review was performed to extract usability problems reported in empirical usability studies in the literature. The usability problems were categorized as violations of Nielsen and Molich's usability heuristics. The qualitative analysis indicated the causes and the effects of the identified usability problems. From the 431 papers initially identified, 10 met the study criteria. The analysis of the papers identified 90 instances of usability problems, classified into categories corresponding to established usability heuristics. The five heuristics with the highest number of instances of usability problems were "Flexibility and efficiency of use", "Consistency and standards", "Match between system and the real world", "Recognition rather than recall" and "Help and documentation", respectively. These problems can make the interaction time consuming, causing delays in tasks, dissatisfaction, frustration, preventing users from enjoying all the benefits and functionalities of the system, as well as leading to more errors and difficulties in carrying out clinical analyses. Furthermore, the present paper showed a lack of studies performed on systems for radiology, especially usability evaluations using formal methods of evaluation involving the final users. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Efficient estimation of ideal-observer performance in classification tasks involving high-dimensional complex backgrounds

    PubMed Central

    Park, Subok; Clarkson, Eric

    2010-01-01

    The Bayesian ideal observer is optimal among all observers and sets an absolute upper bound for the performance of any observer in classification tasks [Van Trees, Detection, Estimation, and Modulation Theory, Part I (Academic, 1968)]. Therefore, the ideal observer should be used for objective image quality assessment whenever possible. However, computation of ideal-observer performance is difficult in practice because this observer requires a full description of the unknown statistical properties of the high-dimensional, complex data arising in real-life problems. Previously, Markov-chain Monte Carlo (MCMC) methods were developed by Kupinski et al. [J. Opt. Soc. Am. A 20, 430 (2003)] and by Park et al. [J. Opt. Soc. Am. A 24, B136 (2007) and IEEE Trans. Med. Imaging 28, 657 (2009)] to estimate the performance of the ideal observer and the channelized ideal observer (CIO), respectively, in classification tasks involving non-Gaussian random backgrounds. However, both algorithms had the disadvantage of long computation times. We propose a fast MCMC for real-time estimation of the likelihood ratio for the CIO. Our simulation results show that our method has the potential to speed up the estimation of ideal-observer performance in tasks involving complex data when efficient channels are used for the CIO. PMID:19884916
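
    As a heavily simplified, assumed illustration of the Monte Carlo idea behind such likelihood-ratio estimators (plain Monte Carlo rather than the Markov-chain samplers or channelized observer of the papers cited above): when the data depend on an unknown random background, the likelihood under each hypothesis is a marginal over backgrounds and can be approximated by averaging over background samples. All distributions, dimensions and the signal model below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        def likelihood(g, b, signal_present, sigma=1.0, s=0.8):
            # Gaussian likelihood of data g given background b, with or
            # without a known additive signal of amplitude s.
            mean = b + (s if signal_present else 0.0)
            return np.exp(-0.5 * np.sum((g - mean) ** 2) / sigma ** 2)

        def likelihood_ratio(g, n_samples=5000):
            # Monte Carlo estimate of p(g|H1)/p(g|H0), marginalizing both
            # likelihoods over a lognormal background prior.
            num = den = 0.0
            for _ in range(n_samples):
                b = rng.lognormal(mean=0.0, sigma=0.3, size=g.shape)
                num += likelihood(g, b, signal_present=True)
                den += likelihood(g, b, signal_present=False)
            return num / den

        g = rng.lognormal(0.0, 0.3, size=16) + 0.8 + rng.normal(0.0, 1.0, size=16)
        print(likelihood_ratio(g))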

  19. Developing dimensions for a multicomponent multidisciplinary approach to obesity management: a qualitative study.

    PubMed

    Cochrane, Anita J; Dick, Bob; King, Neil A; Hills, Andrew P; Kavanagh, David J

    2017-10-16

    There have been consistent recommendations for multicomponent and multidisciplinary approaches for obesity management. However, there is no clear agreement on the components, disciplines or processes to be considered within such an approach. In this study, we explored multicomponent and multidisciplinary approaches through an examination of knowledge, skills, beliefs, and recommendations of stakeholders involved in obesity management. These stakeholders included researchers, practitioners, educators, and patients. We used qualitative action research methods, including convergent interviewing and observation, to assist the process of inquiry. The consensus was that a multicomponent and multidisciplinary approach should be based on four central meta-components (patient, practitioner, process, and environmental factors), and specific components of these factors were identified. Psychologists, dieticians, exercise physiologists and general practitioners were nominated as key practitioners to be included. A complex condition like obesity requires that multiple components be addressed, and that both patients and multiple disciplines are involved in developing solutions. Implementing cycles of continuous improvement to deal with complexity, instead of trying to control for it, offers an effective way to deal with complex, changing multisystem problems like obesity.

  20. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  1. Thorium–phosphorus triamidoamine complexes containing Th–P single- and multiple-bond interactions

    PubMed Central

    Wildman, Elizabeth P.; Balázs, Gábor; Wooles, Ashley J.; Scheer, Manfred; Liddle, Stephen T.

    2016-01-01

    Despite the burgeoning field of uranium-ligand multiple bonds, analogous complexes involving other actinides remain scarce. For thorium, under ambient conditions only a few multiple bonds to carbon, nitrogen, oxygen, sulfur, selenium and tellurium are reported, and no multiple bonds to phosphorus are known, reflecting a general paucity of synthetic methodologies and also problems associated with stabilising these linkages at the large thorium ion. Here we report structurally authenticated examples of a parent thorium(IV)–phosphanide (Th–PH2), a terminal thorium(IV)–phosphinidene (Th=PH), a parent dithorium(IV)–phosphinidiide (Th–P(H)–Th) and a discrete actinide–phosphido complex under ambient conditions (Th=P=Th). Although thorium is traditionally considered to have dominant 6d-orbital contributions to its bonding, in contrast to the majority 5f-orbital character for uranium, computational analyses suggest that the bonding of thorium can be more nuanced in terms of 5f- versus 6d-orbital composition, the significant involvement of the 7s-orbital, and how this affects the balance of 5f- versus 6d-orbital bonding character. PMID:27682617

  2. Thorium-phosphorus triamidoamine complexes containing Th-P single- and multiple-bond interactions.

    PubMed

    Wildman, Elizabeth P; Balázs, Gábor; Wooles, Ashley J; Scheer, Manfred; Liddle, Stephen T

    2016-09-29

    Despite the burgeoning field of uranium-ligand multiple bonds, analogous complexes involving other actinides remain scarce. For thorium, under ambient conditions only a few multiple bonds to carbon, nitrogen, oxygen, sulfur, selenium and tellurium are reported, and no multiple bonds to phosphorus are known, reflecting a general paucity of synthetic methodologies and also problems associated with stabilising these linkages at the large thorium ion. Here we report structurally authenticated examples of a parent thorium(IV)-phosphanide (Th-PH2), a terminal thorium(IV)-phosphinidene (Th=PH), a parent dithorium(IV)-phosphinidiide (Th-P(H)-Th) and a discrete actinide-phosphido complex under ambient conditions (Th=P=Th). Although thorium is traditionally considered to have dominant 6d-orbital contributions to its bonding, in contrast to the majority 5f-orbital character for uranium, computational analyses suggest that the bonding of thorium can be more nuanced in terms of 5f- versus 6d-orbital composition, the significant involvement of the 7s-orbital, and how this affects the balance of 5f- versus 6d-orbital bonding character.

  3. European Science Notes Information Bulletin Reports on Current European/Middle Eastern Science

    DTIC Science & Technology

    1991-06-01

    particularly those that involve shock wave/boundary layer cell-centered, finite-volume, explicit, Runge-Kutta interactions, still provide considerable...aircraft configuration attributed to using an interactive visual grid generation was provided by A. Bocci and A. Baxendale, the Aircraft system developed...the surface pressure the complex problem of wing/body/pylon/store distributions with and without the mass flow through the interaction. Reasonable

  4. Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures

    DTIC Science & Technology

    2010-07-04

    problem) can be undecidable. While two gravitationally interacting bodies yield simple orbits, Poincaré showed that the motion of even three...statistical mechanics are valid only when the [billiard] balls are distributed, in their positions and motions, in a helter-skelter, i.e., a disorganized...Rube Goldberg, whose famous cartoons depict “comically involved complicated invention[s], laboriously contrived to perform a simple operation” [68

  5. Towards safer, better healthcare: harnessing the natural properties of complex sociotechnical systems.

    PubMed

    Braithwaite, J; Runciman, W B; Merry, A F

    2009-02-01

    Objectives: To sustain an argument that harnessing the natural properties of sociotechnical systems is necessary to promote safer, better healthcare. Methods: Triangulated analyses of discrete literature sources, particularly drawing on those from mathematics, sociology, marketing science and psychology. Results: Progress involves the use of natural networks and exploiting features such as their scale-free and small world nature, as well as characteristics of group dynamics like natural appeal (stickiness) and propagation (tipping points). The agenda for change should be set by prioritising problems in natural categories, addressed by groups who self select on the basis of their natural interest in the areas in question, and who set clinical standards and develop tools, the use of which should be monitored by peers. This approach will facilitate the evidence-based practice that most agree is now overdue, but which has not yet been realised by the application of conventional methods. Conclusion: A key to health system transformation may lie under-recognised under our noses, and involves exploiting the naturally-occurring characteristics of complex systems. Current strategies to address healthcare problems are insufficient. Clinicians work best when their expertise is mobilised, and they flourish in groupings of their own interests and preference. Being invited, empowered and nurtured rather than directed, micro-managed and controlled through a hierarchy is preferable.

  6. Convolving engineering and medical pedagogies for training of tomorrow's health care professionals.

    PubMed

    Lee, Raphael C

    2013-03-01

    Several fundamental benefits justify why biomedical engineering and medicine should form a more convergent alliance, especially for the training of tomorrow's physicians and biomedical engineers. Herein, we review the rationale underlying these benefits. Biological discovery has advanced beyond the era of molecular biology well into today's era of molecular systems biology, which focuses on understanding the rules that govern the behavior of complex living systems. This has important medical implications. To realize cost-effective personalized medicine, it is necessary to translate the advances in molecular systems biology to higher levels of biological organization (organ, system, and organismal levels) and then to develop new medical therapeutics based on simulation and medical informatics analysis. Higher education in biological and medical sciences must adapt to a new set of training objectives. This will involve a shift away from reductionist problem solving toward more integrative, continuum, and predictive modeling approaches, which traditionally have been more associated with engineering science. Future biomedical engineers and MDs must be able to predict clinical response to therapeutic intervention. Medical education will involve engineering pedagogies, teaching the basic governing rules of complex system behavior and the skill sets needed to manipulate these systems toward a practical desired outcome. Similarly, graduate biomedical engineering programs will include more practical exposure to clinical problem solving.

  7. Towards safer, better healthcare: harnessing the natural properties of complex sociotechnical systems

    PubMed Central

    Braithwaite, J; Runciman, W B; Merry, A F

    2009-01-01

    Objectives: To sustain an argument that harnessing the natural properties of sociotechnical systems is necessary to promote safer, better healthcare. Methods: Triangulated analyses of discrete literature sources, particularly drawing on those from mathematics, sociology, marketing science and psychology. Results: Progress involves the use of natural networks and exploiting features such as their scale-free and small world nature, as well as characteristics of group dynamics like natural appeal (stickiness) and propagation (tipping points). The agenda for change should be set by prioritising problems in natural categories, addressed by groups who self select on the basis of their natural interest in the areas in question, and who set clinical standards and develop tools, the use of which should be monitored by peers. This approach will facilitate the evidence-based practice that most agree is now overdue, but which has not yet been realised by the application of conventional methods. Conclusion: A key to health system transformation may lie under-recognised under our noses, and involves exploiting the naturally-occurring characteristics of complex systems. Current strategies to address healthcare problems are insufficient. Clinicians work best when their expertise is mobilised, and they flourish in groupings of their own interests and preference. Being invited, empowered and nurtured rather than directed, micro-managed and controlled through a hierarchy is preferable. PMID:19204130

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koniges, A.E.; Craddock, G.G.; Schnack, D.D.

    The purpose of the workshop was to assemble workers, both within and outside of the fusion-related computation areas, for discussion regarding the issues of dynamically adaptive gridding. There were three invited talks related to adaptive gridding application experiences in various related fields of computational fluid dynamics (CFD), and nine short talks reporting on the progress of adaptive techniques in the specific areas of scrape-off-layer (SOL) modeling and magnetohydrodynamic (MHD) stability. Adaptive mesh methods have been successful in a number of diverse fields of CFD for over a decade. The method involves dynamic refinement of computed field profiles in a way that uniformly disperses the numerical errors associated with discrete approximations. Because the process optimizes computational effort, adaptive mesh methods can be used to study otherwise intractable physical problems that involve complex boundary shapes or multiple spatial/temporal scales. Recent results indicate that these adaptive techniques will be required for tokamak fluid-based simulations involving diverted tokamak SOL modeling and for MHD simulation problems related to the highest-priority ITER-relevant issues. Individual papers are indexed separately on the energy databases.
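
    As a rough illustration of the refinement idea summarized above (flag cells where a local error indicator is large, subdivide them, and repeat), the sketch below uses a crude gradient-based indicator on a one-dimensional grid; production SOL/MHD codes use problem-specific error estimators and far more elaborate remeshing machinery, so this is only a schematic reduction.

        import numpy as np

        def refine_once(x, field, tol=0.05):
            # Insert a midpoint into every cell whose jump in the sampled
            # field exceeds tol (a crude local error indicator).
            u = field(x)
            new_nodes = [0.5 * (x[i] + x[i + 1])
                         for i in range(len(x) - 1)
                         if abs(u[i + 1] - u[i]) > tol]
            return np.sort(np.concatenate([x, new_nodes]))

        profile = lambda x: np.tanh((x - 0.7) / 0.02)   # sharp layer to be resolved
        grid = np.linspace(0.0, 1.0, 11)
        for _ in range(6):                              # repeated refinement passes
            grid = refine_once(grid, profile)
        print(len(grid), "nodes after refinement")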

  9. Fourth Computational Aeroacoustics (CAA) Workshop on Benchmark Problems

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D. (Editor)

    2004-01-01

    This publication contains the proceedings of the Fourth Computational Aeroacoustics (CAA) Workshop on Benchmark Problems. In this workshop, as in previous workshops, the problems were devised to gauge the technological advancement of computational techniques to calculate all aspects of sound generation and propagation in air directly from the fundamental governing equations. A variety of benchmark problems have been previously solved ranging from simple geometries with idealized acoustic conditions to test the accuracy and effectiveness of computational algorithms and numerical boundary conditions; to sound radiation from a duct; to gust interaction with a cascade of airfoils; to the sound generated by a separating, turbulent viscous flow. By solving these and similar problems, workshop participants have shown the technical progress from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The fourth CAA workshop emphasized the application of CAA methods to the solution of realistic problems. The workshop was held at the Ohio Aerospace Institute in Cleveland, Ohio, on October 20 to 22, 2003. At that time, workshop participants presented their solutions to problems in one or more of five categories. Their solutions are presented in this proceedings along with the comparisons of their solutions to the benchmark solutions or experimental data. The five categories for the benchmark problems were as follows: Category 1: Basic Methods. The numerical computation of sound is affected by, among other issues, the choice of grid used and by the boundary conditions. Category 2: Complex Geometry. The ability to compute the sound in the presence of complex geometric surfaces is important in practical applications of CAA. Category 3: Sound Generation by Interacting With a Gust. The practical application of CAA for computing noise generated by turbomachinery involves the modeling of the noise source mechanism as a vortical gust interacting with an airfoil. Category 4: Sound Transmission and Radiation. Category 5: Sound Generation in Viscous Problems. Sound is generated under certain conditions by a viscous flow as the flow passes an object or a cavity.

  10. Computations of Drop Collision and Coalescence

    NASA Technical Reports Server (NTRS)

    Tryggvason, Gretar; Juric, Damir; Nas, Selman; Mortazavi, Saeed

    1996-01-01

    Computations of drop collisions, coalescence, and other problems involving drops are presented. The computations are made possible by a finite difference/front tracking technique that allows direct solutions of the Navier-Stokes equations for a multi-fluid system with complex, unsteady internal boundaries. This method has been used to examine the various collision modes for binary collisions of drops of equal size, mixing of two drops of unequal size, the behavior of a suspension of drops in linear and parabolic shear flows, and the thermal migration of several drops. The key results from these simulations are reviewed. Extensions of the method to phase change problems and preliminary results for boiling are also shown.

  11. Chemistry, manufacturing and controls in passive transdermal drug delivery systems.

    PubMed

    Goswami, Tarun; Audett, Jay

    2015-01-01

    Transdermal drug delivery systems (TDDS) are used for the delivery of the drugs through the skin into the systemic circulation by applying them to the intact skin. The development of TDDS is a complex and multidisciplinary affair which involves identification of suitable drug, excipients and various other components. There have been numerous problems reported with respect to TDDS quality and performance. These problems can be reduced by appropriately addressing chemistry, manufacturing and controls requirements, which would thereby result in development of robust TDDS product and processes. This article provides recommendations on the chemistry, manufacturing and controls focusing on the unique technical aspects of TDDS.

  12. The design of multiplayer online video game systems

    NASA Astrophysics Data System (ADS)

    Hsu, Chia-chun A.; Ling, Jim; Li, Qing; Kuo, C.-C. J.

    2003-11-01

    The distributed Multiplayer Online Game (MOG) system is complex since it involves technologies in computer graphics, multimedia, artificial intelligence, computer networking, embedded systems, etc. Due to the large scope of this problem, the design of MOG systems has not yet been widely addressed in the literature. In this paper, we review, analyze and evaluate current MOG system architectures. Furthermore, we propose a clustered-server architecture to provide a scalable solution together with a region-oriented allocation strategy. Two key issues, i.e., interest management and synchronization, are discussed in depth. Some preliminary ideas to deal with the identified problems are described.
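
    A small, assumed sketch of the interest-management issue mentioned above, under a region-partitioned world of the kind a clustered-server architecture implies: a state update is sent only to players whose subscribed region neighbours the region where the update occurred. The grid partition, cell size and names are illustrative, not the authors' design.

        from collections import defaultdict

        CELL = 100.0  # width of one square world region treated as an interest cell

        def region_of(pos):
            # Map a world position to its integer region id.
            return (int(pos[0] // CELL), int(pos[1] // CELL))

        class InterestManager:
            def __init__(self):
                self.subscribers = defaultdict(set)  # region id -> player ids

            def subscribe(self, player_id, pos):
                self.subscribers[region_of(pos)].add(player_id)

            def recipients(self, event_pos):
                # Players that should receive an update at event_pos: those
                # subscribed to that region or to any of its 8 neighbours.
                rx, ry = region_of(event_pos)
                out = set()
                for dx in (-1, 0, 1):
                    for dy in (-1, 0, 1):
                        out |= self.subscribers[(rx + dx, ry + dy)]
                return out

        im = InterestManager()
        im.subscribe("alice", (120.0, 40.0))
        im.subscribe("bob", (950.0, 880.0))
        print(im.recipients((130.0, 55.0)))  # only 'alice' is in range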

  13. Breaking Megrelishvili protocol using matrix diagonalization

    NASA Astrophysics Data System (ADS)

    Arzaki, Muhammad; Triantoro Murdiansyah, Danang; Adi Prabowo, Satrio

    2018-03-01

    In this article we conduct a theoretical security analysis of the Megrelishvili protocol—a linear algebra-based key agreement between two participants. We study the computational complexity of the Megrelishvili vector-matrix problem (MVMP) as a mathematical problem that strongly relates to the security of the Megrelishvili protocol. In particular, we investigate the asymptotic upper bounds for the running time and memory requirement of the MVMP when it involves a diagonalizable public matrix. Specifically, we devise a diagonalization method for solving the MVMP that is asymptotically faster than all of the previously existing algorithms. We also find an important counterintuitive result: the utilization of a primitive matrix in the Megrelishvili protocol makes the protocol more vulnerable to attacks.
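
    The following is a toy, real-valued analogue of the diagonalization idea (the actual protocol and the MVMP are defined over finite algebraic structures, and none of those details are reproduced here): if an attacker observes w = v·A^k for a public vector v and a diagonalizable public matrix A, writing both vectors in the eigenbasis of A exposes the secret exponent through coordinate ratios.

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [0.0, 3.0]])                    # toy diagonalizable public matrix
        v = np.array([1.0, 1.0])                      # public row vector
        k_secret = 7
        w = v @ np.linalg.matrix_power(A, k_secret)   # value visible to an eavesdropper

        # A = P diag(lam) P^(-1), so w P = (v P) diag(lam)^k, i.e. each coordinate
        # of v in the eigenbasis is scaled by lam_i**k.
        lam, P = np.linalg.eig(A)
        vp, wp = v @ P, w @ P
        k_recovered = np.log(wp[0] / vp[0]) / np.log(lam[0])
        print(round(float(k_recovered.real)))         # prints 7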

  14. Mass extinctions: Persistent problems and new directions

    NASA Technical Reports Server (NTRS)

    Jablonski, D.

    1994-01-01

    Few contest that mass extinctions have punctuated the history of life, or that those events were so pervasive environmentally, taxonomically, and geographically that physical forcing factors were probably involved. However, consensus remains elusive on the nature of those factors, and on how a given perturbation - impact, volcanism, sea-level change, or ocean anoxic event - could actually generate the observed intensity and selectivity of biotic losses. At least two basic problems underlie these long-standing disagreements: difficulties in resolving the fine details of taxon ranges and abundances immediately prior to and after an extinction boundary and the scarcity of simple, unitary cause-and-effect relations in complex biological systems.

  15. Identification of unique repeated patterns, location of mutation in DNA finger printing using artificial intelligence technique.

    PubMed

    Mukunthan, B; Nagaveni, N

    2014-01-01

    In genetic engineering, the conventional techniques and algorithms employed by forensic scientists to identify individuals on the basis of their DNA profiles involve complex computational steps and mathematical formulae, and identifying the location of a mutation in a genomic sequence in the laboratory is still a demanding task. The novel approach presented here provides the ability to solve problems that have no algorithmic solution, or for which the available solutions are too complex to be found. The blend of bioinformatics and neural network techniques results in an efficient DNA pattern analysis algorithm with high prediction accuracy.

  16. Applications of Metal Additive Manufacturing in Veterinary Orthopedic Surgery

    NASA Astrophysics Data System (ADS)

    Harrysson, Ola L. A.; Marcellin-Little, Denis J.; Horn, Timothy J.

    2015-03-01

    Veterinary medicine has undergone a rapid increase in specialization over the last three decades. Veterinarians now routinely perform joint replacement, neurosurgery, limb-sparing surgery, interventional radiology, radiation therapy, and other complex medical procedures. Many procedures involve advanced imaging and surgical planning. Evidence-based medicine has also become part of the modus operandi of veterinary clinicians. Modeling and additive manufacturing can provide individualized or customized therapeutic solutions to support the management of companion animals with complex medical problems. The use of metal additive manufacturing is increasing in veterinary orthopedic surgery. This review describes and discusses current and potential applications of metal additive manufacturing in veterinary orthopedic surgery.

  17. Computer aided reliability, availability, and safety modeling for fault-tolerant computer systems with commentary on the HARP program

    NASA Technical Reports Server (NTRS)

    Shooman, Martin L.

    1991-01-01

    Many of the most challenging reliability problems of our present decade involve complex distributed systems such as interconnected telephone switching computers, air traffic control centers, aircraft and space vehicles, and local area and wide area computer networks. In addition to the challenge of complexity, modern fault-tolerant computer systems require very high levels of reliability, e.g., avionic computers with MTTF goals of one billion hours. Most analysts find that it is too difficult to model such complex systems without computer aided design programs. In response to this need, NASA has developed a suite of computer aided reliability modeling programs beginning with CARE 3 and including a group of new programs such as: HARP, HARP-PC, Reliability Analysts Workbench (Combination of model solvers SURE, STEM, PAWS, and common front-end model ASSIST), and the Fault Tree Compiler. The HARP program is studied and how well the user can model systems using this program is investigated. One of the important objectives will be to study how user friendly this program is, e.g., how easy it is to model the system, provide the input information, and interpret the results. The experiences of the author and his graduate students who used HARP in two graduate courses are described. Some brief comparisons were made with the ARIES program which the students also used. Theoretical studies of the modeling techniques used in HARP are also included. Of course no answer can be any more accurate than the fidelity of the model, thus an Appendix is included which discusses modeling accuracy. A broad viewpoint is taken and all problems which occurred in the use of HARP are discussed. Such problems include: computer system problems, installation manual problems, user manual problems, program inconsistencies, program limitations, confusing notation, long run times, accuracy problems, etc.

  18. Incorporating Auditory Models in Speech/Audio Applications

    NASA Astrophysics Data System (ADS)

    Krishnamoorthi, Harish

    2011-12-01

    Following the success in incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve minimization of an objective function that directly/indirectly incorporates properties of human perception. This dissertation primarily investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes possible solutions to overcome high complexity issues for use in real-time speech/audio algorithms. Specific problems addressed in this dissertation include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics, 2) the development of a mapping scheme that allows synthesizing a time/frequency domain representation from its equivalent auditory model output. The first problem is aimed at addressing the high computational complexity involved in solving perceptual objective functions that require repeated application of auditory model for evaluation of different candidate solutions. In this dissertation, a frequency pruning and a detector pruning algorithm is developed that efficiently implements the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to 80-90 % reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for use with sinusoidal signals and employs the proposed auditory pattern combining technique together with a look-up table to store representative auditory patterns. The second problem obtains an estimate of the auditory representation that minimizes a perceptual objective function and transforms the auditory pattern back to its equivalent time/frequency representation. This avoids the repeated application of auditory model stages to test different candidate time/frequency vectors in minimizing perceptual objective functions. In this dissertation, a constrained mapping scheme is developed by linearizing certain auditory model stages that ensures obtaining a time/frequency mapping corresponding to the estimated auditory representation. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.
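
    As an assumed, much-reduced illustration of the frequency/detector pruning idea (not the dissertation's actual auditory model): when summing per-band specific loudness, bands whose excitation is negligible relative to the strongest band can be skipped with little effect on the total. The compressive exponent and threshold below are placeholders.

        import numpy as np

        def pruned_loudness(excitation, rel_threshold=1e-3):
            # Sum a toy compressive specific-loudness transform only over bands
            # whose excitation exceeds a fraction of the strongest band.
            specific = excitation ** 0.23
            mask = excitation >= rel_threshold * excitation.max()
            return specific[mask].sum(), int(mask.sum())

        excitation = np.abs(np.random.default_rng(3).normal(size=64)) ** 2
        loud, bands_used = pruned_loudness(excitation)
        print(loud, "using", bands_used, "of 64 bands")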

  19. Psychosocial dimensions of solving an indoor air problem.

    PubMed

    Lahtinen, Marjaana; Huuhtanen, Pekka; Kähkönen, Erkki; Reijula, Kari

    2002-03-01

    This investigation focuses on the psychological and social dimensions of managing and solving indoor air problems. The data were collected in nine workplaces by interviews (n = 85) and questionnaires (n = 375). Indoor air problems in office environments have traditionally been addressed through industrial hygiene or other technical expertise. However, indoor air problems at workplaces are often more complex issues to solve. Technical questions are inter-related with the dynamics of the work community, and the cooperation and interaction skills of the parties involved in the solving process are also put to the test. In the present study, the interviewees were very critical of the process of solving the indoor air problem. The responsibility for coordinating the problem-managing process was generally considered vague, as were the roles and functions of the various parties. Communication problems occurred and rumors about the indoor air problem circulated widely. Conflicts were common, complicating the process in several ways. The research focused on examining different ways of managing and resolving an indoor air problem. In addition, reference material on the causal factors of the indoor air problem was also acquired. The study supported the hypothesis that psychosocial factors play a significant role in indoor air problems.

  20. "What constitutes a 'problem'?" Producing 'alcohol problems' through online counselling encounters.

    PubMed

    Savic, Michael; Ferguson, Nyssa; Manning, Victoria; Bathish, Ramez; Lubman, Dan I

    2017-08-01

    Typically, health policy, practice and research views alcohol and other drug (AOD) 'problems' as objective things waiting to be detected, diagnosed and treated. However, this approach to policy development and treatment downplays the role of clinical practices, tools, discourses, and systems in shaping how AOD use is constituted as a 'problem'. For instance, people might present to AOD treatment with multiple psycho-social concerns, but usually only a singular AOD-associated 'problem' is considered serviceable. As the assumed nature of 'the serviceable problem' influences what treatment responses people receive, and how they may come to be enacted as 'addicted' or 'normal' subjects, it is important to subject clinical practices of problem formulation to critical analysis. Given that the reach of AOD treatment has expanded via the online medium, in this article we examine how 'problems' are produced in online alcohol counselling encounters involving people aged 55 and over. Drawing on poststructural approaches to problematisation, we not only trace how and what 'problems' are produced, but also what effects these give rise to. We discuss three approaches to problem formulation: (1) Addiction discourses at work; (2) Moving between concerns and alcohol 'problems'; (3) Making 'problems' complex and multiple. On the basis of this analysis, we argue that online AOD counselling does not just respond to pre-existing 'AOD problems'. Rather, through the social and clinical practices of formulation at work in clinical encounters, online counselling also produces them. Thus, given a different set of circumstances, practices and relations, 'problems' might be defined or emerge differently-perhaps not as 'problems' at all or perhaps as different kinds of concerns. We conclude by highlighting the need for a critical reflexivity in AOD treatment and policy in order to open up possibilities for different ways of engaging with, and responding to, people's needs in their complexity. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Self-Directed Cooperative Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Zilberstein, Shlomo; Morris, Robert (Technical Monitor)

    2003-01-01

    The project is concerned with the development of decision-theoretic techniques to optimize the scientific return of planetary rovers. Planetary rovers are small unmanned vehicles equipped with cameras and a variety of sensors used for scientific experiments. They must operate under tight constraints over such resources as operation time, power, storage capacity, and communication bandwidth. Moreover, the limited computational resources of the rover limit the complexity of on-line planning and scheduling. We have developed a comprehensive solution to this problem that involves high-level tools to describe a mission; a compiler that maps a mission description and additional probabilistic models of the components of the rover into a Markov decision problem; and algorithms for solving the rover control problem that are sensitive to the limited computational resources and high-level of uncertainty in this domain.
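
    A minimal sketch of the kind of Markov decision problem such a compiler might emit, solved by standard value iteration; the states, actions, transition probabilities and rewards below are invented for illustration and do not correspond to the project's actual mission models or rover components.

        import numpy as np

        # States: 0 = at science target, 1 = imaging done, 2 = battery depleted (absorbing).
        # Actions: 0 = take image, 1 = sleep/recharge.
        P = np.array([
            [[0.1, 0.7, 0.2],    # action 0: transition probabilities from each state
             [0.0, 0.8, 0.2],
             [0.0, 0.0, 1.0]],
            [[0.9, 0.0, 0.1],    # action 1: recharging lowers the risk of depletion
             [0.0, 0.9, 0.1],
             [0.0, 0.0, 1.0]],
        ])
        R = np.array([           # expected immediate science reward, indexed [action, state]
            [1.0, 0.2, 0.0],
            [0.0, 0.0, 0.0],
        ])
        gamma = 0.95

        V = np.zeros(3)
        for _ in range(500):                 # value iteration to (near) fixed point
            Q = R + gamma * (P @ V)          # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
            V = Q.max(axis=0)
        policy = Q.argmax(axis=0)            # greedy action in each state
        print(V, policy)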

  2. Mesoscale modeling: solving complex flows in biology and biotechnology.

    PubMed

    Mills, Zachary Grant; Mao, Wenbin; Alexeev, Alexander

    2013-07-01

    Fluids are involved in practically all physiological activities of living organisms. However, biological and biorelated flows are hard to analyze due to the inherent combination of interdependent effects and processes that occur on a multitude of spatial and temporal scales. Recent advances in mesoscale simulations enable researchers to tackle problems that are central for the understanding of such flows. Furthermore, computational modeling effectively facilitates the development of novel therapeutic approaches. Among other methods, dissipative particle dynamics and the lattice Boltzmann method have become increasingly popular during recent years due to their ability to solve a large variety of problems. In this review, we discuss recent applications of these mesoscale methods to several fluid-related problems in medicine, bioengineering, and biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Development of a change management system

    NASA Technical Reports Server (NTRS)

    Parks, Cathy Bonifas

    1993-01-01

    The complexity and interdependence of software on a computer system can create a situation where a solution to one problem causes failures in dependent software. In the computer industry, software problems arise and are often solved with 'quick and dirty' solutions. But in implementing these solutions, documentation about the solution or user notification of changes is often overlooked, and new problems are frequently introduced because of insufficient review or testing. These problems increase when numerous heterogeneous systems are involved. Because of this situation, a change management system plays an integral part in the maintenance of any multisystem computing environment. At the NASA Ames Advanced Computational Facility (ACF), the Online Change Management System (OCMS) was designed and developed to manage the changes being applied to its multivendor computing environment. This paper documents the research, design, and modifications that went into the development of this change management system (CMS).

  4. Evolutionary fuzzy modeling human diagnostic decisions.

    PubMed

    Peña-Reyes, Carlos Andrés

    2004-05-01

    Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process, while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems both of high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool-called COBRA-for aiding radiologists in mammography interpretation.

  5. A Three-Dimensional Finite-Element Model for Simulating Water Flow in Variably Saturated Porous Media

    NASA Astrophysics Data System (ADS)

    Huyakorn, Peter S.; Springer, Everett P.; Guvanasen, Varut; Wadsworth, Terry D.

    1986-12-01

    A three-dimensional finite-element model for simulating water flow in variably saturated porous media is presented. The model formulation is general and capable of accommodating complex boundary conditions associated with seepage faces and infiltration or evaporation on the soil surface. Included in this formulation is an improved Picard algorithm designed to cope with severely nonlinear soil moisture relations. The algorithm is formulated for both rectangular and triangular prism elements. The element matrices are evaluated using an "influence coefficient" technique that avoids costly numerical integration. Spatial discretization of a three-dimensional region is performed using a vertical slicing approach designed to accommodate complex geometry with irregular boundaries, layering, and/or lateral discontinuities. Matrix solution is achieved using a slice successive overrelaxation scheme that permits a fairly large number of nodal unknowns (on the order of several thousand) to be handled efficiently on small minicomputers. Six examples are presented to verify and demonstrate the utility of the proposed finite-element model. The first four examples concern one- and two-dimensional flow problems used as sample problems to benchmark the code. The remaining examples concern three-dimensional problems. These problems are used to illustrate the performance of the proposed algorithm in three-dimensional situations involving seepage faces and anisotropic soil media.
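
    The improved Picard algorithm itself is not specified in the abstract; as a generic, assumed illustration of Picard (successive substitution) iteration for a nonlinear system A(u)u = b, the sketch below freezes the coefficient matrix at the previous iterate, re-solves, and under-relaxes the update until convergence. The toy nonlinearity stands in, loosely, for a solution-dependent conductivity.

        import numpy as np

        def picard_solve(assemble, b, u0, omega=0.7, tol=1e-8, max_it=200):
            # Solve A(u) u = b by Picard iteration with under-relaxation factor omega.
            u = u0.copy()
            for it in range(max_it):
                A = assemble(u)                              # coefficients frozen at current iterate
                u_new = np.linalg.solve(A, b)
                u_next = (1.0 - omega) * u + omega * u_new   # damped update
                if np.linalg.norm(u_next - u) < tol:
                    return u_next, it
                u = u_next
            return u, max_it

        assemble = lambda u: np.diag(1.0 + u ** 2)   # toy solution-dependent coefficient matrix
        b = np.array([1.0, 2.0, 3.0])
        u, iters = picard_solve(assemble, b, u0=np.ones(3))
        print(u, iters)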

  6. The complexity of patient safety reporting systems in UK dentistry.

    PubMed

    Renton, T; Master, S

    2016-10-21

    Since the 'Francis Report', UK regulation focusing on patient safety has significantly changed. Healthcare workers are increasingly involved in NHS England patient safety initiatives aimed at improving reporting and learning from patient safety incidents (PSIs). Unfortunately, dentistry remains 'isolated' from these main events and continues to have a poor record for reporting and learning from PSIs and other events, thus limiting improvement of patient safety in dentistry. The reasons for this situation are complex. This paper provides a review of the complexities of the existing systems and procedures in relation to patient safety in dentistry. It highlights the conflicting advice which is available and which further complicates an overly burdensome process. Recommendations are made to address these problems with systems and procedures supporting patient safety development in dentistry.

  7. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.

  8. A complex endeavour: an ethnographic study of the implementation of the Sepsis Six clinical care bundle.

    PubMed

    Tarrant, Carolyn; O'Donnell, Barbara; Martin, Graham; Bion, Julian; Hunter, Alison; Rooney, Kevin D

    2016-11-16

    Implementation of the 'Sepsis Six' clinical care bundle within an hour of recognition of sepsis is recommended as an approach to reduce mortality in patients with sepsis, but achieving reliable delivery of the bundle has proved challenging. There remains little understanding of the barriers to reliable implementation of bundle components. We examined frontline clinical practice in implementing the Sepsis Six. We conducted an ethnographic study in six hospitals participating in the Scottish Patient Safety Programme Sepsis collaborative. We conducted around 300 h of non-participant observation in emergency departments, acute medical receiving units and medical and surgical wards. We interviewed a purposive sample of 43 members of hospital staff. Data were analysed using a constant comparative approach. Implementation strategies to promote reliable use of the Sepsis Six primarily focused on education, engaging and motivating staff, and providing prompts for behaviour, along with efforts to ensure that equipment required was readily available. Although these strategies were successful in raising staff awareness of sepsis and engagement with implementation, our study identified that completing the bundle within an hour was not straightforward. Our emergent theory suggested that rather than being an apparently simple sequence of six steps, the Sepsis Six actually involved a complex trajectory comprising multiple interdependent tasks that required prioritisation and scheduling, and which was prone to problems of coordination and operational failures. Interventions that involved allocating specific roles and responsibilities for completing the Sepsis Six in ways that reduced the need for coordination and task switching, and the use of process mapping to identify system failures along the trajectory, could help mitigate against some of these problems. Implementation efforts that focus on individual behaviour change to improve uptake of the Sepsis Six should be supplemented by an understanding of the bundle as a complex trajectory of work in which improving reliability requires attention to coordination of workflow, as well as addressing the mundane problems of interruptions and operational failures that obstruct task completion.

  9. Discrete Regularization for Calibration of Geologic Facies Against Dynamic Flow Data

    NASA Astrophysics Data System (ADS)

    Khaninezhad, Mohammad-Reza; Golmohammadi, Azarang; Jafarpour, Behnam

    2018-04-01

    Subsurface flow model calibration involves many more unknowns than measurements, leading to ill-posed problems with nonunique solutions. To alleviate nonuniqueness, the problem is regularized by constraining the solution space using prior knowledge. In certain sedimentary environments, such as fluvial systems, the contrast in hydraulic properties of different facies types tends to dominate the flow and transport behavior, making the effect of within-facies heterogeneity less significant. Hence, flow model calibration in those formations reduces to delineating the spatial structure and connectivity of different lithofacies types and their boundaries. A major difficulty in calibrating such models is honoring the discrete, or piecewise constant, nature of facies distribution. The problem becomes more challenging when complex spatial connectivity patterns with higher-order statistics are involved. This paper introduces a novel formulation for calibration of complex geologic facies by imposing appropriate constraints to recover plausible solutions that honor the spatial connectivity and discreteness of facies models. To incorporate prior connectivity patterns, plausible geologic features are learned from available training models. This is achieved by learning spatial patterns from training data, e.g., k-SVD sparse learning or traditional Principal Component Analysis. Discrete regularization is introduced as a penalty function to impose solution discreteness while minimizing the mismatch between observed and predicted data. An efficient gradient-based alternating directions algorithm is combined with variable splitting to minimize the resulting regularized nonlinear least squares objective function. Numerical results show that imposing learned facies connectivity and discreteness as regularization functions leads to geologically consistent solutions that improve facies calibration quality.
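
    A minimal sketch (not the authors' formulation) of the kind of objective such a calibration minimizes, assuming a linear forward operator G, two facies property values k1 and k2, and a quadratic penalty that vanishes only on discrete models; all names and values are illustrative.

        import numpy as np

        def facies_objective(m, G, d_obs, k1, k2, lam):
            # Data misfit plus a discreteness penalty that is zero only when
            # every cell takes one of the two facies values k1 or k2.
            misfit = 0.5 * np.sum((G @ m - d_obs) ** 2)
            discreteness = lam * np.sum(((m - k1) * (m - k2)) ** 2)
            return misfit + discreteness

        # Illustrative use with a random linear forward model.
        rng = np.random.default_rng(0)
        G = rng.normal(size=(20, 50))                      # hypothetical sensitivity matrix
        m_true = np.where(rng.random(50) > 0.5, 2.0, 0.5)  # a perfectly discrete model
        d_obs = G @ m_true
        print(facies_objective(m_true, G, d_obs, k1=0.5, k2=2.0, lam=10.0))  # ~0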

  10. Good practices in managing work-related indoor air problems: a psychosocial perspective.

    PubMed

    Lahtinen, Marjaana; Huuhtanen, Pekka; Vähämäki, Kari; Kähkönen, Erkki; Mussalo-Rauhamaa, Helena; Reijula, Kari

    2004-07-01

    Indoor air problems at workplaces are often exceedingly complex. Technical questions are interrelated with the dynamics of the work community, and the cooperation and interaction skills of the parties involved in the problem solving process are also put to the test. The objective of our study was to analyze the process of managing and solving indoor air problems from a psychosocial perspective. This collective case study was based on data from questionnaires, interviews and various documentary materials. Technical inspections of the buildings and indoor air measurements were also carried out. The following four factors best differentiated successful cases from impeded cases: extensive multiprofessional collaboration and participative action, systematic action and perseverance, investment in information and communication, and process thinking and learning. The study also proposed a theoretical model for the role of the psychosocial work environment in indoor air problems. The expertise related to social and human aspects of problem solving plays a significant role in solving indoor air problems. Failures to properly handle these aspects may lead to resources being wasted and result in a problematic situation becoming stagnant or worse. Copyright 2004 Wiley-Liss, Inc.

  11. Visualizing Uncertainty for Data Fusion Graphics: Review of Selected Literature and Industry Approaches

    DTIC Science & Technology

    2015-06-09

    anomaly detection, which is generally considered part of high-level information fusion (HLIF) involving temporal-geospatial data as well as meta-data... Anomaly detection in the Maritime defence and security domain typically focusses on trying to identify vessels that are behaving in an unusual...manner compared with lawful vessels operating in the area – an applied case of target detection among distractors. Anomaly detection is a complex problem

  12. Modeling Complex Dynamic Interactions of Nonlinear, Aeroelastic, Multistage, and Localization Phenomena in Turbine Engines

    DTIC Science & Technology

    2011-02-25

    fast method of predicting the number of iterations needed for converged results. A new hybrid technique is proposed to predict the convergence history...interchanging between the modes, whereas a smaller veering (or crossing) region shows fast mode switching. Then, the nonlinear vibration response of the...problems of interest involve dynamic (fast) crack propagation, then the nodes selected by the proposed approach at some time instant might not

  13. Ultrastructure Processing of Advanced Materials.

    DTIC Science & Technology

    1992-11-01

    alkoxide) involving the sodium and the other metal [e.g., NaZr2(OR)9]. The use of anhydrous ammonia usually solves this problem. MCIX + xNH3 + xROH - M...the formation of pentacoordinate silicic acid complexes with hydroxide and fluoride ions, as well as neutral adducts with hydrogen fluoride, ammonia ...stable than that for any other small neutral adduct such as water, ammonia, and hydrogen chloride. Elimination of water is much easier by internal

  14. Improved result on stability analysis of discrete stochastic neural networks with time delay

    NASA Astrophysics Data System (ADS)

    Wu, Zhengguang; Su, Hongye; Chu, Jian; Zhou, Wuneng

    2009-04-01

    This Letter investigates the problem of exponential stability for discrete stochastic time-delay neural networks. By defining a novel Lyapunov functional, an improved delay-dependent exponential stability criterion is established in terms of the linear matrix inequality (LMI) approach. Meanwhile, the computational complexity of the newly established stability condition is reduced because fewer variables are involved. A numerical example is given to illustrate the effectiveness and the benefits of the proposed method.
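
    The Letter's delay-dependent criterion is more elaborate, but the underlying idea of certifying stability through an LMI feasibility problem can be sketched as follows; the system matrix is illustrative and cvxpy is assumed only as a convenient LMI solver, not as the tool used in the Letter.

        import cvxpy as cp
        import numpy as np

        # Basic idea only (not the delay-dependent criterion of the Letter):
        # the discrete-time system x[t+1] = A x[t] is exponentially stable if
        # the LMI  A'PA - P < 0,  P > 0  is feasible for some symmetric P.
        A = np.array([[0.5, 0.1], [-0.2, 0.7]])   # illustrative system matrix
        n = A.shape[0]
        P = cp.Variable((n, n), symmetric=True)
        eps = 1e-6
        constraints = [P >> eps * np.eye(n),
                       A.T @ P @ A - P << -eps * np.eye(n)]
        prob = cp.Problem(cp.Minimize(0), constraints)
        prob.solve()
        print("LMI feasible (stable):", prob.status == "optimal")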

  15. Simulating variable source problems via post processing of individual particle tallies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.

    2000-10-20

    Monte Carlo is an extremely powerful method of simulating complex, three dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
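
    A rough sketch of the post-processing idea under simplifying assumptions: if each recorded particle's tally contribution and sampled source energy are kept, a new source spectrum can be evaluated by re-weighting the recorded contributions with the ratio of source densities instead of rerunning transport. All distributions, names, and numbers below are illustrative, not the study's implementation.

        import numpy as np

        def reweighted_tally(tally_contrib, src_energy, baseline_pdf, new_pdf):
            # Re-weight each recorded particle's contribution by the ratio of
            # the new to the baseline source density at its sampled energy.
            w = new_pdf(src_energy) / baseline_pdf(src_energy)
            return np.sum(w * tally_contrib) / len(tally_contrib)

        # Illustrative use: particles sampled from a flat 0-10 MeV baseline source.
        rng = np.random.default_rng(1)
        src_energy = rng.uniform(0.0, 10.0, size=100_000)
        tally_contrib = np.exp(-0.3 * src_energy)   # stand-in for recorded tallies
        flat = lambda e: np.full_like(e, 1.0 / 10.0)
        peaked = lambda e: np.exp(-(e - 2.0) ** 2) / np.sqrt(np.pi)  # new source shape
        print(reweighted_tally(tally_contrib, src_energy, flat, peaked))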

  16. Sensitivity analysis and approximation methods for general eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Murthy, D. V.; Haftka, R. T.

    1986-01-01

    Optimization of dynamic systems involving complex non-Hermitian matrices is often computationally expensive. Major contributors to the computational expense are the sensitivity analysis and reanalysis of a modified design. The present work seeks to alleviate this computational burden by identifying efficient sensitivity analysis and approximate reanalysis methods. For the algebraic eigenvalue problem involving non-Hermitian matrices, algorithms for sensitivity analysis and approximate reanalysis are classified, compared and evaluated for efficiency and accuracy. Proper eigenvector normalization is discussed. An improved method for calculating derivatives of eigenvectors is proposed based on a more rational normalization condition and taking advantage of matrix sparsity. Important numerical aspects of this method are also discussed. To alleviate the problem of reanalysis, various approximation methods for eigenvalues are proposed and evaluated. Linear and quadratic approximations are based directly on the Taylor series. Several approximation methods are developed based on the generalized Rayleigh quotient for the eigenvalue problem. Approximation methods based on the trace theorem give high accuracy without needing any derivatives. Operation counts for the computation of the approximations are given. General recommendations are made for the selection of an appropriate approximation technique as a function of the matrix size, number of design variables, number of eigenvalues of interest and the number of design points at which approximation is sought.
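
    One of the reanalysis ideas, the generalized Rayleigh quotient, can be sketched as follows for a non-Hermitian matrix: an eigenvalue of the modified matrix is approximated from the unchanged right and left eigenvectors of the baseline matrix, avoiding a fresh eigendecomposition. This is an illustrative sketch, not the paper's implementation; the matrices are random stand-ins.

        import numpy as np

        def rayleigh_reanalysis(A_new, x, y):
            # Generalized Rayleigh quotient y^T A x / (y^T x) using baseline
            # right (x) and left (y) eigenvectors as fixed approximations.
            return (y @ A_new @ x) / (y @ x)

        rng = np.random.default_rng(2)
        A0 = rng.normal(size=(6, 6))                  # baseline non-Hermitian matrix
        w, V = np.linalg.eig(A0)                      # right eigenvectors
        wl, U = np.linalg.eig(A0.T)                   # columns of U are left eigenvectors
        # Match left/right pairs by eigenvalue (assumes distinct eigenvalues).
        U = U[:, [np.argmin(np.abs(wl - lam)) for lam in w]]
        A_new = A0 + 0.01 * rng.normal(size=(6, 6))   # small design modification
        approx = rayleigh_reanalysis(A_new, V[:, 0], U[:, 0])
        exact = np.linalg.eigvals(A_new)
        print(approx, exact[np.argmin(np.abs(exact - approx))])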

  17. Composition of Web Services Using Markov Decision Processes and Dynamic Programming

    PubMed Central

    Uc-Cetina, Víctor; Moo-Mena, Francisco; Hernandez-Ucan, Rafael

    2015-01-01

    We propose a Markov decision process model for solving the Web service composition (WSC) problem. Iterative policy evaluation, value iteration, and policy iteration algorithms are used to experimentally validate our approach, with artificial and real data. The experimental results show the reliability of the model and the methods employed, with policy iteration being the best one in terms of the minimum number of iterations needed to estimate an optimal policy, with the highest Quality of Service attributes. Our experimental work shows that a WSC problem involving a set of 100,000 individual Web services, in which a valid composition requires the selection of 1,000 services from the available set, can be solved in the worst case in less than 200 seconds, using an Intel Core i5 computer with 6 GB RAM. Moreover, a real WSC problem involving only 7 individual Web services requires less than 0.08 seconds, using the same computational power. Finally, a comparison with two popular reinforcement learning algorithms, sarsa and Q-learning, shows that these algorithms require one to two orders of magnitude more time than policy iteration, iterative policy evaluation, and value iteration to handle WSC problems of the same complexity. PMID:25874247
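
    A minimal, generic value-iteration sketch (not the paper's WSC-specific state and action encoding); the transition and reward arrays below are illustrative toy data.

        import numpy as np

        def value_iteration(P, R, gamma=0.95, tol=1e-8):
            # Generic value iteration: P[a] is the transition matrix for action a,
            # R[s, a] the expected reward. Returns optimal values and a greedy policy.
            n_states, n_actions = R.shape
            V = np.zeros(n_states)
            while True:
                Q = np.array([R[:, a] + gamma * P[a] @ V for a in range(n_actions)]).T
                V_new = Q.max(axis=1)
                if np.max(np.abs(V_new - V)) < tol:
                    return V_new, Q.argmax(axis=1)
                V = V_new

        # Tiny illustrative problem (3 states, 2 actions), not a real WSC instance.
        P = np.array([[[0.9, 0.1, 0.0], [0.0, 0.9, 0.1], [0.0, 0.0, 1.0]],
                      [[0.1, 0.9, 0.0], [0.0, 0.1, 0.9], [0.0, 0.0, 1.0]]])
        R = np.array([[0.0, 1.0], [0.0, 1.0], [0.0, 0.0]])
        print(value_iteration(P, R))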

  18. Automatic detection of artifacts in converted S3D video

    NASA Astrophysics Data System (ADS)

    Bokov, Alexander; Vatolin, Dmitriy; Zachesov, Anton; Belous, Alexander; Erofeev, Mikhail

    2014-03-01

    In this paper we present algorithms for automatically detecting issues specific to converted S3D content. When a depth-image-based rendering approach produces a stereoscopic image, the quality of the result depends on both the depth maps and the warping algorithms. The most common problem with converted S3D video is edge-sharpness mismatch. This artifact may appear owing to depth-map blurriness at semitransparent edges: after warping, the object boundary becomes sharper in one view and blurrier in the other, yielding binocular rivalry. To detect this problem we estimate the disparity map, extract boundaries with noticeable differences, and analyze edge-sharpness correspondence between views. We pay additional attention to cases involving a complex background and large occlusions. Another problem is detection of scenes that lack depth volume: we present algorithms for detecting flat scenes and scenes with flat foreground objects. To identify these problems we analyze the features of the RGB image as well as uniform areas in the depth map. Testing of our algorithms involved examining 10 Blu-ray 3D releases with converted S3D content, including Clash of the Titans, The Avengers, and The Chronicles of Narnia: The Voyage of the Dawn Treader. The algorithms we present enable improved automatic quality assessment during the production stage.
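
    The paper's detector is more involved, but the core idea of comparing edge sharpness between views can be sketched as follows; the sharpness proxy (peak gradient magnitude), the ratio threshold, and the synthetic patches are illustrative assumptions, not the authors' algorithm.

        import numpy as np

        def edge_sharpness(patch):
            # Peak gradient magnitude over a boundary patch (a crude sharpness proxy).
            gy, gx = np.gradient(patch.astype(float))
            return np.hypot(gx, gy).max()

        def sharpness_mismatch(left_patch, right_patch, thresh=1.5):
            # Flag binocular-rivalry risk when the same object boundary is much
            # sharper in one view than in the other (illustrative criterion only).
            sl, sr = edge_sharpness(left_patch), edge_sharpness(right_patch)
            ratio = max(sl, sr) / max(min(sl, sr), 1e-9)
            return ratio > thresh, ratio

        # Synthetic example: a sharp step edge versus a blurred version of it.
        x = np.linspace(-1.0, 1.0, 32)
        sharp = (x[None, :] > 0).astype(float).repeat(32, axis=0)
        blurred = (1.0 / (1.0 + np.exp(-x[None, :] / 0.2))).repeat(32, axis=0)
        print(sharpness_mismatch(sharp, blurred))   # (True, ratio >> 1)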

  19. Projected Regression Methods for Inverting Fredholm Integrals: Formalism and Application to Analytical Continuation

    NASA Astrophysics Data System (ADS)

    Arsenault, Louis-Francois; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.

    We present a machine learning-based statistical regression approach to the inversion of Fredholm integrals of the first kind by studying an important example for the quantum materials community, the analytical continuation problem of quantum many-body physics. It involves reconstructing the frequency dependence of physical excitation spectra from data obtained at specific points in the complex frequency plane. The approach provides a natural regularization in cases where the inverse of the Fredholm kernel is ill-conditioned and yields robust error metrics. The stability of the forward problem permits the construction of a large database of input-output pairs. Machine learning methods applied to this database generate approximate solutions which are projected onto the subspace of functions satisfying relevant constraints. We show that for low input noise the method performs as well as or better than Maximum Entropy (MaxEnt) under standard error metrics, and is substantially more robust to noise. We expect the methodology to be similarly effective for any problem involving a formally ill-conditioned inversion, provided that the forward problem can be efficiently solved. AJM was supported by the Office of Science of the U.S. Department of Energy under Subcontract No. 3F-3138 and LFA by the Columbia University IDS-ROADS project, UR009033-05, which also provided partial support to RN and LH.
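
    A toy stand-in for the workflow, assuming an illustrative Gaussian smoothing kernel rather than the analytic-continuation kernel: simulate forward-problem pairs, fit a ridge regression from data to spectrum, then project the prediction onto nonnegative, unit-normalized functions. Names, sizes, and noise levels are illustrative only.

        import numpy as np
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(3)
        x = np.linspace(0, 1, 60)
        K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.01)   # ill-conditioned kernel

        def random_spectrum():
            # Random few-peak spectrum, normalized to unit area.
            c = rng.uniform(0.2, 0.8, size=3)
            w = rng.uniform(0.01, 0.05, size=3)
            s = np.sum([np.exp(-(x - ci) ** 2 / wi) for ci, wi in zip(c, w)], axis=0)
            return s / np.trapz(s, x)

        S = np.array([random_spectrum() for _ in range(2000)])      # training spectra
        D = S @ K.T + 1e-4 * rng.normal(size=(2000, 60))            # noisy forward data
        model = Ridge(alpha=1e-3).fit(D, S)                         # learn data -> spectrum

        s_true = random_spectrum()
        d_obs = K @ s_true + 1e-4 * rng.normal(size=60)
        s_hat = model.predict(d_obs[None, :])[0]
        s_hat = np.clip(s_hat, 0, None)                             # project: nonnegativity
        s_hat /= np.trapz(s_hat, x)                                 # project: normalization
        print(np.max(np.abs(s_hat - s_true)))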

  20. Genome-wide detection of intervals of genetic heterogeneity associated with complex traits

    PubMed Central

    Llinares-López, Felipe; Grimm, Dominik G.; Bodenham, Dean A.; Gieraths, Udo; Sugiyama, Mahito; Rowan, Beth; Borgwardt, Karsten

    2015-01-01

    Motivation: Genetic heterogeneity, the fact that several sequence variants give rise to the same phenotype, is a phenomenon that is of the utmost interest in the analysis of complex phenotypes. Current approaches for finding regions in the genome that exhibit genetic heterogeneity suffer from at least one of two shortcomings: (i) they require the definition of an exact interval in the genome that is to be tested for genetic heterogeneity, potentially missing intervals of high relevance, or (ii) they suffer from an enormous multiple hypothesis testing problem due to the large number of potential candidate intervals being tested, which results in either many false positives or a lack of power to detect true intervals. Results: Here, we present an approach that overcomes both problems: it allows one to automatically find all contiguous sequences of single nucleotide polymorphisms in the genome that are jointly associated with the phenotype. It also solves both the inherent computational efficiency problem and the statistical problem of multiple hypothesis testing, which are both caused by the huge number of candidate intervals. We demonstrate on Arabidopsis thaliana genome-wide association study data that our approach can discover regions that exhibit genetic heterogeneity and would be missed by single-locus mapping. Conclusions: Our novel approach can contribute to the genome-wide discovery of intervals that are involved in the genetic heterogeneity underlying complex phenotypes. Availability and implementation: The code can be obtained at: http://www.bsse.ethz.ch/mlcb/research/bioinformatics-and-computational-biology/sis.html. Contact: felipe.llinares@bsse.ethz.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072488

  1. The problem of complex eigensystems in the semianalytical solution for advancement of time in solute transport simulations: a new method using real arithmetic

    USGS Publications Warehouse

    Umari, Amjad M.J.; Gorelick, Steven M.

    1986-01-01

    In the numerical modeling of groundwater solute transport, explicit solutions may be obtained for the concentration field at any future time without computing concentrations at intermediate times. The spatial variables are discretized and time is left continuous in the governing differential equation. These semianalytical solutions have been presented in the literature and involve the eigensystem of a coefficient matrix. This eigensystem may be complex (i.e., have imaginary components) due to the asymmetry created by the advection term in the governing advection-dispersion equation. Previous investigators have either used complex arithmetic to represent a complex eigensystem or chosen large dispersivity values for which the imaginary components of the complex eigenvalues may be ignored without significant error. It is shown here that the error due to ignoring the imaginary components of complex eigenvalues is large for small dispersivity values. A new algorithm that represents the complex eigensystem by converting it to a real eigensystem is presented. The method requires only real arithmetic.
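
    The paper's algorithm is specific to the semianalytical transport solution, but one standard way to keep a complex eigensystem in real arithmetic is the real Schur (block) form, in which each conjugate pair a ± bi appears as a real 2 × 2 block. The sketch below is illustrative and not necessarily the authors' algorithm; the matrix and time step are invented.

        import numpy as np
        from scipy.linalg import schur, expm

        # Real Schur form A = Z T Z^T keeps conjugate eigenpairs as real 2x2
        # blocks, so the time advance of dc/dt = A c uses only real arithmetic.
        A = np.array([[-1.0,  2.0,  0.0],
                      [-2.0, -1.0,  0.0],
                      [ 0.5,  0.0, -0.3]])     # eigenvalues -1 +/- 2i and -0.3
        c0 = np.array([1.0, 0.0, 2.0])
        t = 0.7

        T, Z = schur(A, output="real")          # T is real, block upper triangular
        c_real = Z @ expm(T * t) @ Z.T @ c0     # real arithmetic throughout
        c_ref = expm(A * t) @ c0                # direct reference
        print(np.allclose(c_real, c_ref))       # True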

  2. [Problem-posing as a nutritional education strategy with obese teenagers].

    PubMed

    Rodrigues, Erika Marafon; Boog, Maria Cristina Faber

    2006-05-01

    Obesity is a public health issue with relevant social determinants in its etiology and where interventions with teenagers encounter complex biopsychological conditions. This study evaluated intervention in nutritional education through a problem-posing approach with 22 obese teenagers, treated collectively and individually for eight months. Speech acts were collected through the use of word cards, observer recording, and tape-recording. The study adopted a qualitative methodology, and the approach involved content analysis. Problem-posing facilitated changes in eating behavior, triggering reflections on nutritional practices, family circumstances, social stigma, interaction with health professionals, and religion. Teenagers under individual care posed problems more effectively in relation to eating, while those under collective care posed problems in relation to family and psychological issues, with effective qualitative eating changes in both groups. The intervention helped teenagers understand their life history and determinants of eating behaviors, spontaneously implementing eating changes and making them aware of possibilities for maintaining the new practices and autonomously exercising their role as protagonists in their own health care.

  3. Perturbation solutions of combustion instability problems

    NASA Technical Reports Server (NTRS)

    Googerdy, A.; Peddieson, J., Jr.; Ventrice, M.

    1979-01-01

    A method involving approximate modal analysis using the Galerkin method followed by an approximate solution of the resulting modal-amplitude equations by the two-variable perturbation method (method of multiple scales) is applied to two problems of pressure-sensitive nonlinear combustion instability in liquid-fuel rocket motors. One problem exhibits self-coupled instability while the other exhibits mode-coupled instability. In both cases it is possible to carry out the entire linear stability analysis and significant portions of the nonlinear stability analysis in closed form. In the problem of self-coupled instability the nonlinear stability boundary and approximate forms of the limit-cycle amplitudes and growth and decay rates are determined in closed form while the exact limit-cycle amplitudes and growth and decay rates are found numerically. In the problem of mode-coupled instability the limit-cycle amplitudes are found in closed form while the growth and decay rates are found numerically. The behavior of the solutions found by the perturbation method is in agreement with solutions obtained using complex numerical methods.

  4. COMPREHENSIVE ASSESSMENT OF COMPLEX TECHNOLOGIES: INTEGRATING VARIOUS ASPECTS IN HEALTH TECHNOLOGY ASSESSMENT.

    PubMed

    Lysdahl, Kristin Bakke; Mozygemba, Kati; Burns, Jacob; Brönneke, Jan Benedikt; Chilcott, James B; Ward, Sue; Hofmann, Bjørn

    2017-01-01

    Despite recent development of health technology assessment (HTA) methods, there are still methodological gaps for the assessment of complex health technologies. The INTEGRATE-HTA guidance for effectiveness, economic, ethical, socio-cultural, and legal aspects deals with challenges when assessing complex technologies, such as heterogeneous study designs, multiple stakeholder perspectives, and unpredictable outcomes. The objective of this article is to outline this guidance and describe the added value of integrating these assessment aspects. Different methods were used to develop the various parts of the guidance, but all draw on existing, published knowledge and were supported by stakeholder involvement. The guidance was modified after application in a case study and in response to feedback from internal and external reviewers. The guidance consists of five parts, addressing five core aspects of HTA, all presenting stepwise approaches based on the assessment of complexity, context, and stakeholder involvement. The guidance on effectiveness, health economics and ethics aspects focuses on helping users choose appropriate, or further develop, existing methods. The recommendations are based on existing methods' applicability for dealing with problems arising with complex interventions. The guidance offers new frameworks to identify socio-cultural and legal issues, along with overviews of relevant methods and sources. The INTEGRATE-HTA guidance outlines a wide range of methods and facilitates appropriate choices among them. The guidance enables understanding of how complexity matters for HTA and brings together assessments from disciplines, such as epidemiology, economics, ethics, law, and social theory. This indicates relevance for a broad range of technologies.

  5. Complex Problem Solving: What It Is and What It Is Not

    PubMed Central

    Dörner, Dietrich; Funke, Joachim

    2017-01-01

    Computer-simulated scenarios have been part of psychological research on problem solving for more than 40 years. The shift in emphasis from simple toy problems to complex, more real-life oriented problems has been accompanied by discussions about the best ways to assess the process of solving complex problems. Psychometric issues such as reliable assessments and addressing correlations with other instruments have been in the foreground of these discussions and have left the content validity of complex problem solving in the background. In this paper, we return the focus to content issues and address the important features that define complex problems. PMID:28744242

  6. Gentle Masking of Low-Complexity Sequences Improves Homology Search

    PubMed Central

    Frith, Martin C.

    2011-01-01

    Detection of sequences that are homologous, i.e. descended from a common ancestor, is a fundamental task in computational biology. This task is confounded by low-complexity tracts (such as atatatatatat), which arise frequently and independently, causing strong similarities that are not homologies. There has been much research on identifying low-complexity tracts, but little research on how to treat them during homology search. We propose to find homologies by aligning sequences with “gentle” masking of low-complexity tracts. Gentle masking means that the match score involving a masked letter is min(x, 0), where x is the unmasked score. Gentle masking slightly but noticeably improves the sensitivity of homology search (compared to “harsh” masking), without harming specificity. We show examples in three useful homology search problems: detection of NUMTs (nuclear copies of mitochondrial DNA), recruitment of metagenomic DNA reads to reference genomes, and pseudogene detection. Gentle masking is currently the best way to treat low-complexity tracts during homology search. PMID:22205972
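
    A minimal sketch of gentle masking applied to a single match score, assuming the min(x, 0) rule stated above; the lowercase convention for masked letters and the substitution scores are illustrative, not a real scoring matrix.

        def match_score(a, b, score_matrix):
            # Gentle masking: a match involving a lowercase (masked) letter
            # contributes min(s, 0), where s is the unmasked score.
            s = score_matrix[(a.upper(), b.upper())]
            if a.isupper() and b.isupper():
                return s
            return min(s, 0)

        # Illustrative scores (not a real substitution matrix).
        scores = {("A", "A"): 5, ("A", "T"): -4, ("T", "A"): -4, ("T", "T"): 5}
        print(match_score("A", "A", scores))   # 5   (both unmasked)
        print(match_score("a", "A", scores))   # 0   (masked match capped at zero)
        print(match_score("a", "T", scores))   # -4  (mismatches keep their penalty)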

  7. Governing Influence of Thermodynamic and Chemical Equilibria on the Interfacial Properties in Complex Fluids.

    PubMed

    Harikrishnan, A R; Dhar, Purbarun; Gedupudi, Sateesh; Das, Sarit K

    2018-04-12

    We propose a comprehensive analysis and a quasi-analytical mathematical formalism to predict the surface tension and contact angles of complex surfactant-infused nanocolloids. The model rests on the foundations of the interaction potentials for the interfacial adsorption-desorption dynamics in complex multicomponent colloids. Surfactant-infused nanoparticle-laden interface problems are difficult to deal with because of the many-body interactions and interfaces involved at the meso-nanoscales. The model is based on the governing role of thermodynamic and chemical equilibrium parameters in modulating the interfacial energies. The influence of parameters such as the presence of surfactants, nanoparticles, and surfactant-capped nanoparticles on interfacial dynamics is revealed by the analysis. Solely based on the knowledge of interfacial properties of independent surfactant solutions and nanocolloids, the same can be deduced for complex surfactant-based nanocolloids through the proposed approach. The model accurately predicts the equilibrium surface tension and contact angle of complex nanocolloids available in the existing literature and present experimental findings.

  8. Study protocol: a mixed methods study to assess mental health recovery, shared decision-making and quality of life (Plan4Recovery).

    PubMed

    Coffey, Michael; Hannigan, Ben; Meudell, Alan; Hunt, Julian; Fitzsimmons, Deb

    2016-08-17

    Recovery in mental health care is complex, highly individual and can be facilitated by a range of professional and non-professional support. In this study we will examine how recovery from mental health problems is promoted in non-medical settings. We hypothesise a relationship between involvement in decisions about care, social support and recovery and quality of life outcomes. We will use standardised validated instruments of involvement in decision-making, social contacts, recovery and quality of life with a random sample of people accessing non-statutory mental health social care services in Wales. We will add to this important information with detailed one to one case study interviews with people, their family members and their support workers. We will use a series of these interviews to examine how people build recovery over time to help us understand more about their involvement in decisions and the social links they build. We want to see how being involved in decisions about care and the social links people have are related to recovery and quality of life for people with experience of using mental health support services. We want to understand the different perspectives of the people involved in making recovery possible. We will use this information to guide further studies of particular types of social interventions and their use in helping recovery from mental health problems.

  9. A problem-solving approach to effective insulin injection for patients at either end of the body mass index.

    PubMed

    Juip, Micki; Fitzner, Karen

    2012-06-01

    People with diabetes require skills and knowledge to adhere to medication regimens and self-manage this complex disease. Effective self-management is contingent upon effective problem solving and decision making. Gaps existed regarding useful approaches to problem solving by individuals with very low and very high body mass index (BMI) who self-administer insulin injections. This article addresses those gaps by presenting findings from a patient survey, a symposium on the topic of problem solving, and recent interviews with diabetes educators to facilitate problem-solving approaches for people with diabetes with high and low BMI who inject insulin and/or other medications. In practice, problem solving involves problem identification, definition, and specification; goal and barrier identification are a prelude to generating a set of potential strategies for problem resolution and applying these strategies to implement a solution. Teaching techniques, such as site rotation and ensuring that people with diabetes use the appropriate equipment, increase confidence with medication adherence. Medication taking is more effective when people with diabetes are equipped with the knowledge, skills, and problem-solving behaviors to effectively self-manage their injections.

  10. A study of the performance of patients with frontal lobe lesions in a financial planning task.

    PubMed

    Goel, V; Grafman, J; Tajik, J; Gana, S; Danto, D

    1997-10-01

    It has long been argued that patients with lesions in the prefrontal cortex have difficulties in decision making and problem solving in real-world, ill-structured situations, particularly problem types involving planning and look-ahead components. Recently, several researchers have questioned our ability to capture and characterize these deficits adequately using just the standard neuropsychological test batteries, and have called for tests that reflect real-world task requirements more accurately. We present data from 10 patients with focal lesions to the prefrontal cortex and 10 normal control subjects engaged in a real-world financial planning task. We also introduce a theoretical framework and methodology developed in the cognitive science literature for quantifying and analysing the complex data generated by problem-solving tasks. Our findings indicate that patient performance is impoverished at a global level but not at the local level. Patients have difficulty in organizing and structuring their problem space. Once they begin problem solving, they have difficulty in allocating adequate effort to each problem-solving phase. Patients also have difficulty dealing with the fact that there are no right or wrong answers nor official termination points in real-world planning problems. They also find it problematic to generate their own feedback. They invariably terminate the session before the details are fleshed out and all the goals satisfied. Finally, patients do not take full advantage of the fact that constraints on real-world problems are negotiable. However, it is not necessary to postulate a 'planning' deficit. It is possible to understand the patients' difficulties in real world planning tasks in terms of the following four accepted deficits: inadequate access to 'structured event complexes', difficulty in generalizing from particulars, failure to shift between 'mental sets', and poor judgment regarding adequacy and completeness of a plan.

  11. Optimal quantum cloning based on the maximin principle by using a priori information

    NASA Astrophysics Data System (ADS)

    Kang, Peng; Dai, Hong-Yi; Wei, Jia-Hua; Zhang, Ming

    2016-10-01

    We propose an optimal 1 → 2 quantum cloning method based on the maximin principle by making full use of a priori information about the amplitude and phase of the general cloned qubit input set, which is a simply connected region enclosed by a "longitude-latitude grid" on the Bloch sphere. Theoretically, the fidelity of the optimal quantum cloning machine derived from this method is the largest in terms of the maximin principle compared with that of any other machine. Solving this problem is an optimization process that involves six unknown complex variables, six vectors in an uncertain-dimensional complex vector space, and four equality constraints. Moreover, by restricting the structure of the quantum cloning machine, the optimization problem is simplified to a three-real-parameter suboptimization problem with only one equality constraint. We obtain the explicit formula for a suboptimal quantum cloning machine. Additionally, the fidelity of our suboptimal quantum cloning machine is higher than or at least equal to that of universal quantum cloning machines and phase-covariant quantum cloning machines. It is also underlined that the suboptimal cloning machine outperforms the "belt quantum cloning machine" for some cases.

  12. Innovative Use of the Law to Address Complex Global Health Problems Comment on "The Legal Strength of International Health Instruments - What It Brings to Global Health Governance?"

    PubMed

    Walls, Helen L; Ooms, Gorik

    2017-05-20

    Addressing the increasingly globalised determinants of many important problems affecting human health is a complex task requiring collective action. We suggest that part of the solution to addressing intractable global health issues indeed lies with the role of new legal instruments in the form of globally binding treaties, as described in the recent article of Nikogosian and Kickbusch. However, in addition to the use of international law to develop new treaties, another part of the solution may lie in innovative use of existing legal instruments. A 2015 court ruling in The Hague, which ordered the Dutch government to cut greenhouse gas emissions by at least 25% within five years, complements this perspective, suggesting a way forward for addressing global health problems that critically involves civil society and innovative use of existing domestic legal instruments. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  13. The Future of Psychology: Connecting Mind to Brain

    PubMed Central

    Barrett, Lisa Feldman

    2009-01-01

    Psychological states such as thoughts and feelings are real. Brain states are real. The problem is that the two are not real in the same way, creating the mind–brain correspondence problem. In this article, I present a possible solution to this problem that involves two suggestions. First, complex psychological states such as emotion and cognition can be thought of as constructed events that can be causally reduced to a set of more basic, psychologically primitive ingredients that are more clearly respected by the brain. Second, complex psychological categories like emotion and cognition are the phenomena that require explanation in psychology, and, therefore, they cannot be abandoned by science. Describing the content and structure of these categories is a necessary and valuable scientific activity. Physical concepts are free creations of the human mind, and are not, however it may seem, uniquely determined by the external world.—Einstein & Infeld (1938, p. 33) The cardinal passions of our life, anger, love, fear, hate, hope, and the most comprehensive divisions of our intellectual activity, to remember, expect, think, know, dream (and he goes on to say, feel) are the only facts of a subjective order…—James (1890, p. 195) PMID:19844601

  14. Testing of Environmental Satellite Bus-Instrument Interfaces Using Engineering Models

    NASA Technical Reports Server (NTRS)

    Gagnier, Don; Hayner, Rick; Roza, Michael; Nosek, Thomas; Razzaghi, Andrea

    2004-01-01

    This paper discusses the formulation and execution of a laboratory test of the electrical interfaces between multiple atmospheric science instruments and the spacecraft bus that carries them. The testing, performed in 2002, used engineering models of the instruments that will be flown on the Aura spacecraft and of the Aura spacecraft bus electronics. Aura is one of NASA's Earth Observing System (EOS) Program missions managed by the Goddard Space Flight Center. The test was designed to evaluate the complex interfaces in the spacecraft and instrument command and data handling (C&DH) subsystems prior to integration of the complete flight instruments on the spacecraft. A problem discovered during (and not before) the flight hardware integration phase can cause significant cost and schedule impacts. The testing successfully surfaced problems and led to their resolution before the full-up integration phase, saving significant cost and schedule time. This approach could be used on future environmental satellite programs involving multiple, complex scientific instruments being integrated onto a bus.

  15. Design consideration in constructing high performance embedded Knowledge-Based Systems (KBS)

    NASA Technical Reports Server (NTRS)

    Dalton, Shelly D.; Daley, Philip C.

    1988-01-01

    As the hardware trends for artificial intelligence (AI) involve more and more complexity, the process of optimizing the computer system design for a particular problem will also increase in complexity. Space applications of knowledge based systems (KBS) will often require an ability to perform both numerically intensive vector computations and real time symbolic computations. Although parallel machines can theoretically achieve the speeds necessary for most of these problems, if the application itself is not highly parallel, the machine's power cannot be utilized. A scheme is presented which will provide the computer systems engineer with a tool for analyzing machines with various configurations of array, symbolic, scalar, and multiprocessors. High speed networks and interconnections make customized, distributed, intelligent systems feasible for the application of AI in space. The method presented can be used to optimize such AI system configurations and to make comparisons between existing computer systems. It is an open question whether or not, for a given mission requirement, a suitable computer system design can be constructed for any amount of money.

  16. Drug-nutrient interactions in enteral feeding: a primary care focus.

    PubMed

    Varella, L; Jones, E; Meguid, M M

    1997-06-01

    Drug and nutrient interactions are complex and can take many forms, including malabsorption of either the drug or the nutrient component. Some drugs can stimulate or suppress appetite, whereas others can cause nausea and vomiting resulting in inadequate nutritional intake. Absorption of drugs is a complex process that can be affected by the physical characteristics of the gastrointestinal tract (GIT) as well. Depending on the physical properties of a drug, it may be absorbed in a limited area of the GIT or more diffusely along much of the entire length. Many diseases and conditions are also known to affect the GIT either directly or indirectly. Dietary factors also need to be considered when the "food" is an enteral formula. The widespread use of enteral tubes requires that consideration be given to patients receiving both enteral feedings and medication concurrently. The location of a tube in the gastrointestinal tract, as well as the problems involved in crushing and administering solid dosage forms, creates a unique set of problems.

  17. Immunology for physicists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perelson, A.S.; Weisbuch, G.

    1997-10-01

    The immune system is a complex system of cells and molecules that can provide us with a basic defense against pathogenic organisms. Like the nervous system, the immune system performs pattern recognition tasks, learns, and retains a memory of the antigens that it has fought. The immune system contains more than 10^7 different clones of cells that communicate via cell-cell contact and the secretion of molecules. Performing complex tasks such as learning and memory involves cooperation among large numbers of components of the immune system and hence there is interest in using methods and concepts from statistical physics. Furthermore, the immune response develops in time and the description of its time evolution is an interesting problem in dynamical systems. In this paper, the authors provide a brief introduction to the biology of the immune system and discuss a number of immunological problems in which the use of physical concepts and mathematical methods has increased our understanding. © 1997 The American Physical Society.

  18. Applications of artificial intelligence to mission planning

    NASA Technical Reports Server (NTRS)

    Ford, Donnie R.; Rogers, John S.; Floyd, Stephen A.

    1990-01-01

    The scheduling problem facing NASA-Marshall mission planning is extremely difficult for several reasons. The most critical factor is the computational complexity involved in developing a schedule. The size of the search space is large along some dimensions and infinite along others. It is because of this and other difficulties that many of the conventional operations research techniques are infeasible or inadequate for solving the problems by themselves. Therefore, the purpose is to examine various artificial intelligence (AI) techniques to assist conventional techniques or to replace them. The specific tasks performed were as follows: (1) to identify mission planning applications for object-oriented and rule-based programming; (2) to investigate interfacing AI-dedicated hardware (Lisp machines) to VAX hardware; (3) to demonstrate how Lisp may be called from within FORTRAN programs; (4) to investigate and report on programming techniques used in some commercial AI shells, such as Knowledge Engineering Environment (KEE); and (5) to study and report on algorithmic methods to reduce complexity as related to AI techniques.

  19. Potential Flow Theory and Operation Guide for the Panel Code PMARC. Version 14

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1999-01-01

    The theoretical basis for PMARC, a low-order panel code for modeling complex three-dimensional bodies, in potential flow, is outlined. PMARC can be run on a wide variety of computer platforms, including desktop machines, workstations, and supercomputers. Execution times for PMARC vary tremendously depending on the computer resources used, but typically range from several minutes for simple or moderately complex cases to several hours for very large complex cases. Several of the advanced features currently included in the code, such as internal flow modeling, boundary layer analysis, and time-dependent flow analysis, including problems involving relative motion, are discussed in some detail. The code is written in Fortran77, using adjustable-size arrays so that it can be easily redimensioned to match problem requirements and computer hardware constraints. An overview of the program input is presented. A detailed description of the input parameters is provided in the appendices. PMARC results for several test cases are presented along with analytic or experimental data, where available. The input files for these test cases are given in the appendices. PMARC currently supports plotfile output formats for several commercially available graphics packages. The supported graphics packages are Plot3D, Tecplot, and PmarcViewer.

  20. Child prostitution in Thailand.

    PubMed

    Lau, Carmen

    2008-06-01

    Child prostitution is an old, global and complex phenomenon, which deprives children of their childhood, human rights and dignity. Child prostitution can be seen as the commercial sexual exploitation of children involving an element of forced labour, and thus can be considered as a contemporary form of slavery. Globally, child prostitution is reported to be a common problem in Central and South America and Asia. Of all the south-east Asian nations, the problem is most prolific in Thailand. In Thailand, there appears to be a long history of child prostitution, and this article explores the factors that underpin the Thai child sex industry and the lessons and implications that can be drawn for health care and nursing around the world.

  1. Challenges in building intelligent systems for space mission operations

    NASA Technical Reports Server (NTRS)

    Hartman, Wayne

    1991-01-01

    The purpose here is to provide a top-level look at the stewardship functions performed in space operations, and to identify the major issues and challenges that must be addressed to build intelligent systems that can realistically support operations functions. The focus is on decision support activities involving monitoring, state assessment, goal generation, plan generation, and plan execution. The bottom line is that problem solving in the space operations domain is a very complex process. A variety of knowledge constructs, representations, and reasoning processes are necessary to support effective human problem solving. Emulating these kinds of capabilities in intelligent systems offers major technical challenges that the artificial intelligence community is only beginning to address.

  2. Optical systems engineering - A tutorial

    NASA Technical Reports Server (NTRS)

    Wyman, C. L.

    1979-01-01

    The paper examines the use of the systems engineering approach in the design of optical systems, noting that such an approach, which involves integrated, interdisciplinary development of systems, is particularly appropriate for optics. It is shown that the high-precision character of optics leads to complex and subtle effects on optical system performance, resulting from structural, thermal dynamical, control system, and manufacturing and assembly considerations. Attention is given to communication problems that often occur among users and optical engineers due to the unique factors of optical systems. It is concluded that it is essential that the optics community provide leadership to resolve communication problems and fully formalize the field of optical systems engineering.

  3. The Role of Sleep in Childhood Psychiatric Disorders

    PubMed Central

    Alfano, Candice A.; Gamble, Amanda L.

    2009-01-01

    Although sleep problems often comprise core features of psychiatric disorders, inadequate attention has been paid to the complex, reciprocal relationships involved in the early regulation of sleep, emotion, and behavior. In this paper, we review the pediatric literature examining sleep in children with primary psychiatric disorders as well as evidence for the role of early sleep problems as a risk factor for the development of psychopathology. Based on these cumulative data, possible mechanisms and implications of early sleep disruption are considered. Finally, assessment recommendations for mental health clinicians working with children and adolescents are provided toward reducing the risk of and improving treatments for sleep disorders and psychopathology in children and adolescents. PMID:19960111

  4. A Benchmark Problem for Development of Autonomous Structural Modal Identification

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Woodard, Stanley E.; Juang, Jer-Nan

    1996-01-01

    This paper summarizes modal identification results obtained using an autonomous version of the Eigensystem Realization Algorithm on a dynamically complex, laboratory structure. The benchmark problem uses 48 of 768 free-decay responses measured in a complete modal survey test. The true modal parameters of the structure are well known from two previous, independent investigations. Without user involvement, the autonomous data analysis identified 24 to 33 structural modes with good to excellent accuracy in 62 seconds of CPU time (on a DEC Alpha 4000 computer). The modal identification technique described in the paper is the baseline algorithm for NASA's Autonomous Dynamics Determination (ADD) experiment scheduled to fly on International Space Station assembly flights in 1997-1999.

  5. Social and health care professionals' views on responsible agency in the process of ending intimate partner violence.

    PubMed

    Virkki, Tuija

    2015-06-01

    This article examines social and health care professionals' views, based on their encounters with both victims and perpetrators, on the division of responsibility in the process of ending intimate partner violence. Applying discourse analysis to focus group discussions with a total of 45 professionals on solutions to the problem, several positions of responsible agency in which professionals place themselves and their clients are identified. The results suggest that one key to understanding the complexities involved in violence intervention lies in a more adequate theorization of the temporal and intersubjective dimensions of the process of assigning responsibility for the problem. © The Author(s) 2015.

  6. An experimental paradigm for team decision processes

    NASA Technical Reports Server (NTRS)

    Serfaty, D.; Kleinman, D. L.

    1986-01-01

    The study of distributed information processing and decision making is presently hampered by two factors: (1) The inherent complexity of the mathematical formulation of decentralized problems has prevented the development of models that could be used to predict performance in a distributed environment; and (2) The lack of comprehensive scientific empirical data on human team decision making has hindered the development of significant descriptive models. As a part of a comprehensive effort to find a new framework for multihuman decision making problems, a novel experimental research paradigm was developed involving human teams in decision making tasks. Attempts to construct parts of an integrated model with ideas from queueing networks, team theory, distributed estimation and decentralized resource management are described.

  7. Experiences with general practitioners described by families of children with intellectual disabilities and challenging behaviour: a qualitative study

    PubMed Central

    Lien, Lars; Danbolt, Lars J; Kjønsberg, Kari; Haavet, Ole R

    2011-01-01

    Objective To investigate parents' experiences of follow-up by general practitioners (GPs) of children with intellectual disabilities (ID) and comorbid behavioural and/or psychological problems. Design Qualitative study based on in-depth interviews with parents of children with ID and a broad range of accompanying health problems. Setting County centred study in Norway involving primary and specialist care. Participants Nine parents of seven children with ID, all received services from an assigned GP and a specialist hospital department. Potential participants were identified by the specialist hospital department and purposefully selected by the authors to represent both genders and a range of diagnoses, locations and assigned GPs. Results Three clusters of experiences emerged from the analysis: expectations, relationships and actual use. The participants had low expectations of the GPs' competence and involvement with their child, and primarily used the GP for the treatment of simple somatic problems. Only one child regularly visited their GP for general and mental health check-ups. The participants' experience of their GPs was that they did not have time and were not interested in the behavioural and mental problems of these children. Conclusions Families with children with ID experience a complex healthcare system in situations where they are vulnerable to lack of information, involvement and competence. GPs are part of a stable service system and are in a position to provide security, help and support to these families. Parents' experiences could be improved by regular health checks for their children and GPs being patient, taking time and showing interest in challenging behaviour. PMID:22123921

  8. SEU System Analysis: Not Just the Sum of All Parts

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth

    2014-01-01

    Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component level partitioning and then either: the most dominant SEU cross-sections (SEUs) are used in system error rate calculations; or the partition SEUs are summed to eventually obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect the following: cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved with our current scheme of SEU analysis for complex systems; and to provide alternative methods for improvement.
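
    A purely illustrative arithmetic sketch of the overestimation point: summing raw partition upset rates ignores system-level derating (the fraction of a partition's upsets that actually propagate to a system-visible error), so the naive sum exceeds the derated estimate. The rates and derating fractions below are invented for illustration, not measured values.

        # Hypothetical partition upset rates (upsets/device-day) and derating factors.
        partition_rates = [1.2e-6, 4.0e-7, 9.0e-7]
        derating = [0.10, 0.50, 0.05]   # fraction that matters at system level

        summed = sum(partition_rates)
        derated = sum(r * d for r, d in zip(partition_rates, derating))
        print(f"naive sum: {summed:.2e}  derated estimate: {derated:.2e}")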

  9. Influence of the boundary conditions on heat and mass transfer in spacer-filled channels

    NASA Astrophysics Data System (ADS)

    Ciofalo, M.; La Cerva, M. F.; Di Liberto, M.; Tamburini, A.

    2017-11-01

    The purpose of this study is to discuss some problems which arise in heat or mass transfer in complex channels, with special reference to the spacer-filled channels adopted in membrane processes. Among the issues addressed are the consistent definition of local and mean heat or mass transfer coefficients; the influence of the wall boundary conditions; the influence of one-side versus two-side heat/mass transfer. Most of the results discussed were obtained by finite volume CFD simulations concerning heat transfer in Membrane Distillation or mass transfer in Electrodialysis and Reverse Electrodialysis, but many of the conclusions apply also to different processes involving geometrically complex channels.

  10. Structure parameters in rotating Couette-Poiseuille channel flow

    NASA Technical Reports Server (NTRS)

    Knightly, George H.; Sather, D.

    1986-01-01

    It is well-known that a number of steady state problems in fluid mechanics involving systems of nonlinear partial differential equations can be reduced to the problem of solving a single operator equation of the form v + lambda Av + lambda B(v) = 0, with v ∈ H and lambda ∈ R, where H is an appropriate (real or complex) Hilbert space. Here lambda is a typical load parameter, e.g., the Reynolds number, A is a linear operator, and B is a quadratic operator generated by a bilinear form. In this setting many bifurcation and stability results for such problems were obtained. A rotating Couette-Poiseuille channel flow was studied, and it showed that, in general, the superposition of a Poiseuille flow on a rotating Couette channel flow is destabilizing.
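
    A minimal finite-dimensional sketch of the structure of such an operator equation, with an illustrative 2 × 2 linear operator A and the quadratic operator B(v) taken componentwise as v*v; a Newton iteration drives the residual to zero (from this starting point it converges to the trivial branch v = 0). This is not the paper's analysis, only an illustration of the equation's form.

        import numpy as np

        def solve_branch(A, lam, v0, tol=1e-12, max_iter=50):
            # Newton's method on F(v) = v + lam*A v + lam*B(v), with B(v) = v*v.
            v = v0.copy()
            for _ in range(max_iter):
                F = v + lam * (A @ v) + lam * v * v
                J = np.eye(len(v)) + lam * A + lam * np.diag(2.0 * v)
                dv = np.linalg.solve(J, -F)
                v += dv
                if np.linalg.norm(dv) < tol:
                    break
            return v

        A = np.array([[0.0, 1.0], [-1.0, 0.0]])       # illustrative linear operator
        v = solve_branch(A, lam=0.3, v0=np.array([0.5, -0.5]))
        print(v, np.linalg.norm(v + 0.3 * (A @ v) + 0.3 * v * v))   # residual ~ 0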

  11. A framework for discrete stochastic simulation on 3D moving boundary domains

    DOE PAGES

    Drawert, Brian; Hellander, Stefan; Trogdon, Michael; ...

    2016-11-14

    We have developed a method for modeling spatial stochastic biochemical reactions in complex, three-dimensional, and time-dependent domains using the reaction-diffusion master equation formalism. In particular, we look to address the fully coupled problems that arise in systems biology where the shape and mechanical properties of a cell are determined by the state of the biochemistry and vice versa. To validate our method and characterize the error involved, we compare our results for a carefully constructed test problem to those of a microscale implementation. Finally, we demonstrate the effectiveness of our method by simulating a model of polarization and shmoo formation during the mating of yeast. The method is generally applicable to problems in systems biology where biochemistry and mechanics are coupled, and spatial stochastic effects are critical.

  12. Radar image interpretation techniques applied to sea ice geophysical problems

    NASA Technical Reports Server (NTRS)

    Carsey, F. D.

    1983-01-01

    The geophysical science problems in the sea ice area which at present concern understanding the ice budget, where ice is formed, how thick it grows and where it melts, and the processes which control the interaction of air, sea, and ice at the ice margins are discussed. The science problems relate to basic questions of sea ice: how much is there, thickness, drift rate, production rate, determination of the morphology of the ice margin, storms feeling for the ice, storms and influence at the margin to alter the pack, and ocean response to a storm at the margin. Some of these questions are descriptive and some require complex modeling of interactions between the ice, the ocean, the atmosphere and the radiation fields. All involve measurements of the character of the ice pack, and SAR plays a significant role in the measurements.

  13. Distance learning for University Physics in South Africa

    NASA Astrophysics Data System (ADS)

    Cilliers, J. A.; Basson, I.

    1997-03-01

    The University of South Africa (Unisa) is one of the largest distance education universities in the world. Teaching physics at a distance is a complex and multifaceted problem which is compounded in the South African context by the diversity of educational backgrounds of the learners involved. The fact that students are distributed over a vast geographical area presents unique problems for the incorporation of the practical component into the curriculum. Current research involves a fundamental evaluation of the aims and objectives of the introductory laboratory. The project is based on the notion that practicals, as they have been used in most physics curricula, are not particularly effective or efficient, although they are costly both financially and logistically. Design, development and delivery of efficient study material imply that there should be agreement between what the student knows and can do, and what the material offers. An in-depth profile that takes into account biographic as well as cognitive characteristics of the target group is therefore being compiled. This paper gives an overview of the specific problems and circumstances that were identified for distance education in physics in a multi-cultural society, and proposes a new model for the incorporation of the introductory laboratory into the curriculum.

  14. Building co-management as a process: problem solving through partnerships in Aboriginal country, Australia.

    PubMed

    Zurba, Melanie; Ross, Helen; Izurieta, Arturo; Rist, Philip; Bock, Ellie; Berkes, Fikret

    2012-06-01

    Collaborative problem solving has increasingly become important in the face of the complexities in the management of resources, including protected areas. The strategy undertaken by Girringun Aboriginal Corporation in north tropical Queensland, Australia, for developing co-management demonstrates the potential for a problem solving approach involving sequential initiatives, as an alternative to the more familiar negotiated agreements for co-management. Our longitudinal case study focuses on the development of indigenous ranger units as a strategic mechanism for the involvement of traditional owners in managing their country in collaboration with government and other interested parties. This was followed by Australia's first traditional use of marine resources agreement, and development of a multi-jurisdictional, land to sea, indigenous protected area. In using a relationship building approach to develop regional scale co-management, Girringun has been strengthening its capabilities as collaborator and regional service provider, thus, bringing customary decision-making structures into play to 'care for country'. From this evolving process we have identified the key components of a relationship building strategy, 'the pillars of co-management'. This approach includes learning-by-doing, the building of respect and rapport, sorting out responsibilities, practical engagement, and capacity-building.

  15. Studying marine stratus with large eddy simulation

    NASA Technical Reports Server (NTRS)

    Moeng, Chin-Hoh

    1990-01-01

    Data sets from field experiments over the stratocumulus regime may include complications from larger-scale variations, decoupled cloud layers, the diurnal cycle, entrainment instability, and so on. On top of the already complicated turbulence-radiation-condensation processes within the cloud-topped boundary layer (CTBL), these complexities may sometimes make interpretation of the data sets difficult. To study these processes, a better understanding is needed of the basic processes involved in the prototype CTBL. For example, is cloud top radiative cooling the primary source of the turbulent kinetic energy (TKE) within the CTBL? Historically, laboratory measurements have played an important role in addressing turbulence problems. The CTBL is a turbulent field which is probably impossible to generate in laboratories. Large eddy simulation (LES) is an alternative way of 'measuring' the turbulent structure under controlled environments, which allows the systematic examination of the basic physical processes involved. However, there are problems with the LES approach for the CTBL. The LES data need to be consistent with the observed data. The LES approach is discussed, and results are given which provide some insights into the simulated turbulent flow field. Problems with this approach for the CTBL and information from the FIRE experiment needed to justify the LES results are discussed.

  16. Building Co-Management as a Process: Problem Solving Through Partnerships in Aboriginal Country, Australia

    NASA Astrophysics Data System (ADS)

    Zurba, Melanie; Ross, Helen; Izurieta, Arturo; Rist, Philip; Bock, Ellie; Berkes, Fikret

    2012-06-01

    Collaborative problem solving has increasingly become important in the face of the complexities in the management of resources, including protected areas. The strategy undertaken by Girringun Aboriginal Corporation in north tropical Queensland, Australia, for developing co-management demonstrates the potential for a problem solving approach involving sequential initiatives, as an alternative to the more familiar negotiated agreements for co-management. Our longitudinal case study focuses on the development of indigenous ranger units as a strategic mechanism for the involvement of traditional owners in managing their country in collaboration with government and other interested parties. This was followed by Australia's first traditional use of marine resources agreement, and development of a multi-jurisdictional, land to sea, indigenous protected area. In using a relationship building approach to develop regional scale co-management, Girringun has been strengthening its capabilities as collaborator and regional service provider, thus, bringing customary decision-making structures into play to `care for country'. From this evolving process we have identified the key components of a relationship building strategy, `the pillars of co-management'. This approach includes learning-by-doing, the building of respect and rapport, sorting out responsibilities, practical engagement, and capacity-building.

  17. The determination factors of left-right asymmetry disorders- a short review.

    PubMed

    Catana, Andreea; Apostu, Adina Patricia

    2017-01-01

    Laterality defects in humans, situs inversus and heterotaxy, are rare disorders, with an incidence of 1:8000 to 1:10 000 in the general population, and a multifactorial etiology. It has been proved that 1.44 per 10 000 of all cardiac problems are associated with malformations of left-right asymmetry, and heterotaxy accounts for 3% of all congenital heart defects. It is considered that defects of situs appear due to genetic and environmental factors. Also, there is evidence that the ciliopathies (defects of structure or function) are involved in developmental abnormalities. Over 100 genes have been reported to be involved in left-right patterning in model organisms, but only a few are likely candidates for left-right asymmetry defects in humans. Left-right asymmetry disorders are genetically heterogeneous and have variable manifestations (from asymptomatic to serious clinical problems). The discovery of the precise mechanism of left-right development will help explain the clinical complexity and may contribute to a therapy for these disorders.

  18. Policy implications of private sector involvement in correctional services and programs.

    PubMed

    Johnson, T A

    1987-01-01

    The movement toward private sector involvement in our correctional services and programs is growing. Before our focus is turned completely to privatization of these services, it would be prudent to analyze the policy impact of such change. It is evident that the diverse and incompatible policies guiding the government approach to corrections and the absence of any rational planning to answer public interest goals are costly. Moreover, despite the increasing complexity of problems now confronting public authorities, little change has been made in their approach to resolving them. However, is it realistic to assume that the profit/loss barometer of the private sector can be applied in an area of social problems that are so pluralistic and ill-defined? What of the many areas of potential legal concern, that is, vicarious litigation, First Amendment rights of prisoners, and so forth? These are all areas that need to be researched so that any judgements or decisions made will be sound.

  19. Synthesis of E- and Z-trisubstituted alkenes by catalytic cross-metathesis

    NASA Astrophysics Data System (ADS)

    Nguyen, Thach T.; Koh, Ming Joo; Mann, Tyler J.; Schrock, Richard R.; Hoveyda, Amir H.

    2017-12-01

    Catalytic cross-metathesis is a central transformation in chemistry, yet corresponding methods for the stereoselective generation of acyclic trisubstituted alkenes in either the E or the Z isomeric forms are not known. The key problems are a lack of chemoselectivity—namely, the preponderance of side reactions involving only the less hindered starting alkene, resulting in homo-metathesis by-products—and the formation of short-lived methylidene complexes. By contrast, in catalytic cross-coupling, substrates are more distinct and homocoupling is less of a problem. Here we show that through cross-metathesis reactions involving E- or Z-trisubstituted alkenes, which are easily prepared from commercially available starting materials by cross-coupling reactions, many desirable and otherwise difficult-to-access linear E- or Z-trisubstituted alkenes can be synthesized efficiently and in exceptional stereoisomeric purity (up to 98 per cent E or 95 per cent Z). The utility of the strategy is demonstrated by the concise stereoselective syntheses of biologically active compounds, such as the antifungal indiacen B and the anti-inflammatory coibacin D.

  20. Synthesis of E- and Z-trisubstituted alkenes by catalytic cross-metathesis.

    PubMed

    Nguyen, Thach T; Koh, Ming Joo; Mann, Tyler J; Schrock, Richard R; Hoveyda, Amir H

    2017-12-20

    Catalytic cross-metathesis is a central transformation in chemistry, yet corresponding methods for the stereoselective generation of acyclic trisubstituted alkenes in either the E or the Z isomeric forms are not known. The key problems are a lack of chemoselectivity (namely, the preponderance of side reactions involving only the less hindered starting alkene, resulting in homo-metathesis by-products) and the formation of short-lived methylidene complexes. By contrast, in catalytic cross-coupling, substrates are more distinct and homocoupling is less of a problem. Here we show that through cross-metathesis reactions involving E- or Z-trisubstituted alkenes, which are easily prepared from commercially available starting materials by cross-coupling reactions, many desirable and otherwise difficult-to-access linear E- or Z-trisubstituted alkenes can be synthesized efficiently and in exceptional stereoisomeric purity (up to 98 per cent E or 95 per cent Z). The utility of the strategy is demonstrated by the concise stereoselective syntheses of biologically active compounds, such as the antifungal indiacen B and the anti-inflammatory coibacin D.

  1. Action research methodology in clinical pharmacy: how to involve and change.

    PubMed

    Nørgaard, Lotte Stig; Sørensen, Ellen Westh

    2016-06-01

    Introduction The focus in clinical pharmacy practice has for the last 30-35 years been on changing the role of pharmacy staff towards service orientation and patient counselling. One way of doing this is to involve staff in the change process and, as a researcher, to take part in that process by establishing partnerships with staff. Drawing on the authors' extensive action research (AR) experience, recommendations and comments on how to conduct an AR study are described, and one of their AR-based studies illustrates the methodology and the research methods used. Methodology AR is defined as an approach to research which is based on a problem-solving relationship between researchers and clients, and which aims both at solving a problem and at collaboratively generating new knowledge. Research questions relevant in AR studies are: what was the working process in this change-oriented study? What learning and/or changes took place? What challenges/pitfalls had to be overcome? What were the influences/consequences for the parties involved? When to use If you want to implement new services and want to involve staff and others in the process, an AR methodology is very suitable. The basic advantages of doing AR-based studies are grounded in their participatory and democratic basis and their starting point in problems experienced in practice. Limitations One limitation of AR studies is that no single participant in the project steering group is the sole decision-maker. Furthermore, the collective process makes the decision-making procedures relatively complex.

  2. The structural bioinformatics library: modeling in biomolecular science and beyond.

    PubMed

    Cazals, Frédéric; Dreyfus, Tom

    2017-04-01

    Software in structural bioinformatics has mainly been application driven. To favor practitioners seeking off-the-shelf applications, but also developers seeking advanced building blocks to develop novel applications, we undertook the design of the Structural Bioinformatics Library (SBL, http://sbl.inria.fr ), a generic C++/Python cross-platform software library targeting complex problems in structural bioinformatics. Its tenet is a modular design offering a rich and versatile framework allowing the development of novel applications requiring well specified complex operations, without compromising robustness and performance. The SBL involves four software components (1-4 hereafter). For end-users, the SBL provides ready-to-use, state-of-the-art (1) applications to handle molecular models defined by unions of balls, to deal with molecular flexibility, and to model macro-molecular assemblies. These applications can also be combined to tackle integrated analysis problems. For developers, the SBL provides a broad C++ toolbox with modular design, involving core (2) algorithms, (3) biophysical models and (4) modules, the latter being especially suited to develop novel applications. The SBL comes with a thorough documentation consisting of user and reference manuals, and a bugzilla platform to handle community feedback. The SBL is available from http://sbl.inria.fr. Contact: Frederic.Cazals@inria.fr. Supplementary data are available at Bioinformatics online.

  3. Portrait of an Enzyme, a Complete Structural Analysis of a Multimodular beta-N-Acetylglucosaminidase from Clostridium perfringens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ficko-Blean, E.; Gregg, K; Adams, J

    2009-01-01

    Common features of the extracellular carbohydrate-active virulence factors involved in host-pathogen interactions are their large sizes and modular complexities. This has made them recalcitrant to structural analysis, and therefore our understanding of the significance of modularity in these important proteins is lagging. Clostridium perfringens is a prevalent human pathogen that harbors a wide array of large, extracellular carbohydrate-active enzymes and is an excellent and relevant model system to approach this problem. Here we describe the complete structure of C. perfringens GH84C (NagJ), a 1001-amino acid multimodular homolog of the C. perfringens mu-toxin, which was determined using a combination of small-angle x-ray scattering and x-ray crystallography. The resulting structure reveals unprecedented insight into how catalysis, carbohydrate-specific adherence, and the formation of molecular complexes with other enzymes via an ultra-tight protein-protein interaction are spatially coordinated in an enzyme involved in a host-pathogen interaction.

  4. Fine-scale traverses in cumulate rocks, Stillwater Complex: A lunar analogue study

    NASA Technical Reports Server (NTRS)

    Elthon, Donald

    1988-01-01

    The objective was to document fine-scale compositional variations in cumulate rocks from the Stillwater Complex in Montana and to interpret these data in the context of planetary magma fractionation processes such as those operative during the formation of the Earth's Moon. This research problem involved collecting samples in the Stillwater Complex and analyzing them by electron microprobe, X-ray fluorescence (XRF), and instrumental neutron activation analysis (INAA). The electron microprobe is used to determine the compositions of cumulus and intercumulus phases in the rocks, the XRF is used to determine the bulk-rock major element and trace element (Y, Sr, Rb, Zr, Ni, and Cr) abundances, and the INAA laboratory is used to determine the trace element (Sc, Co, Cr, Ni, Ta, Hf, U, Th, and the REE) abundances of mineral separates and bulk rocks.

  5. Study of Nanocomposites of Amino Acids and Organic Polyethers by Means of Mass Spectrometry and Molecular Dynamics Simulation

    NASA Astrophysics Data System (ADS)

    Zobnina, V. G.; Kosevich, M. V.; Chagovets, V. V.; Boryak, O. A.

    The problem of elucidating the structure of nanomaterials based on combinations of proteins and polyether polymers is addressed at the monomeric level of single amino acids and oligomers of the polyethers PEG-400 and OEG-5. The efficiency of a combined approach involving experimental electrospray mass spectrometry and computer modeling by molecular dynamics simulation is demonstrated. It is shown that polyether oligomers form stable complexes with the amino acids valine, proline, histidine, glutamic acid, and aspartic acid. Molecular dynamics simulation has shown that stabilization of the amino acid-polyether complexes is achieved by winding of the polymeric chain around the charged groups of the amino acids. The structural motifs revealed for complexes of single amino acids with polyethers can be realized in the structures of protein-polyether nanoparticles currently being designed for drug delivery.

  6. Grid Convergence of High Order Methods for Multiscale Complex Unsteady Viscous Compressible Flows

    NASA Technical Reports Server (NTRS)

    Sjoegreen, B.; Yee, H. C.

    2001-01-01

    Grid convergence of several high order methods for the computation of rapidly developing complex unsteady viscous compressible flows with a wide range of physical scales is studied. The recently developed adaptive numerical dissipation control high order methods referred to as the ACM and wavelet filter schemes are compared with a fifth-order weighted ENO (WENO) scheme. The two 2-D compressible full Navier-Stokes models considered do not possess known analytical solutions or experimental data. Fine grid solutions from a standard second-order TVD scheme and a MUSCL scheme with limiters are used as reference solutions. The first model is a 2-D viscous analogue of a shock tube problem which involves complex shock/shear/boundary-layer interactions. The second model is a supersonic reactive flow concerning fuel breakup. The fuel mixing involves circular hydrogen bubbles in air interacting with a planar moving shock wave. Both models contain fine scale structures and are stiff in the sense that even though the flows are rapidly developing and unsteady, extreme grid refinement and time step restrictions are needed to resolve all the flow scales as well as the chemical reaction scales.
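    As background for the WENO scheme mentioned above, the sketch below implements the classical fifth-order Jiang-Shu WENO reconstruction of a left-biased interface value from five neighbouring cell values, the basic building block of such schemes. It is a generic textbook illustration, not the ACM or wavelet filter schemes developed in the paper.

    ```python
    import numpy as np

    # Fifth-order WENO (Jiang-Shu) reconstruction of the left-biased value at i+1/2.
    def weno5_interface(um2, um1, u0, up1, up2, eps=1e-6):
        # candidate third-order reconstructions on the three sub-stencils
        p0 = (2*um2 - 7*um1 + 11*u0) / 6.0
        p1 = (-um1 + 5*u0 + 2*up1) / 6.0
        p2 = (2*u0 + 5*up1 - up2) / 6.0
        # smoothness indicators
        b0 = 13/12*(um2 - 2*um1 + u0)**2 + 0.25*(um2 - 4*um1 + 3*u0)**2
        b1 = 13/12*(um1 - 2*u0 + up1)**2 + 0.25*(um1 - up1)**2
        b2 = 13/12*(u0 - 2*up1 + up2)**2 + 0.25*(3*u0 - 4*up1 + up2)**2
        # nonlinear weights built from the linear weights (1/10, 6/10, 3/10)
        a0, a1, a2 = 0.1/(eps + b0)**2, 0.6/(eps + b1)**2, 0.3/(eps + b2)**2
        return (a0*p0 + a1*p1 + a2*p2) / (a0 + a1 + a2)

    # smooth data: the reconstruction should be close to the exact interface value
    x = np.linspace(0, 2*np.pi, 81)
    u = np.sin(x)
    i = 40
    approx = weno5_interface(u[i-2], u[i-1], u[i], u[i+1], u[i+2])
    print(approx, np.sin(0.5*(x[i] + x[i+1])))
    ```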

  7. The Interaction of Conduct Problems and Depressed Mood in Relation to Adolescent Substance Involvement and Peer Substance Use

    PubMed Central

    Hitchings, Julia E.; Spoth, Richard L.

    2010-01-01

    Conduct problems are strong positive predictors of substance use and problem substance use among teens, whereas predictive associations of depressed mood with these outcomes are mixed. Conduct problems and depressed mood often co-occur, and such co-occurrence may heighten risk for negative outcomes. Thus, this study examined the interaction of conduct problems and depressed mood at age 11 in relation to substance use and problem use at age 18, and possible mediation through peer substance use at age 16. Analyses of multirater longitudinal data collected from 429 rural youths (222 girls) and their families were conducted using a methodology for testing latent variable interactions. The link between the conduct problems X depressed mood interaction and adolescent substance use was negative and statistically significant. Unexpectedly, positive associations of conduct problems with substance use were stronger at lower levels of depressed mood. A significant negative interaction in relation to peer substance use also was observed, and the estimated indirect effect of the interaction on adolescent use through peer use as a mediator was statistically significant. Findings illustrate the complexity of multiproblem youth. PMID:18455886

  8. Scope of Gradient and Genetic Algorithms in Multivariable Function Optimization

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Sen, S. K.

    2007-01-01

    Global optimization of a multivariable function - constrained by bounds specified on each variable and also unconstrained - is an important problem with several real world applications. Deterministic methods such as the gradient algorithms as well as the randomized methods such as the genetic algorithms may be employed to solve these problems. In fact, there are optimization problems where a genetic algorithm/an evolutionary approach is preferable, at least from the quality (accuracy) of the results point of view. From the cost (complexity) point of view, both gradient and genetic approaches are usually polynomial-time; there are no serious differences in this regard. However, for certain types of problems, such as those with unacceptably erroneous numerical partial derivatives and those with physically amplified analytical partial derivatives whose numerical evaluation involves undesirable errors and/or is messy, a genetic (stochastic) approach should be a better choice. We have presented here the pros and cons of both approaches so that the concerned reader/user can decide which approach is most suited for the problem at hand. Also, for a function which is known in tabular form instead of analytical form, as is often the case in an experimental environment, we attempt to provide an insight into the approaches focusing our attention toward accuracy. Such an insight will help one to decide which method, out of several available methods, should be employed to obtain the best (least error) output.
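    The sketch below contrasts the two families of methods discussed, a gradient method using numerical partial derivatives and a small real-coded genetic algorithm, on a bound-constrained Rosenbrock function chosen purely as an illustration; the test function, bounds, step sizes and population settings are arbitrary and are not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    lo, hi = -2.0, 2.0                          # bounds on each variable

    def f(x):                                   # illustrative test function (Rosenbrock)
        return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2

    def num_grad(x, h=1e-6):                    # forward-difference partial derivatives
        g = np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x); e[i] = h
            g[i] = (f(x + e) - f(x)) / h
        return g

    # --- gradient descent with projection onto the bounds ---
    x = np.array([-1.5, 1.5])
    for _ in range(20000):
        x = np.clip(x - 1e-3 * num_grad(x), lo, hi)

    # --- tiny real-coded genetic algorithm ---
    pop = rng.uniform(lo, hi, size=(40, 2))
    for _ in range(200):
        fit = np.array([f(p) for p in pop])
        parents = pop[np.argsort(fit)][:20]               # truncation selection
        kids = 0.5*(parents[rng.integers(0, 20, 40)] +    # blend crossover
                    parents[rng.integers(0, 20, 40)])
        kids += rng.normal(0.0, 0.05, kids.shape)         # mutation
        pop = np.clip(kids, lo, hi)

    best = pop[np.argmin([f(p) for p in pop])]
    print("gradient solution:", x, f(x))
    print("GA solution:      ", best, f(best))
    ```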

  9. Blue rubber bleb nevus syndrome with simultaneous neurological and skeletal involvement.

    PubMed

    Tzoufi, Meropi S; Sixlimiri, Polyxeni; Nakou, Iliada; Argyropoulou, Maria I; Stefanidis, Constantinos J; Siamopoulou-Mavridou, Antigone

    2008-08-01

    Blue rubber bleb nevus syndrome (BRBNS) is a rare disorder characterized by venous malformations usually affecting the skin and the gastrointestinal tract. These skin haemangiomas are present at birth and deteriorate as the body grows, causing primarily cosmetic problems. The haemangiomas of the gastrointestinal tract may appear later in life and may bleed, causing chronic anaemia, or may present with severe complications such as rupture, intestinal torsion, and intussusception. Other organs may also be involved. This article describes a 13-year-old boy with multiple haemangiomas of the skin, the mucous membranes, and the gastrointestinal tract, which caused anaemia and ileoileic intussusception. In this patient, the nervous system was significantly affected with a haemangioma of the left occipital lobe, with complications of stroke. He also had multiple paravertebral haemangiomas, which caused pressure signs and symptoms. This boy suffered from complex partial and generalized seizures and cerebral palsy. Multiple skeletal anomalies were also present from birth. In the relevant literature, this is the first case of BRBNS with simultaneous neurological and skeletal involvement. Such cases should be recognized early, as they can lead to serious multiple health problems and handicaps.

  10. Intentionality, degree of damage, and moral judgments.

    PubMed

    Berg-Cross, L G

    1975-12-01

    153 first graders were given Piagetian moral judgment problems with a new simplified methodology as well as the usual story-pair paradigm. The new methodology involved making quantitative judgments about single stories and examined the influence of level of intentionality and degree of damage upon absolute punishment ratings. Contrary to results obtained with a story-pair methodology, it was found that with single stories even 6-year-old children responded to the level of intention in the stories as well as the quantity and quality of damage involved. This suggested that Piaget's methodology may be forcing children to employ a simplifying strategy while under other conditions they are able to perform the mental operations necessary to make complex moral judgments.

  11. Pattern-Based Inverse Modeling for Characterization of Subsurface Flow Models with Complex Geologic Heterogeneity

    NASA Astrophysics Data System (ADS)

    Golmohammadi, A.; Jafarpour, B.; M Khaninezhad, M. R.

    2017-12-01

    Calibration of heterogeneous subsurface flow models leads to ill-posed nonlinear inverse problems, where too many unknown parameters are estimated from limited response measurements. When the underlying parameters form complex (non-Gaussian) structured spatial connectivity patterns, classical variogram-based geostatistical techniques cannot describe the underlying connectivity patterns. Modern pattern-based geostatistical methods that incorporate higher-order spatial statistics are more suitable for describing such complex spatial patterns. Moreover, when the underlying unknown parameters are discrete (geologic facies distribution), conventional model calibration techniques that are designed for continuous parameters cannot be applied directly. In this paper, we introduce a novel pattern-based model calibration method to reconstruct discrete and spatially complex facies distributions from dynamic flow response data. To reproduce complex connectivity patterns during model calibration, we impose a feasibility constraint to ensure that the solution follows the expected higher-order spatial statistics. For model calibration, we adopt a regularized least-squares formulation, involving data mismatch, pattern connectivity, and feasibility constraint terms. Using an alternating directions optimization algorithm, the regularized objective function is divided into a continuous model calibration problem, followed by mapping the solution onto the feasible set. The feasibility constraint to honor the expected spatial statistics is implemented using a supervised machine learning algorithm. The two steps of the model calibration formulation are repeated until the convergence criterion is met. Several numerical examples are used to evaluate the performance of the developed method.
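    To illustrate the alternating-directions structure described above, the toy sketch below alternates a regularized continuous least-squares update with a mapping of the parameters onto a discrete feasible set; a simple threshold stands in for the supervised-learning projection used in the paper, and the linear forward model, facies field and noise level are all synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, n_obs = 60, 15
    m_true = (np.arange(n) % 20 < 10).astype(float)       # binary "facies" pattern
    G = rng.normal(size=(n_obs, n))                        # toy linear forward model
    d_obs = G @ m_true + 0.01 * rng.normal(size=n_obs)     # noisy observations

    m = np.full(n, 0.5)                                    # initial continuous model
    beta = 1.0                                             # feasibility penalty weight
    for it in range(50):
        m_feas = (m > 0.5).astype(float)                   # map onto the feasible (discrete) set
        # continuous step: minimize ||G m - d||^2 + beta ||m - m_feas||^2
        A = G.T @ G + beta * np.eye(n)
        b = G.T @ d_obs + beta * m_feas
        m = np.linalg.solve(A, b)

    m_final = (m > 0.5).astype(float)
    print("facies mismatch:", int(np.sum(m_final != m_true)), "of", n)
    ```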

  12. DockTrina: docking triangular protein trimers.

    PubMed

    Popov, Petr; Ritchie, David W; Grudinin, Sergei

    2014-01-01

    In spite of the abundance of oligomeric proteins within a cell, the structural characterization of protein-protein interactions is still a challenging task. In particular, many of these interactions involve heteromeric complexes, which are relatively difficult to determine experimentally. Hence there is growing interest in using computational techniques to model such complexes. However, assembling large heteromeric complexes computationally is a highly combinatorial problem. Nonetheless the problem can be simplified greatly by considering interactions between protein trimers. After dimers and monomers, triangular trimers (i.e. trimers with pair-wise contacts between all three pairs of proteins) are the most frequently observed quaternary structural motifs according to the three-dimensional (3D) complex database. This article presents DockTrina, a novel protein docking method for modeling the 3D structures of nonsymmetrical triangular trimers. The method takes as input pair-wise contact predictions from a rigid body docking program. It then scans and scores all possible combinations of pairs of monomers using a very fast root mean square deviation test. Finally, it ranks the predictions using a scoring function which combines triples of pair-wise contact terms and a geometric clash penalty term. The overall approach takes less than 2 min per complex on a modern desktop computer. The method is tested and validated using a benchmark set of 220 bound and seven unbound protein trimer structures. DockTrina will be made available at http://nano-d.inrialpes.fr/software/docktrina. Copyright © 2013 Wiley Periodicals, Inc.
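    The sketch below mimics, in highly simplified form, the combinatorial scan described above: candidate pairwise poses for the pairs A-B, B-C and C-A are combined, a cyclic-closure error stands in for the fast RMSD consistency test, and combinations are ranked by the sum of pairwise contact scores minus a penalty term. All poses, scores and weights are random placeholders rather than DockTrina's actual inputs or scoring function.

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(7)

    def random_pose():
        # random rotation (via QR), random translation, random contact score
        q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
        return {"R": q, "t": rng.normal(size=3), "score": rng.uniform(0, 1)}

    poses_ab = [random_pose() for _ in range(10)]
    poses_bc = [random_pose() for _ in range(10)]
    poses_ca = [random_pose() for _ in range(10)]

    def compose(p, q):      # apply p then q: x -> q.R @ (p.R @ x + p.t) + q.t
        return q["R"] @ p["R"], q["R"] @ p["t"] + q["t"]

    best = None
    for ab, bc, ca in itertools.product(poses_ab, poses_bc, poses_ca):
        R, t = compose(ab, bc)
        R, t = compose({"R": R, "t": t}, ca)
        # how far the A->B->C->A cycle is from the identity (consistency test stand-in)
        closure = np.linalg.norm(R - np.eye(3)) + np.linalg.norm(t)
        total = ab["score"] + bc["score"] + ca["score"] - 0.5 * closure
        if best is None or total > best[0]:
            best = (total, closure)

    print("best scoring triangular combination (score, closure error):", best)
    ```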

  13. Developing a framework for qualitative engineering: Research in design and analysis of complex structural systems

    NASA Technical Reports Server (NTRS)

    Franck, Bruno M.

    1990-01-01

    The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.

  14. Prenatally diagnosed de novo apparently balanced complex chromosome rearrangements: Two new cases and review of the literature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, C.; Grubs, R.E.; Jewett, T.

    Complex chromosome rearrangements (CCR) are rare structural rearrangements. Currently six cases of prenatally diagnosed balanced de novo CCR have been described. We present two new cases of prenatally ascertained balanced de novo CCR. In the first case, an amniocentesis revealed a balanced de novo three-way CCR involving chromosomes 5, 6, and 11 with a pericentric inversion of chromosome 5 [four breaks]. In the second case a balanced de novo rearrangement was identified by amniocentesis which involved a reciprocal translocation between chromosomes 3 and 8 and a CCR involving chromosomes 6, 7, and 18 [six breaks]. The use of whole chromosome painting helped elucidate the nature of these rearrangements. A review of the postnatally ascertained cases suggests that most patients have congenital anomalies, minor anomalies, and/or developmental delay/mental retardation. In addition, there appears to be a relationship between the number of chromosome breaks and the extent of phenotypic effects. The paucity of information regarding prenatally diagnosed CCR and the bias of ascertainment of postnatal CCR cases poses a problem in counseling families. 38 refs., 3 figs., 4 tabs.

  15. Psychometric validation of the Italian Rehabilitation Complexity Scale-Extended version 13

    PubMed Central

    Agosti, Maurizio; Merlo, Andrea; Maini, Maurizio; Lombardi, Francesco; Tedeschi, Claudio; Benedetti, Maria Grazia; Basaglia, Nino; Contini, Mara; Nicolotti, Domenico; Brianti, Rodolfo

    2017-01-01

    In Italy, at present, a well-known problem is the inhomogeneous provision of rehabilitative services, as stressed by the Ministry of Health (MoH), requiring appropriate criteria and parameters to plan rehabilitation actions. According to the Italian National Rehabilitation Plan, Comorbidity, Disability and Clinical Complexity should be assessed to define the patient’s real needs. However, to date, clinical complexity is still difficult to measure with shared and validated tools. The study aims to psychometrically validate the Italian Rehabilitation Complexity Scale-Extended v13 (RCS-E v13), in order to meet the guidelines' requirements. An observational multicentre prospective cohort study was carried out, involving 8 intensive rehabilitation facilities of the Emilia-Romagna Region and 1712 in-patients [823 male (48%) and 889 female (52%), mean age 68.34 years (95% CI 67.69–69.00 years)] with neurological, orthopaedic and cardiological problems. The construct and concurrent validity of the RCS-E v13 was confirmed through its correlation with the Barthel Index (disability) and the Cumulative Illness Rating Scale (comorbidity) and with appropriate admission criteria (not yet published), respectively. Furthermore, the factor analysis indicated two different components (“Basic Care or Risk—Equipment” and “Medical—Nursing Needs and Therapy Disciplines”) of the RCS-E v13. In conclusion, the Italian RCS-E v13 appears to be a useful tool to assess clinical complexity in the Italian rehabilitation case-mix, and its psychometric validation may have an important clinical impact by allowing rehabilitation needs to be assessed along all three dimensions (disability, comorbidity and clinical complexity) required by the guidelines, thereby helping to reduce the inhomogeneity of provision.
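    As a generic illustration of the kind of psychometric checks reported above (concurrent/construct validity via correlations, plus a two-component structure), the sketch below computes Spearman correlations between a complexity-like score and disability- and comorbidity-like scores, and the leading eigenvalues of an item correlation matrix as a crude stand-in for the factor analysis. The data are simulated and bear no relation to the study's results.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Simulated scores standing in for the instruments named above.
    rng = np.random.default_rng(42)
    n = 300
    complexity = rng.integers(0, 23, n).astype(float)            # RCS-E-like total
    disability = 100 - 3.5 * complexity + rng.normal(0, 10, n)   # Barthel-like score
    comorbidity = 0.4 * complexity + rng.normal(0, 2, n)         # CIRS-like score

    rho_dis, p_dis = spearmanr(complexity, disability)
    rho_com, p_com = spearmanr(complexity, comorbidity)
    print(f"rho vs disability:  {rho_dis:.2f} (p = {p_dis:.1e})")
    print(f"rho vs comorbidity: {rho_com:.2f} (p = {p_com:.1e})")

    # Crude component analysis of four correlated "items" of the scale.
    items = complexity[:, None] + rng.normal(0, 2, (n, 4))
    eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
    print("share of variance in the two leading components:", eigvals[:2] / eigvals.sum())
    ```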

  16. Dynamic programming methods for concurrent design and dynamic allocation of vehicles embedded in a system-of-systems

    NASA Astrophysics Data System (ADS)

    Nusawardhana

    2007-12-01

    Recent developments indicate a changing perspective on how systems or vehicles should be designed. Such a transition comes from the way decision makers in defense related agencies address complex problems. Complex problems are now often posed in terms of the capabilities desired, rather than in terms of requirements for a single system. As a result, the way to provide a set of capabilities is through a collection of several individual, independent systems. This collection of individual independent systems is often referred to as a "System of Systems" (SoS). Because of the independent nature of the constituent systems in an SoS, approaches to design an SoS, and more specifically, approaches to design a new system as a member of an SoS, will likely be different from the traditional design approaches for complex, monolithic (meaning the constituent parts have no ability for independent operation) systems. Because a system of systems evolves over time, this simultaneous system design and resource allocation problem should be investigated in a dynamic context. Such dynamic optimization problems are similar to conventional control problems. However, this research considers problems which not only seek optimizing policies but also seek the proper system or vehicle to operate under these policies. This thesis presents a framework and a set of analytical tools to solve a class of SoS problems that involves the simultaneous design of a new system and allocation of the new system along with existing systems. Such a class of problems belongs to the problems of concurrent design and control of new systems, with solutions consisting of both an optimal system design and an optimal control strategy. Rigorous mathematical arguments show that the proposed framework solves the concurrent design and control problems. Many results exist for dynamic optimization problems of linear systems. In contrast, results on optimal nonlinear dynamic optimization problems are rare. The proposed framework is equipped with a set of analytical tools to solve several cases of nonlinear optimal control problems: continuous- and discrete-time nonlinear problems with applications to both optimal regulation and tracking. These tools are useful when mathematical descriptions of dynamic systems are available. In the absence of such a mathematical model, it is often necessary to derive a solution based on computer simulation. For this case, a set of parameterized decisions may constitute a solution. This thesis presents a method to adjust these parameters based on the principle of simultaneous perturbation stochastic approximation using continuous measurements. The set of tools developed here mostly employs the methods of exact dynamic programming. However, due to the complexity of SoS problems, this research also develops suboptimal solution approaches, collectively recognized as approximate dynamic programming solutions, for large scale problems. The thesis presents, explores, and solves problems from the airline industry, in which a new aircraft is to be designed and allocated along with an existing fleet of aircraft. Because the life cycle of an aircraft is on the order of 10 to 20 years, this problem is to be addressed dynamically so that the new aircraft design is the best design for the fleet over a given time horizon.
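    For the simulation-based tuning step mentioned above, the sketch below shows a generic simultaneous perturbation stochastic approximation (SPSA) update, in which the gradient of a noisy simulation objective is estimated from only two runs per iteration regardless of the number of parameters. The placeholder simulate function, gain sequences and target values are illustrative and not drawn from the thesis.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(theta):
        # stand-in for a stochastic simulation returning a noisy cost
        return np.sum((theta - np.array([1.0, -2.0, 0.5]))**2) + 0.1 * rng.normal()

    theta = np.zeros(3)
    for k in range(1, 501):
        a_k = 0.1 / k**0.602                 # standard SPSA gain sequences
        c_k = 0.1 / k**0.101
        delta = rng.choice([-1.0, 1.0], size=theta.size)   # Bernoulli +/-1 perturbation
        y_plus = simulate(theta + c_k * delta)
        y_minus = simulate(theta - c_k * delta)
        g_hat = (y_plus - y_minus) / (2 * c_k * delta)     # two-run gradient estimate
        theta = theta - a_k * g_hat

    print("tuned parameters:", theta)        # should approach [1, -2, 0.5]
    ```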

  17. Flow simulations about steady-complex and unsteady moving configurations using structured-overlapped and unstructured grids

    NASA Technical Reports Server (NTRS)

    Newman, James C., III

    1995-01-01

    The limiting factor in simulating flows past realistic configurations of interest has been the discretization of the physical domain on which the governing equations of fluid flow may be solved. In an attempt to circumvent this problem, many Computational Fluid Dynamic (CFD) methodologies that are based on different grid generation and domain decomposition techniques have been developed. However, due to the costs involved and expertise required, very few comparative studies between these methods have been performed. In the present work, the two CFD methodologies which show the most promise for treating complex three-dimensional configurations as well as unsteady moving boundary problems are evaluated. These are namely the structured-overlapped and the unstructured grid schemes. Both methods use a cell centered, finite volume, upwind approach. The structured-overlapped algorithm uses an approximately factored, alternating direction implicit scheme to perform the time integration, whereas the unstructured algorithm uses an explicit Runge-Kutta method. To examine the accuracy, efficiency, and limitations of each scheme, they are applied to the same steady complex multicomponent configurations and unsteady moving boundary problems. The steady complex cases consist of computing the subsonic flow about a two-dimensional high-lift multielement airfoil and the transonic flow about a three-dimensional wing/pylon/finned store assembly. The unsteady moving boundary problems are a forced pitching oscillation of an airfoil in a transonic freestream and a two-dimensional, subsonic airfoil/store separation sequence. Accuracy was assessed through the comparison of computed and experimentally measured pressure coefficient data on several of the wing/pylon/finned store assembly's components and at numerous angles-of-attack for the pitching airfoil. From this study, it was found that both the structured-overlapped and the unstructured grid schemes yielded flow solutions of comparable accuracy for these simulations. This study also indicated that, overall, the structured-overlapped scheme was slightly more CPU efficient than the unstructured approach.

  18. Practical modeling approaches for geological storage of carbon dioxide.

    PubMed

    Celia, Michael A; Nordbotten, Jan M

    2009-01-01

    The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, with the overall mathematical description of the complete system becoming very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to model geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.

  19. Proper care for the dying: a critical public issue.

    PubMed Central

    Crispell, K R; Gomez, C F

    1987-01-01

    The ability of the medical profession to sustain life, or more appropriately, to prolong dying, in patients with terminal illness, creates a most complex and controversial situation for all involved: the patient, if mentally alert; the patient's family; and the medical care team including physicians, nurses and attendants. This situation is especially complex in large acute care hospitals where medical and nursing students, residents and house officers receive advanced medical training. A major problem, prolonging the dying of the terminally ill, with its medical, legal, ethical and economic complexities now confronts American society. The problem is particularly acute in teaching hospitals, in which one finds a disproportionate number of terminally ill patients. The ability to work at these questions as a community rather than as adversaries will determine much about the ability of the health care system to respect the dignity and autonomy of those who seek aid and comfort when faced with serious illness and impending death. Better communication between the physicians, health care providers, the lawyers and ethicists must be developed in order to solve these problems. Over the next ten years society and our elected representatives will be making very demanding decisions about the use of the health dollar. One possible way to prevent increasing costs is to reach significant agreement on the proper care of the dying. Proper care for the dying is being considered, discussed, and evaluated by very thoughtful people. It is not governments which should decide who is to live or who is to die. There is the serious problem of the 'slippery slope' to euthanasia by omission if cost containment becomes the major force in formulating policy on the proper care of the dying. PMID:3612698

  20. Terahertz reflection imaging using Kirchhoff migration.

    PubMed

    Dorney, T D; Johnson, J L; Van Rudd, J; Baraniuk, R G; Symes, W W; Mittleman, D M

    2001-10-01

    We describe a new imaging method that uses single-cycle pulses of terahertz (THz) radiation. This technique emulates data-collection and image-processing procedures developed for geophysical prospecting and is made possible by the availability of fiber-coupled THz receiver antennas. We use a simple migration procedure to solve the inverse problem; this permits us to reconstruct the location and shape of targets. These results demonstrate the feasibility of the THz system as a test-bed for the exploration of new seismic processing methods involving complex model systems.
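    The sketch below illustrates the migration idea referred to above in its simplest zero-offset (diffraction-stack) form: each candidate image point is formed by summing the recorded traces along the travel-time curve a scatterer at that point would produce. The synthetic data, wave speed and geometry are arbitrary placeholders, not the paper's THz measurements or exact processing chain.

    ```python
    import numpy as np

    c = 1.0                                   # wave speed (arbitrary units)
    xr = np.linspace(-1.0, 1.0, 41)           # receiver/emitter positions
    t = np.linspace(0.0, 4.0, 800)            # time axis
    dt = t[1] - t[0]

    # synthetic zero-offset data: a band-limited echo from a scatterer at (0.2, 0.8)
    xs, zs = 0.2, 0.8
    traces = np.zeros((xr.size, t.size))
    for i, x in enumerate(xr):
        tt = 2.0 * np.hypot(x - xs, zs) / c   # two-way travel time to this receiver
        traces[i] = np.exp(-((t - tt) / 0.02)**2)

    # Kirchhoff (diffraction-stack) migration over a grid of image points
    xi = np.linspace(-1.0, 1.0, 81)
    zi = np.linspace(0.1, 1.5, 71)
    image = np.zeros((zi.size, xi.size))
    for iz, z in enumerate(zi):
        for ix, x0 in enumerate(xi):
            tt = 2.0 * np.hypot(xr - x0, z) / c                      # travel time per receiver
            idx = np.clip(np.round(tt / dt).astype(int), 0, t.size - 1)
            image[iz, ix] = traces[np.arange(xr.size), idx].sum()    # sum along the curve

    iz, ix = np.unravel_index(np.argmax(image), image.shape)
    print("imaged scatterer at x=%.2f, z=%.2f" % (xi[ix], zi[iz]))
    ```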

  1. Acute severe asthma presenting in late pregnancy.

    PubMed

    Holland, S M; Thomson, K D

    2006-01-01

    Asthma is the commonest pre-existing medical condition to complicate pregnancy. Acute severe asthma in pregnancy is rare, but poses difficult problems. In particular, the decision about when and where to deliver the fetus is complex, since maternal response to asthma treatment is unpredictable. We report the successful management of a parturient presenting with acute severe asthma at 37 weeks' gestation. The controversies involved and the importance of adopting a multi-disciplinary team approach to optimise maternal and neonatal outcomes are discussed.

  2. Novel approach to recurrent cavoatrial renal cell carcinoma.

    PubMed

    Alejo, Jennifer L; George, Timothy J; Beaty, Claude A; Allaf, Mohamad E; Black, James H; Shah, Ashish S

    2012-05-01

    Renal cell carcinoma (RCC) with cavoatrial extension is a rare and complex problem. Complete resection is difficult but correlates with favorable patient outcomes. We present 2 cases of successful reoperative resections of recurrent RCC in patients with level III-IV cavoatrial involvement. We used a thoracoabdominal approach, peripheral cannulation, and hypothermic circulatory arrest. We advocate this novel approach as a successful means of avoiding a more difficult reoperation. Copyright © 2012 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  3. Parallel Architectures and Parallel Algorithms for Integrated Vision Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi

    1989-01-01

    Computer vision is regarded as one of the most complex and computationally intensive problems. An integrated vision system (IVS) is a system that uses vision algorithms from all levels of processing to perform a high-level application (e.g., object recognition). An IVS normally involves algorithms from low level, intermediate level, and high level vision. Designing parallel architectures for vision systems is of tremendous interest to researchers. Several issues are addressed in parallel architectures and parallel algorithms for integrated vision systems.

  4. The Parallel Decomposition of Linear Programs

    DTIC Science & Technology

    1989-11-01

    ...The message length is 16*(3+86) = 1424 bytes for all the test problems. Sending a message involves loading it into a buffer and copying the buffer into the proper... Message sizes (Table 4.2): primal point and ray, 16*(3+r); dual point or ray, 8*(4+r). Subproblems have one mailbox for... model, i.e., to disaggregate. For instance, "dairy products" becomes milk, cheese, yogurt and ice cream. Adding complexity allows a model to give a more...

  5. Using Neural Networks in the Mapping of Mixed Discrete/Continuous Design Spaces With Application to Structural Design

    DTIC Science & Technology

    1994-02-01

    ...desired that the problem to which the design space mapping techniques were applied be easily analyzed, yet provide a design space with realistic complexity... consistent fully stressed solution. In order to reduce the computational expense required to optimize design spaces, neural networks... employed in this study. Some of the issues involved in using neural networks to do design space mapping are how to configure the neural network, how much...

  6. A return mapping algorithm for isotropic and anisotropic plasticity models using a line search method

    DOE PAGES

    Scherzinger, William M.

    2016-05-01

    The numerical integration of constitutive models in computational solid mechanics codes allows for the solution of boundary value problems involving complex material behavior. Metal plasticity models, in particular, have been instrumental in the development of these codes. Here, most plasticity models implemented in computational codes use an isotropic von Mises yield surface. The von Mises, or J2, yield surface has a simple predictor-corrector algorithm - the radial return algorithm - to integrate the model.
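    For reference, the sketch below implements the textbook radial return (elastic predictor / plastic corrector) step for von Mises (J2) plasticity with linear isotropic hardening. The material constants and the driving strain are illustrative, and the anisotropic models and line-search strategy developed in the paper are not reproduced.

    ```python
    import numpy as np

    E, nu, sigma_y0, H = 200e3, 0.3, 250.0, 1.0e3     # MPa, illustrative constants
    mu = E / (2 * (1 + nu))                           # shear modulus
    kappa = E / (3 * (1 - 2 * nu))                    # bulk modulus

    def radial_return(eps, eps_p, alpha):
        """One elastic predictor / plastic corrector step.
        eps: total strain, eps_p: plastic strain, alpha: equivalent plastic strain."""
        eps_e = eps - eps_p                           # trial elastic strain
        vol = np.trace(eps_e) * np.eye(3) / 3.0
        s_trial = 2 * mu * (eps_e - vol)              # trial deviatoric stress
        sigma_vol = 3 * kappa * vol                   # hydrostatic stress
        norm_s = np.linalg.norm(s_trial)
        f_trial = norm_s - np.sqrt(2.0 / 3.0) * (sigma_y0 + H * alpha)
        if f_trial <= 0.0:                            # elastic step, no correction
            return sigma_vol + s_trial, eps_p, alpha
        dgamma = f_trial / (2 * mu + 2.0 / 3.0 * H)   # closed-form consistency solve
        n = s_trial / norm_s                          # radial return direction
        s = s_trial - 2 * mu * dgamma * n
        eps_p = eps_p + dgamma * n
        alpha = alpha + np.sqrt(2.0 / 3.0) * dgamma
        return sigma_vol + s, eps_p, alpha

    # drive a uniaxial strain state past yield
    eps = np.diag([2.0e-3, -0.6e-3, -0.6e-3])
    sigma, eps_p, alpha = radial_return(eps, np.zeros((3, 3)), 0.0)
    print("stress:\n", sigma, "\nplastic strain norm:", np.linalg.norm(eps_p))
    ```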

  7. Current trends in tendinopathy: consensus of the ESSKA basic science committee. Part I: biology, biomechanics, anatomy and an exercise-based approach.

    PubMed

    Abat, F; Alfredson, H; Cucchiarini, M; Madry, H; Marmotti, A; Mouton, C; Oliveira, J M; Pereira, H; Peretti, G M; Romero-Rodriguez, D; Spang, C; Stephen, J; van Bergen, C J A; de Girolamo, L

    2017-12-01

    Chronic tendinopathies represent a major problem in the clinical practice of sports orthopaedic surgeons, sports doctors and other health professionals involved in the treatment of athletes and patients that perform repetitive actions. The lack of consensus relative to the diagnostic tools and treatment modalities represents a management dilemma for these professionals. With this review, the purpose of the ESSKA Basic Science Committee is to establish guidelines for understanding, diagnosing and treating this complex pathology.

  8. Mobile robot exploration and navigation of indoor spaces using sonar and vision

    NASA Technical Reports Server (NTRS)

    Kortenkamp, David; Huber, Marcus; Koss, Frank; Belding, William; Lee, Jaeho; Wu, Annie; Bidlack, Clint; Rodgers, Seth

    1994-01-01

    Integration of skills into an autonomous robot that performs a complex task is described. Time constraints prevented complete integration of all the described skills. The biggest problem was tuning the sensor-based region-finding algorithm to the environment involved. Since localization depended on matching regions found with the a priori map, the robot became lost very quickly. If the low level sensing of the world is not working, then high level reasoning or map making will be unsuccessful.

  9. Enantioselective Organocatalytic α-Fluorination of Cyclic Ketones

    PubMed Central

    Kwiatkowski, Piotr; Beeson, Teresa D.; Conrad, Jay C.

    2011-01-01

    The first highly enantioselective α-fluorination of ketones using organocatalysis has been accomplished. The long-standing problem of enantioselective ketone α-fluorination via enamine activation has been overcome via high-throughput evaluation of a new library of amine catalysts. The optimal system, a primary amine functionalized Cinchona alkaloid, allows the direct and asymmetric α-fluorination of a variety of carbo- and heterocyclic substrates. Furthermore, this protocol also provides diastereo-, regio- and chemoselective catalyst control in fluorinations involving complex carbonyl systems. PMID:21247133

  10. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905
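    The sketch below shows, on a deliberately tiny toy problem, the shape of the linear program that arises once the slab has been discretized into candidate discontinuities: the unknowns are non-negative rotation rates on the candidate yield-lines, the objective is the internal dissipation, and the constraints enforce compatibility and unit external work. The compatibility matrix, dissipation coefficients and work coefficients are invented numbers, not a real DLO discretization.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    n_disc = 6                                                # candidate discontinuities
    g = np.array([1.0, 1.2, 0.8, 1.5, 1.1, 0.9])              # dissipation per unit rotation (toy)
    B = np.array([[1.0, -1.0,  0.0, 0.5, 0.0, -0.5],          # toy compatibility rows
                  [0.0,  1.0, -1.0, 0.0, 0.5, -0.5]])
    f_ext = np.array([0.3, 0.4, 0.2, 0.5, 0.3, 0.4])          # external work per unit rotation (toy)

    # minimize dissipation subject to compatibility and unit external work
    A_eq = np.vstack([B, f_ext])
    b_eq = np.concatenate([np.zeros(B.shape[0]), [1.0]])
    res = linprog(c=g, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n_disc, method="highs")

    print("collapse-load multiplier (toy problem):", res.fun)
    print("active discontinuities:", np.where(res.x > 1e-8)[0])
    ```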

  11. New Approaches to HSCT Multidisciplinary Design and Optimization

    NASA Technical Reports Server (NTRS)

    Schrage, Daniel P.; Craig, James I.; Fulton, Robert E.; Mistree, Farrokh

    1999-01-01

    New approaches to MDO have been developed and demonstrated during this project on a particularly challenging aeronautics problem: HSCT Aeroelastic Wing Design. To tackle this problem required the integration of resources and collaboration from three Georgia Tech laboratories: ASDL, SDL, and PPRL, along with close coordination and participation from industry. Its success can also be attributed to the close interaction and involvement of fellows from the NASA Multidisciplinary Analysis and Optimization (MAO) program, which was going on in parallel, and provided additional resources to work the very complex, multidisciplinary problem, along with the methods being developed. The development of the Integrated Design Engineering Simulator (IDES) and its initial demonstration is a necessary first step in transitioning the methods and tools developed to larger industrial-sized problems of interest. It also provides a framework for the implementation and demonstration of the methodology. Attachment: Appendix A - List of publications. Appendix B - Year 1 report. Appendix C - Year 2 report. Appendix D - Year 3 report. Appendix E - accompanying CDROM.

  12. A New Classification of Endodontic-Periodontal Lesions

    PubMed Central

    Al-Fouzan, Khalid S.

    2014-01-01

    The interrelationship between periodontal and endodontic disease has always aroused confusion, queries, and controversy. Differentiating between a periodontal and an endodontic problem can be difficult. A symptomatic tooth may have pain of periodontal and/or pulpal origin. The nature of that pain is often the first clue in determining the etiology of such a problem. Radiographic and clinical evaluation can help clarify the nature of the problem. In some cases, the influence of pulpal pathology may cause the periodontal involvement and vice versa. The simultaneous existence of pulpal problems and inflammatory periodontal disease can complicate diagnosis and treatment planning. An endo-perio lesion can have a varied pathogenesis which ranges from simple to relatively complex one. The differential diagnosis of endodontic and periodontal diseases can sometimes be difficult, but it is of vital importance to make a correct diagnosis for providing the appropriate treatment. This paper aims to discuss a modified clinical classification to be considered for accurately diagnosing and treating endo-perio lesion. PMID:24829580

  13. A new classification of endodontic-periodontal lesions.

    PubMed

    Al-Fouzan, Khalid S

    2014-01-01

    The interrelationship between periodontal and endodontic disease has always aroused confusion, queries, and controversy. Differentiating between a periodontal and an endodontic problem can be difficult. A symptomatic tooth may have pain of periodontal and/or pulpal origin. The nature of that pain is often the first clue in determining the etiology of such a problem. Radiographic and clinical evaluation can help clarify the nature of the problem. In some cases, the influence of pulpal pathology may cause the periodontal involvement and vice versa. The simultaneous existence of pulpal problems and inflammatory periodontal disease can complicate diagnosis and treatment planning. An endo-perio lesion can have a varied pathogenesis which ranges from simple to relatively complex one. The differential diagnosis of endodontic and periodontal diseases can sometimes be difficult, but it is of vital importance to make a correct diagnosis for providing the appropriate treatment. This paper aims to discuss a modified clinical classification to be considered for accurately diagnosing and treating endo-perio lesion.

  14. A method of boundary equations for unsteady hyperbolic problems in 3D

    NASA Astrophysics Data System (ADS)

    Petropavlovsky, S.; Tsynkov, S.; Turkel, E.

    2018-07-01

    We consider interior and exterior initial boundary value problems for the three-dimensional wave (d'Alembert) equation. First, we reduce a given problem to an equivalent operator equation with respect to unknown sources defined only at the boundary of the original domain. In doing so, Huygens' principle enables us to obtain the operator equation in a form that involves only finite and non-increasing pre-history of the solution in time. Next, we discretize the resulting boundary equation and solve it efficiently by the method of difference potentials (MDP). The overall numerical algorithm handles boundaries of general shape using regular structured grids with no deterioration of accuracy. For long simulation times it offers sub-linear complexity with respect to the grid dimension, i.e., it is asymptotically cheaper than the cost of a typical explicit scheme. In addition, our algorithm allows one to share the computational cost between multiple similar problems. On multi-processor (multi-core) platforms, it benefits from what can be considered an effective parallelization in time.

  15. Operational research as implementation science: definitions, challenges and research priorities.

    PubMed

    Monks, Thomas

    2016-06-06

    Operational research (OR) is the discipline of using models, either quantitative or qualitative, to aid decision-making in complex implementation problems. The methods of OR have been used in healthcare since the 1950s in diverse areas such as emergency medicine and the interface between acute and community care; hospital performance; scheduling and management of patient home visits; scheduling of patient appointments; and many other complex implementation problems of an operational or logistical nature. To date, there has been limited debate about the role that operational research should take within implementation science. I detail three such roles for OR all grounded in upfront system thinking: structuring implementation problems, prospective evaluation of improvement interventions, and strategic reconfiguration. Case studies from mental health, emergency medicine, and stroke care are used to illustrate each role. I then describe the challenges for applied OR within implementation science at the organisational, interventional, and disciplinary levels. Two key challenges include the difficulty faced in achieving a position of mutual understanding between implementation scientists and research users and a stark lack of evaluation of OR interventions. To address these challenges, I propose a research agenda to evaluate applied OR through the lens of implementation science, the liberation of OR from the specialist research and consultancy environment, and co-design of models with service users. Operational research is a mature discipline that has developed a significant volume of methodology to improve health services. OR offers implementation scientists the opportunity to do more upfront system thinking before committing resources or taking risks. OR has three roles within implementation science: structuring an implementation problem, prospective evaluation of implementation problems, and a tool for strategic reconfiguration of health services. Challenges facing OR as implementation science include limited evidence and evaluation of impact, limited service user involvement, a lack of managerial awareness, effective communication between research users and OR modellers, and availability of healthcare data. To progress the science, a focus is needed in three key areas: evaluation of OR interventions, embedding the knowledge of OR in health services, and educating OR modellers about the aims and benefits of service user involvement.

  16. Crystallization of bi-functional ligand protein complexes.

    PubMed

    Antoni, Claudia; Vera, Laura; Devel, Laurent; Catalani, Maria Pia; Czarny, Bertrand; Cassar-Lajeunesse, Evelyn; Nuti, Elisa; Rossello, Armando; Dive, Vincent; Stura, Enrico Adriano

    2013-06-01

    Homodimerization is important in signal transduction and can play a crucial role in many other biological systems. To obtain structural information for the design of molecules able to control the signalling pathways, the proteins involved have to be crystallized in complex with ligands that induce dimerization. Bi-functional drugs have been generated by linking two ligands together chemically and the relative crystallizability of complexes with mono-functional and bi-functional ligands has been evaluated. There are problems associated with crystallization with such ligands, but overall, the advantages appear to be greater than the drawbacks. The study involves two matrix metalloproteinases, MMP-12 and MMP-9. Using flexible and rigid linkers we show that it is possible to control the crystal packing and that by changing the ligand-enzyme stoichiometric ratio, one can toggle between having one bi-functional ligand binding to two enzymes and having the same ligand bound to each enzyme. The nature of the linker and its point of attachment on the ligand can be varied to aid crystallization, and such variations can also provide valuable structural information about the interactions made by the linker with the protein. We report here the crystallization and structure determination of seven ligand-dimerized complexes. These results suggest that the use of bi-functional drugs can be extended beyond the realm of protein dimerization to include all drug design projects. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Decoding the Heart through Next Generation Sequencing Approaches.

    PubMed

    Pawlak, Michal; Niescierowicz, Katarzyna; Winata, Cecilia Lanny

    2018-06-07

    Vertebrate organs develop through a complex process which involves interaction between multiple signaling pathways at the molecular, cell, and tissue levels. Heart development is an example of such a complex process which, when disrupted, results in congenital heart disease (CHD). This complexity necessitates a holistic approach which allows the visualization of genome-wide interaction networks, as opposed to assessment of limited subsets of factors. Genomics offers a powerful solution to address the problem of biological complexity by enabling the observation of molecular processes at a genome-wide scale. The emergence of next generation sequencing (NGS) technology has facilitated the expansion of genomics, increasing its output capacity and applicability in various biological disciplines. The application of NGS in various aspects of heart biology has resulted in new discoveries, generating novel insights into this field of study. Here we review the contributions of NGS technology to the understanding of heart development and its disruption, reflected in CHD, and discuss how emerging NGS-based methodologies can contribute to the further understanding of heart repair.

  18. Towards the Rational Design of MRI Contrast Agents: Electron Spin Relaxation Is Largely Unaffected by the Coordination Geometry of Gadolinium(III)–DOTA-Type Complexes

    PubMed Central

    Bean, Jonathan F.; Clarkson, Robert B.; Helm, Lothar; Moriggi, Loïck; Sherry, A. Dean

    2009-01-01

    Electron-spin relaxation is one of the determining factors in the efficacy of MRI contrast agents. Of all the parameters involved in determining relaxivity, it remains the least well understood, particularly as it relates to the structure of the complex. One of the reasons for the poor understanding of electron-spin relaxation is that it is closely related to the ligand-field parameters of the Gd3+ ion that forms the basis of MRI contrast agents, and these complexes generally exhibit a structural isomerism that inherently complicates the study of electron-spin relaxation. We have recently shown that two DOTA-type ligands could be synthesised that, when coordinated to Gd3+, adopt well-defined coordination geometries and are not subject to the problems of intramolecular motion seen in other complexes. The EPR properties of these two chelates were studied and the results examined with theory to probe their electron-spin relaxation properties. PMID:18283704

  19. Locality for quantum systems on graphs depends on the number field

    NASA Astrophysics Data System (ADS)

    Hall, H. Tracy; Severini, Simone

    2013-07-01

    Adapting a definition of Aaronson and Ambainis (2005 Theory Comput. 1 47-79), we call a quantum dynamics on a digraph saturated Z-local if the nonzero transition amplitudes specifying the unitary evolution are in exact correspondence with the directed edges (including loops) of the digraph. This idea appears recurrently in a variety of contexts including angular momentum, quantum chaos, and combinatorial matrix theory. Complete characterization of the digraph properties that allow such a process to exist is a long-standing open question that can also be formulated in terms of minimum rank problems. We prove that saturated Z-local dynamics involving complex amplitudes occur on a proper superset of the digraphs that allow restriction to the real numbers or, even further, the rationals. Consequently, among these fields, complex numbers guarantee the largest possible choice of topologies supporting a discrete quantum evolution. A similar construction separates complex numbers from the skew field of quaternions. The result proposes a concrete ground for distinguishing between complex and quaternionic quantum mechanics.
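    A minimal sketch (not from the paper) of the defining property may help: a dynamics is saturated Z-local when the nonzero pattern of the unitary coincides exactly with the digraph's directed edges, including loops. The 3-vertex digraph and the complex unitary below are illustrative assumptions, not taken from the article.

    ```python
    import numpy as np

    def is_saturated_local(U, adjacency, tol=1e-12):
        """True iff the nonzero pattern of U matches the digraph adjacency exactly."""
        return np.array_equal(np.abs(U) > tol, adjacency.astype(bool))

    # Hypothetical 3-vertex digraph: adjacency[i, j] marks the edge carrying amplitude U[i, j].
    A = np.array([[1, 1, 0],
                  [0, 0, 1],
                  [1, 1, 0]], dtype=bool)

    # A complex unitary whose support matches A exactly.
    U = np.array([[1 / np.sqrt(2),  1 / np.sqrt(2), 0],
                  [0,               0,              1j],
                  [1 / np.sqrt(2), -1 / np.sqrt(2), 0]], dtype=complex)

    print(np.allclose(U.conj().T @ U, np.eye(3)))   # unitarity check -> True
    print(is_saturated_local(U, A))                 # support matches the digraph -> True
    ```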

  20. Emergence of Scale-Free Syntax Networks

    NASA Astrophysics Data System (ADS)

    Corominas-Murtra, Bernat; Valverde, Sergi; Solé, Ricard V.

    The evolution of human language allowed the efficient propagation of nongenetic information, thus creating a new form of evolutionary change. Language development in children offers the opportunity of exploring the emergence of such a complex communication system and provides a window to understanding the transition from protolanguage to language. Here we present the first analysis of the emergence of syntax in terms of complex networks. A previously unreported, sharp transition is shown to occur around two years of age, from a (pre-syntactic) tree-like structure to a scale-free, small-world syntax network. The observed combinatorial patterns provide valuable data for understanding the nature of the cognitive processes involved in the acquisition of syntax, introducing a new ingredient for understanding the possible biological endowment of human beings that results in the emergence of complex language. We explore this problem by using a minimal, data-driven model that is able to capture several statistical traits, although some key features related to the emergence of syntactic complexity display important divergences.

  1. From Astrochemistry to prebiotic chemistry? An hypothetical approach toward Astrobiology

    NASA Astrophysics Data System (ADS)

    Le Sergeant d'Hendecourt, L.; Danger, G.

    2012-12-01

    We present in this paper a general perspective on the evolution of molecular complexity, as observed from an astrophysicist's point of view, and its possible relation to the problem of the origin of life on Earth. Based on the cosmic abundances of the elements and the molecular composition of life as we know it, we propose that life cannot really be based on other elements. We discuss where the necessary molecular complexity is built up in astrophysical environments, namely within inter/circumstellar solid-state materials known as ``grains''. Considerations based on non-directed laboratory experiments, which must be further extended into the prebiotic domain, lead to the hypothesis that, while the chemistry at the origin of life may indeed be a rather universal and deterministic phenomenon once molecular complexity is established, the chemical evolution that generated the first prebiotic reactions involving autoreplication must be treated with a systemic approach, because of the strong contingency imposed by the complex local environment(s) and associated processes in which these chemical systems evolved.

  2. A Chebyshev Collocation Method for Moving Boundaries, Heat Transfer, and Convection During Directional Solidification

    NASA Technical Reports Server (NTRS)

    Zhang, Yiqiang; Alexander, J. I. D.; Ouazzani, J.

    1994-01-01

    Free and moving boundary problems require the simultaneous solution of unknown field variables and the boundaries of the domains on which these variables are defined. There are many technologically important processes that lead to moving boundary problems associated with fluid surfaces and solid-fluid boundaries. These include crystal growth, metal alloy and glass solidification, melting, and flame propagation. The directional solidification of semiconductor crystals by the Bridgman-Stockbarger method is a typical example of such a complex process. A numerical model of this growth method must solve the appropriate heat, mass and momentum transfer equations and determine the location of the melt-solid interface. In this work, a Chebyshev pseudospectral collocation method is adapted to the problem of directional solidification. Implementation involves a solution algorithm that combines domain decomposition, a finite-difference-preconditioned conjugate minimum residual method, and a Picard-type iterative scheme.
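    As a minimal illustration of the pseudospectral machinery (a textbook construction, not the authors' solidification code), the sketch below builds the standard Chebyshev differentiation matrix on Gauss-Lobatto points and differentiates a smooth test function with spectral accuracy.

    ```python
    import numpy as np

    def cheb(N):
        """Chebyshev differentiation matrix D and Gauss-Lobatto nodes x (N+1 points on [-1, 1])."""
        if N == 0:
            return np.zeros((1, 1)), np.array([1.0])
        x = np.cos(np.pi * np.arange(N + 1) / N)
        c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
        X = np.tile(x, (N + 1, 1)).T
        dX = X - X.T
        D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))   # off-diagonal entries
        D -= np.diag(D.sum(axis=1))                       # diagonal entries: negative row sums
        return D, x

    # Differentiate u(x) = exp(x) sin(5x) and compare with the exact derivative.
    D, x = cheb(20)
    u = np.exp(x) * np.sin(5 * x)
    du_exact = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
    print(np.max(np.abs(D @ u - du_exact)))   # spectral accuracy: the error is already tiny at N = 20
    ```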

  3. The contact sport of rough surfaces

    NASA Astrophysics Data System (ADS)

    Carpick, Robert W.

    2018-01-01

    Describing the way two surfaces touch and make contact may seem simple, but it is not. Fully describing the elastic deformation of ideally smooth contacting bodies, under even low applied pressure, involves second-order partial differential equations and fourth-rank elastic constant tensors. For more realistic rough surfaces, the problem becomes a multiscale exercise in surface-height statistics, even before including complex phenomena such as adhesion, plasticity, and fracture. A recent research competition, the “Contact Mechanics Challenge” (1), was designed to test various approximate methods for solving this problem. A hypothetical rough surface was generated, and the community was invited to model contact with this surface with competing theories for the calculation of properties, including contact area and pressure. A supercomputer-generated numerical solution was kept secret until competition entries were received. The comparison of results (2) provides insights into the relative merits of competing models and even experimental approaches to the problem.

  4. Direct heuristic dynamic programming for damping oscillations in a large power system.

    PubMed

    Lu, Chao; Si, Jennie; Xie, Xiaorong

    2008-08-01

    This paper applies a neural-network-based approximate dynamic programming method, namely, the direct heuristic dynamic programming (direct HDP), to a large power system stability control problem. The direct HDP is a learning- and approximation-based approach to addressing nonlinear coordinated control under uncertainty. One of the major design parameters, the controller learning objective function, is formulated to directly account for network-wide low-frequency oscillation with the presence of nonlinearity, uncertainty, and coupling effect among system components. Results include a novel learning control structure based on the direct HDP with applications to two power system problems. The first case involves static var compensator supplementary damping control, which is used to provide a comprehensive evaluation of the learning control performance. The second case aims at addressing a difficult complex system challenge by providing a new solution to a large interconnected power network oscillation damping control problem that frequently occurs in the China Southern Power Grid.

  5. Theoretical Studies of the Kinetics of First-Order Phase Transitions.

    NASA Astrophysics Data System (ADS)

    Zheng, Qiang

    This thesis involves theoretical studies of the kinetics of orderings in three classes of systems. The first class involves problems of phase separation in which the order parameter is conserved, such as occurs in the binary alloy Al-Zn. A theory is developed for the late stages of phase separation in the droplet regime for two-dimensional systems, namely, Ostwald ripening in two dimensions. The theory considers droplet correlations, which were neglected before, by a proper treatment of the screening effect of the correlations. This correlation effect is found not to alter the scaling features of phase separation, but it significantly changes the shape of the droplet-size distribution function. Further experiments and computer simulations are needed before this long-standing subject may be closed. A second class of problems involves a study of finite-size effects on domain growth described by the Allen-Cahn dynamics. Based on a theoretical approach of Ohta, Jasnow, and Kawasaki, the explicit scaling functions for the scattering intensity for hypercubes and films are obtained. These results are for cases in which the order parameter is not conserved, such as in an order-disorder transition in alloys. These studies will be relevant to the experimental and computer simulation research projects currently being carried out in the United States and Europe. The last class of problems involves orderings in strongly correlated systems, namely, the growth of Breath Figures. A special feature of this class of problems is the coalescence effect. A theoretical model is proposed which can handle the two growth mechanisms, individual droplet growth and coalescence, simultaneously. Under certain approximations, the droplet-size distribution function is obtained analytically and is in qualitative agreement with computer simulations. Our model also suggests that there may be an interesting relationship between the growth of Breath Figures and a geometric structure (ultrametricity) of general complex systems.

  6. Abstraction of an Affective-Cognitive Decision Making Model Based on Simulated Behaviour and Perception Chains

    NASA Astrophysics Data System (ADS)

    Sharpanskykh, Alexei; Treur, Jan

    Employing rich internal agent models of actors in large-scale socio-technical systems often results in scalability issues. The problem addressed in this paper is how to improve computational properties of a complex internal agent model, while preserving its behavioral properties. The problem is addressed for the case of an existing affective-cognitive decision making model instantiated for an emergency scenario. For this internal decision model an abstracted behavioral agent model is obtained, which ensures a substantial increase of the computational efficiency at the cost of approximately 1% behavioural error. The abstraction technique used can be applied to a wide range of internal agent models with loops, for example, involving mutual affective-cognitive interactions.

  7. Super resolution reconstruction of infrared images based on classified dictionary learning

    NASA Astrophysics Data System (ADS)

    Liu, Fei; Han, Pingli; Wang, Yi; Li, Xuan; Bai, Lu; Shao, Xiaopeng

    2018-05-01

    Infrared images always suffer from low-resolution problems resulting from the limitations of imaging devices. An economical approach to combat this problem involves reconstructing high-resolution images by reasonable methods without upgrading the devices. Inspired by compressed sensing theory, this study presents and demonstrates a Classified Dictionary Learning method to reconstruct high-resolution infrared images. It classifies features of the samples into several reasonable clusters and trains a dictionary pair for each cluster. The optimal pair of dictionaries is chosen for each image reconstruction and, therefore, more satisfactory results are achieved without an increase in computational complexity and time cost. Experiments and results demonstrated that it is a viable method for infrared image reconstruction, since it improves image resolution and recovers detailed information about targets.
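    A minimal sketch of the classified-dictionary idea follows (not the authors' implementation; the patch sizes, cluster count, and random training data are placeholders): cluster low-resolution patch features, learn a per-cluster low-to-high-resolution mapping as a simple stand-in for a coupled dictionary pair, and route each new patch to its cluster's mapping at reconstruction time.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Placeholder training data: rows are vectorized low-res / high-res patch pairs.
    X_lr = rng.normal(size=(5000, 36))    # e.g. 6x6 low-res patch features
    X_hr = rng.normal(size=(5000, 144))   # e.g. 12x12 high-res patch targets

    # 1) Classify patches into clusters based on their low-res features.
    k = 8
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_lr)

    # 2) Per cluster, learn a linear LR->HR mapping (a simple stand-in for a
    #    coupled dictionary pair) by regularized least squares.
    lam = 1e-2
    maps = {}
    for c in range(k):
        A, B = X_lr[km.labels_ == c], X_hr[km.labels_ == c]
        maps[c] = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ B)

    # 3) Reconstruction: route each new low-res patch to its cluster's mapping.
    def reconstruct(patch_lr):
        c = km.predict(patch_lr[None, :])[0]
        return patch_lr @ maps[c]

    print(reconstruct(X_lr[0]).shape)   # (144,): one reconstructed high-res patch
    ```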

  8. Some human factors issues in the development and evaluation of cockpit alerting and warning systems

    NASA Technical Reports Server (NTRS)

    Randle, R. J., Jr.; Larsen, W. E.; Williams, D. H.

    1980-01-01

    A set of general guidelines is provided for evaluating a newly developed cockpit alerting and warning system in terms of human factors issues. Although the discussion centers on a general methodology, it is applied specifically to the issues involved in alerting systems. An overall statement of the current operational problem is presented. Human factors problems with reference to existing alerting and warning systems are described. The methodology for proceeding through system development to system test is discussed. The differences between traditional human factors laboratory evaluations and those required for evaluation of complex man-machine systems under development are emphasized. Performance evaluation in the alerting and warning subsystem using a hypothetical sample system is explained.

  9. Winnerless competition principle and prediction of the transient dynamics in a Lotka-Volterra model

    NASA Astrophysics Data System (ADS)

    Afraimovich, Valentin; Tristan, Irma; Huerta, Ramon; Rabinovich, Mikhail I.

    2008-12-01

    Predicting the evolution of multispecies ecological systems is an intriguing problem. A sufficiently complex model with the necessary predicting power requires solutions that are structurally stable. Small variations of the system parameters should not qualitatively perturb its solutions. When one is interested in just asymptotic results of evolution (as time goes to infinity), then the problem has a straightforward mathematical image involving simple attractors (fixed points or limit cycles) of a dynamical system. However, for an accurate prediction of evolution, the analysis of transient solutions is critical. In this paper, in the framework of the traditional Lotka-Volterra model (generalized in some sense), we show that the transient solution representing multispecies sequential competition can be reproducible and predictable with high probability.

  10. Winnerless competition principle and prediction of the transient dynamics in a Lotka-Volterra model.

    PubMed

    Afraimovich, Valentin; Tristan, Irma; Huerta, Ramon; Rabinovich, Mikhail I

    2008-12-01

    Predicting the evolution of multispecies ecological systems is an intriguing problem. A sufficiently complex model with the necessary predicting power requires solutions that are structurally stable. Small variations of the system parameters should not qualitatively perturb its solutions. When one is interested in just asymptotic results of evolution (as time goes to infinity), then the problem has a straightforward mathematical image involving simple attractors (fixed points or limit cycles) of a dynamical system. However, for an accurate prediction of evolution, the analysis of transient solutions is critical. In this paper, in the framework of the traditional Lotka-Volterra model (generalized in some sense), we show that the transient solution representing multispecies sequential competition can be reproducible and predictable with high probability.
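    A minimal numerical companion (not the authors' generalized model; the interaction matrix is the classic May-Leonard choice, used here purely for illustration) integrates a three-species Lotka-Volterra system whose transient exhibits the sequential, winnerless-competition dominance discussed above.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Generalized Lotka-Volterra system: dx_i/dt = x_i * (1 - sum_j A_ij x_j).
    # The May-Leonard interaction matrix below is illustrative, not taken from the paper.
    A = np.array([[1.0, 1.6, 0.6],
                  [0.6, 1.0, 1.6],
                  [1.6, 0.6, 1.0]])

    def glv(t, x):
        return x * (1.0 - A @ x)

    sol = solve_ivp(glv, (0.0, 300.0), [0.6, 0.2, 0.1], dense_output=True, rtol=1e-8)

    # During the transient, the species take turns dominating (winnerless competition)
    # before any asymptotic regime is reached.
    for t in (0.0, 50.0, 100.0, 150.0, 200.0):
        print(t, np.round(sol.sol(t), 3))
    ```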

  11. Team approach to treatment of the posttraumatic stiff hand. A case report.

    PubMed

    Morey, K R; Watson, A H

    1986-02-01

    Posttraumatic hand stiffness is a common but complex problem treated in many general clinics and in hand treatment centers. Although much information is available regarding various treatment procedures, the use of a team approach to evaluate and treat hand stiffness has not been examined thoroughly in the Journal. The problems of the patient with a stiff hand include both physical and psychological components that must be addressed in a structured manner. The clinical picture of posttraumatic hand stiffness involves edema, immobility, pain, and the inability to incorporate the affected extremity into daily activities. In this case report, we review the purpose and philosophy of the team approach to hand therapy and the clarification of responsibilities for physical therapy and occupational therapy intervention.

  12. Systems Engineering Awareness

    NASA Technical Reports Server (NTRS)

    Lucero, John

    2016-01-01

    The presentation will provide an overview of the fundamentals and principles of Systems Engineering (SE). This includes understanding the processes that are used to assist the engineer in a successful design, build, and implementation of solutions. The context of this presentation will be to describe the involvement of SE throughout the life-cycle of a project, from cradle to grave. Due to the ever-growing number of complex technical problems facing our world, a Systems Engineering approach is desirable for many reasons. The interdisciplinary technical structure of current systems and the technical processes representing System Design, Technical Management, and Product Realization are instrumental in the development and integration of new technologies into mainstream applications. This tutorial will demonstrate the application of SE tools to these types of problems.

  13. Improved ALE mesh velocities for complex flows

    DOE PAGES

    Bakosi, Jozsef; Waltz, Jacob I.; Morgan, Nathaniel Ray

    2017-05-31

    A key choice in the development of arbitrary Lagrangian-Eulerian solution algorithms is how to move the computational mesh. The most common approaches are smoothing and relaxation techniques, or to compute a mesh velocity field that produces smooth mesh displacements. We present a method in which the mesh velocity is specified by the irrotational component of the fluid velocity as computed from a Helmholtz decomposition, and excess compression of mesh cells is treated through a noniterative, local spring-force model. This approach allows distinct and separate control over rotational and translational modes. In conclusion, the utility of the new mesh motion algorithm is demonstrated on a number of 3D test problems, including problems that involve both shocks and significant amounts of vorticity.
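    As a sketch of the key ingredient (not the paper's unstructured-mesh implementation), the following extracts the irrotational component of a periodic 2D velocity field with an FFT-based Helmholtz projection; in the paper, this component is what drives the mesh velocity.

    ```python
    import numpy as np

    def irrotational_part(u, v, dx=1.0):
        """Curl-free component of a periodic 2D velocity field via an FFT projection."""
        ny, nx = u.shape
        kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
        ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
        KX, KY = np.meshgrid(kx, ky)
        k2 = KX**2 + KY**2
        k2[0, 0] = 1.0                              # avoid dividing the zero (mean) mode
        u_hat, v_hat = np.fft.fft2(u), np.fft.fft2(v)
        k_dot_u = KX * u_hat + KY * v_hat           # projection of the field onto the wavevector
        u_irr = np.real(np.fft.ifft2(KX * k_dot_u / k2))
        v_irr = np.real(np.fft.ifft2(KY * k_dot_u / k2))
        return u_irr, v_irr

    # Example: a field that is the sum of a divergence-free vortex and a gradient (compressive) part.
    n = 64
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    X, Y = np.meshgrid(x, x)
    u = -np.sin(Y) + np.cos(X)
    v = np.sin(X) + np.cos(Y)
    u_irr, v_irr = irrotational_part(u, v, dx=x[1] - x[0])
    print(np.allclose(u_irr, np.cos(X)), np.allclose(v_irr, np.cos(Y)))   # True True
    ```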

  14. Supramolecular complexation for environmental control.

    PubMed

    Albelda, M Teresa; Frías, Juan C; García-España, Enrique; Schneider, Hans-Jörg

    2012-05-21

    Supramolecular complexes offer a new and efficient way for the monitoring and removal of many substances emanating from technical processes, fertilization, plant and animal protection, or e.g. chemotherapy. Such pollutants range from toxic or radioactive metal ions and anions to chemical side products, herbicides, pesticides and drugs including steroids, and include degradation products from natural sources. The applications usually involve fast and reversible complex formation, owing to the prevailing non-covalent interactions. This is of importance for sensing as well as for separation techniques, where the often expensive host compounds can then be reused almost indefinitely. Immobilization of host compounds, e.g. on exchange resins or on membranes, and their implementation in smart new materials hold particular promise. The review illustrates how the design of suitable host compounds in combination with modern sensing and separation methods can contribute to solving some of the biggest problems facing chemistry, which arise from the ever-increasing pollution of the environment.

  15. Pure electronic metal-insulator transition at the interface of complex oxides

    DOE PAGES

    Meyers, D.; Liu, Jian; Freeland, J. W.; ...

    2016-06-21

    In complex materials, electronic phases and the transitions between them often involve coupling between many degrees of freedom, whose entanglement convolutes understanding of the instigating mechanism. Metal-insulator transitions are one such problem, where coupling to the structural, orbital, charge, and magnetic order parameters frequently obscures the underlying physics. We demonstrate a way to unravel this conundrum by heterostructuring a prototypical multi-ordered complex oxide, NdNiO3, in ultrathin geometry, which preserves the metal-to-insulator transition and the bulk-like magnetic order parameter but entirely suppresses the symmetry lowering and the long-range charge order parameter. Furthermore, these findings illustrate the utility of heterointerfaces as a powerful method for removing competing order parameters to gain greater insight into the nature of the transition, here revealing that the magnetic order generates the transition independently, leading to an exceptionally rare, purely electronic metal-insulator transition with no symmetry change.

  16. Video Analysis and Remote Digital Ethnography: Approaches to understanding user perspectives and processes involving healthcare information technology.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.

  17. Quantum Speedup for Active Learning Agents

    NASA Astrophysics Data System (ADS)

    Paparo, Giuseppe Davide; Dunjko, Vedran; Makmal, Adi; Martin-Delgado, Miguel Angel; Briegel, Hans J.

    2014-07-01

    Can quantum mechanics help us build intelligent learning agents? A defining signature of intelligent behavior is the capacity to learn from experience. However, a major bottleneck for agents to learn in real-life situations is the size and complexity of the corresponding task environment. Even in a moderately realistic environment, it may simply take too long to rationally respond to a given situation. If the environment is impatient, allowing only a certain time for a response, an agent may then be unable to cope with the situation and to learn at all. Here, we show that quantum physics can help and provide a quadratic speedup for active learning as a genuine problem of artificial intelligence. This result will be particularly relevant for applications involving complex task environments.

  18. Complex regional pain syndrome (CRPS) with resistance to local anesthetic block: a case report.

    PubMed

    Maneksha, F R; Mirza, H; Poppers, P J

    2000-02-01

    We present a case of complex regional pain syndrome (CRPS) Type 1 in a 12-year-old girl. The patient did not respond to the usual therapeutic modalities used to treat CRPS, including physical therapy, lumbar sympathetic block, epidural local anesthetic block, intravenous lidocaine infusion, or other oral medications. Of note is the fact that, during epidural block, the patient demonstrated a resistance to local anesthetic neural blockade in the area of the body involved with the pain problem. The mechanism of this resistance could be related to the changes in the dorsal horn cells of the spinal cord, secondary to activation of N-methyl-D-aspartate receptors, which may play a role in the pathophysiology of this pain syndrome.

  19. Games as Tools to Address Conservation Conflicts.

    PubMed

    Redpath, Steve M; Keane, Aidan; Andrén, Henrik; Baynham-Herd, Zachary; Bunnefeld, Nils; Duthie, A Bradley; Frank, Jens; Garcia, Claude A; Månsson, Johan; Nilsson, Lovisa; Pollard, Chris R J; Rakotonarivo, O Sarobidy; Salk, Carl F; Travers, Henry

    2018-06-01

    Conservation conflicts represent complex multilayered problems that are challenging to study. We explore the utility of theoretical, experimental, and constructivist approaches to games to help to understand and manage these challenges. We show how these approaches can help to develop theory, understand patterns in conflict, and highlight potentially effective management solutions. The choice of approach should be guided by the research question and by whether the focus is on testing hypotheses, predicting behaviour, or engaging stakeholders. Games provide an exciting opportunity to help to unravel the complexity in conflicts, while researchers need an awareness of the limitations and ethical constraints involved. Given the opportunities, this field will benefit from greater investment and development. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Projected regression method for solving Fredholm integral equations arising in the analytic continuation problem of quantum physics

    NASA Astrophysics Data System (ADS)

    Arsenault, Louis-François; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.

    2017-11-01

    We present a supervised machine learning approach to the inversion of Fredholm integrals of the first kind as they arise, for example, in the analytic continuation problem of quantum many-body physics. The approach provides a natural regularization for the ill-conditioned inverse of the Fredholm kernel, as well as an efficient and stable treatment of constraints. The key observation is that the stability of the forward problem permits the construction of a large database of outputs for physically meaningful inputs. Applying machine learning to this database generates a regression function of controlled complexity, which returns approximate solutions for previously unseen inputs; the approximate solutions are then projected onto the subspace of functions satisfying relevant constraints. Under standard error metrics the method performs as well or better than the Maximum Entropy method for low input noise and is substantially more robust to increased input noise. We suggest that the methodology will be similarly effective for other problems involving a formally ill-conditioned inversion of an integral operator, provided that the forward problem can be efficiently solved.
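    A minimal sketch of the strategy under simplifying assumptions (a synthetic Gaussian smoothing kernel, Gaussian-bump inputs, and ridge regression standing in for the authors' regression machinery): generate a database by solving the stable forward problem, learn the map from outputs back to inputs, and compare against naive inversion of the ill-conditioned kernel.

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(1)

    # Discretized smoothing kernel K: the forward map is stable, its inverse is ill-conditioned.
    n = 60
    x = np.linspace(0, 1, n)
    K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * 0.05**2))
    K /= K.sum(axis=1, keepdims=True)

    def random_input():
        """Physically plausible input: a smooth random bump."""
        c, w, a = rng.uniform(0.2, 0.8), rng.uniform(0.05, 0.2), rng.uniform(0.5, 2.0)
        return a * np.exp(-(x - c)**2 / (2 * w**2))

    # Database of inputs and noisy forward-problem outputs g = K f + noise.
    F = np.array([random_input() for _ in range(4000)])
    G = F @ K.T + 0.01 * rng.normal(size=F.shape)

    # Regression from outputs back to inputs acts as a learned, regularized inverse.
    model = Ridge(alpha=1e-3).fit(G, F)

    f_true = random_input()
    g_obs = K @ f_true + 0.01 * rng.normal(size=n)
    f_ml = model.predict(g_obs[None, :])[0]
    f_naive = np.linalg.solve(K, g_obs)     # direct inversion amplifies the noise enormously
    print(np.linalg.norm(f_ml - f_true), np.linalg.norm(f_naive - f_true))
    ```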

  1. Analysis of Complexity Evolution Management and Human Performance Issues in Commercial Aircraft Automation Systems

    NASA Technical Reports Server (NTRS)

    Vakil, Sanjay S.; Hansman, R. John

    2000-01-01

    Autoflight systems in the current generation of aircraft have been implicated in several recent incidents and accidents. A contributory aspect to these incidents may be the manner in which aircraft transition between differing behaviours or 'modes.' The current state of aircraft automation was investigated and the incremental development of the autoflight system was tracked through a set of aircraft to gain insight into how these systems developed. This process appears to have resulted in a system without a consistent global representation. In order to evaluate and examine autoflight systems, a 'Hybrid Automation Representation' (HAR) was developed. This representation was used to examine several specific problems known to exist in aircraft systems. Cyclomatic complexity is an analysis tool from computer science which counts the number of linearly independent paths through a program graph. This approach was extended to examine autoflight mode transitions modelled with the HAR. A survey was conducted of pilots to identify those autoflight mode transitions which airline pilots find difficult. The transitions identified in this survey were analyzed using cyclomatic complexity to gain insight into the apparent complexity of the autoflight system from the perspective of the pilot. Mode transitions which had been identified as complex by pilots were found to have a high cyclomatic complexity. Further examination was made into a set of specific problems identified in aircraft: the lack of a consistent representation of automation, concern regarding appropriate feedback from the automation, and the implications of physical limitations on the autoflight systems. Mode transitions involved in changing to and leveling at a new altitude were identified across multiple aircraft by numerous pilots. Where possible, evaluation and verification of the behaviour of these autoflight mode transitions was investigated via aircraft-specific high fidelity simulators. Three solution approaches to concerns regarding autoflight systems, and mode transitions in particular, are presented in this thesis. The first is to use training to modify pilot behaviours, or procedures to work around known problems. The second approach is to mitigate problems by enhancing feedback. The third approach is to modify the process by which automation is designed. The Operator Directed Process forces the consideration and creation of an automation model early in the design process for use as the basis of the software specification and training.
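    A small example of the underlying metric may help (the mode names and transitions below are invented for illustration): cyclomatic complexity counts the linearly independent paths through a transition graph as M = E - N + 2P.

    ```python
    import networkx as nx

    # Hypothetical autoflight mode-transition graph (mode names are illustrative only).
    G = nx.DiGraph()
    G.add_edges_from([
        ("ALT_HOLD", "VS_CLIMB"), ("VS_CLIMB", "ALT_CAPTURE"),
        ("ALT_CAPTURE", "ALT_HOLD"), ("VS_CLIMB", "ALT_HOLD"),
        ("ALT_HOLD", "FLCH"), ("FLCH", "ALT_CAPTURE"),
    ])

    # McCabe cyclomatic complexity: M = E - N + 2P, where P counts connected components.
    E, N = G.number_of_edges(), G.number_of_nodes()
    P = nx.number_weakly_connected_components(G)
    print("cyclomatic complexity:", E - N + 2 * P)   # 6 - 4 + 2 = 4
    ```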

  2. Decision Making Under Uncertainty and Complexity: A Model-Based Scenario Approach to Supporting Integrated Water Resources Management

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.

    2007-12-01

    Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.

  3. Characterization of Structure and Damage in Materials in Four Dimensions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, I. M.; Schuh, C. A.; Vetrano, J. S.

    2010-09-30

    The materials characterization toolbox has recently experienced a number of parallel revolutionary advances, foreshadowing a time in the near future when materials scientists can completely quantify material structure across orders of magnitude in length and time scales (i.e., in four dimensions). This paper presents a viewpoint on the materials characterization field, reviewing its recent past, evaluating its present capabilities, and proposing directions for its future development. Electron microscopy; atom-probe tomography; X-ray, neutron and electron tomography; serial sectioning tomography; and diffraction-based analysis methods are reviewed, and opportunities for their future development are highlighted. Particular attention is paid to studies that have pioneered the synergetic use of multiple techniques to provide complementary views of a single structure or process; several of these studies represent the state-of-the-art in characterization and suggest a trajectory for the continued development of the field. Based on this review, a set of grand challenges for characterization science is identified, including suggestions for instrumentation advances, scientific problems in microstructure analysis, and complex structure evolution problems involving materials damage. The future of microstructural characterization is proposed to be one not only where individual techniques are pushed to their limits, but where the community devises strategies of technique synergy to address complex multiscale problems in materials science and engineering.

  4. From Dr. Steven Ashby, Director of PNNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashby, Steven

    Powered by the creativity and imagination of more than 4,000 exceptional scientists, engineers and support professionals, at PNNL we advance the frontiers of science and address some of the most challenging problems in energy, the environment and national security. As DOE’s premier chemistry, environmental sciences and data analytics laboratory, we provide national leadership in four areas: deepening our understanding of climate science; inventing the future power grid; preventing nuclear proliferation; and speeding environmental remediation. Other areas where we make important contributions include energy storage, microbial biology and cyber security. PNNL also is home to EMSL (the Environmental Molecular Sciences Laboratory), one of DOE’s scientific user facilities. We apply these science strengths to address both national and international problems in complex adaptive systems that are too difficult for one institution to tackle alone. Take earth systems, for instance. The earth is a complex adaptive system because it involves everything from climate and microbial communities in the soil to emissions from cars and coal-powered industrial plants. All of these factors and others ultimately influence not only our environment and overall quality of life, but cause the earth to adapt in ways that must be further addressed. PNNL researchers are playing a vital role in finding solutions across every area of this complex adaptive system.

  5. Working Memory and Reasoning Benefit from Different Modes of Large-scale Brain Dynamics in Healthy Older Adults.

    PubMed

    Lebedev, Alexander V; Nilsson, Jonna; Lövdén, Martin

    2018-07-01

    Researchers have proposed that solving complex reasoning problems, a key indicator of fluid intelligence, involves the same cognitive processes as solving working memory tasks. This proposal is supported by an overlap of the functional brain activations associated with the two types of tasks and by high correlations between interindividual differences in performance. We replicated these findings in 53 older participants but also showed that solving reasoning and working memory problems benefits from different configurations of the functional connectome and that this dissimilarity increases with a higher difficulty load. Specifically, superior performance in a typical working memory paradigm (n-back) was associated with upregulation of modularity (increased between-network segregation), whereas performance in the reasoning task was associated with effective downregulation of modularity. We also showed that working memory training promotes task-invariant increases in modularity. Because superior reasoning performance is associated with downregulation of modular dynamics, training may thus have fostered an inefficient way of solving the reasoning tasks. This could help explain why working memory training does little to promote complex reasoning performance. The study concludes that complex reasoning abilities cannot be reduced to working memory and suggests the need to reconsider the feasibility of using working memory training interventions to attempt to achieve effects that transfer to broader cognition.
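    A minimal sketch of how between-network segregation can be quantified (not the study's analysis pipeline; the connectivity matrix and threshold are synthetic): threshold a correlation matrix into a graph, detect communities, and compute Newman modularity Q.

    ```python
    import numpy as np
    import networkx as nx
    from networkx.algorithms import community

    rng = np.random.default_rng(2)

    # Synthetic "functional connectivity": two well-segregated networks of 10 nodes each.
    C = rng.uniform(0.0, 0.2, size=(20, 20))
    C[:10, :10] += 0.6
    C[10:, 10:] += 0.6
    C = (C + C.T) / 2
    np.fill_diagonal(C, 0.0)

    # Threshold the correlation matrix into a binary graph (the threshold is arbitrary here).
    G = nx.from_numpy_array((C > 0.4).astype(int))

    # Detect communities and compute Newman modularity Q: higher Q means stronger
    # between-network segregation ("upregulated modularity").
    parts = community.greedy_modularity_communities(G)
    Q = community.modularity(G, parts)
    print(len(parts), round(Q, 3))
    ```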

  6. An experimental approach to the fundamental principles of hemodynamics.

    PubMed

    Pontiga, Francisco; Gaytán, Susana P

    2005-09-01

    An experimental model has been developed to give students hands-on experience with the fundamental laws of hemodynamics. The proposed experimental setup is of simple construction but permits precise measurements of the physical variables involved. The model consists of a series of experiments in which different basic phenomena are quantitatively investigated, such as the pressure drop in a long straight vessel and in an obstructed vessel, the transition from laminar to turbulent flow, the association of vessels in vascular networks, or the generation of a critical stenosis. Through these experiments, students acquire a direct appreciation of the importance of the parameters involved in the relationship between pressure and flow rate, thus facilitating the comprehension of more complex problems in hemodynamics.
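    A short numerical companion to the experiments described (not from the paper; the vessel dimensions and fluid properties are illustrative values for blood) evaluates Poiseuille's law for the pressure drop in a straight vessel, the Reynolds number governing the laminar-turbulent transition, and the effect of a stenosis.

    ```python
    import math

    # Illustrative values, roughly representative of blood in a medium-sized artery.
    mu = 3.5e-3     # dynamic viscosity [Pa s]
    rho = 1060.0    # density [kg/m^3]
    L = 0.1         # vessel length [m]
    r = 2.0e-3      # vessel radius [m]
    Q = 5.0e-6      # volumetric flow rate [m^3/s]

    # Poiseuille's law: pressure drop for steady laminar flow in a straight tube.
    dP = 8 * mu * L * Q / (math.pi * r**4)

    # Mean velocity and Reynolds number; Re above roughly 2000 suggests transition to turbulence.
    v = Q / (math.pi * r**2)
    Re = rho * v * 2 * r / mu

    # Halving the radius raises the pressure drop sixteen-fold (the r**-4 dependence).
    dP_stenosed = 8 * mu * L * Q / (math.pi * (r / 2)**4)

    print(f"dP = {dP:.0f} Pa, Re = {Re:.0f}, dP with stenosis = {dP_stenosed:.0f} Pa")
    ```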

  7. Toward Modeling the Intrinsic Complexity of Test Problems

    ERIC Educational Resources Information Center

    Shoufan, Abdulhadi

    2017-01-01

    The concept of intrinsic complexity explains why different problems of the same type, tackled by the same problem solver, can require different times to solve and yield solutions of different quality. This paper proposes a general four-step approach that can be used to establish a model for the intrinsic complexity of a problem class in terms of…

  8. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…

  9. Superresolution radar imaging based on fast inverse-free sparse Bayesian learning for multiple measurement vectors

    NASA Astrophysics Data System (ADS)

    He, Xingyu; Tong, Ningning; Hu, Xiaowei

    2018-01-01

    Compressive sensing has been successfully applied to inverse synthetic aperture radar (ISAR) imaging of moving targets. By exploiting the block-sparse structure of the target image, sparse solution for multiple measurement vectors (MMV) can be applied to ISAR imaging, and a substantial performance improvement can be achieved. As an effective sparse recovery method, sparse Bayesian learning (SBL) for MMV involves a matrix inverse at each iteration. Its associated computational complexity grows significantly with the problem size. To address this problem, we develop a fast inverse-free (IF) SBL method for MMV. A relaxed evidence lower bound (ELBO), which is computationally more amenable than the traditional ELBO used by SBL, is obtained by invoking a fundamental property of smooth functions. A variational expectation-maximization scheme is then employed to maximize the relaxed ELBO, and a computationally efficient IF-MSBL algorithm is proposed. Numerical results based on simulated and real data show that the proposed method can reconstruct row-sparse signals accurately and obtain clear superresolution ISAR images. Moreover, the running time and computational complexity are reduced to a great extent compared with traditional SBL methods.
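    For orientation, the sketch below implements the traditional EM form of SBL for MMV (M-SBL), the matrix-inverse-per-iteration baseline that the proposed inverse-free algorithm is designed to avoid; the dimensions and data are synthetic placeholders, not from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Row-sparse multiple-measurement-vector problem Y = Phi X + noise (synthetic sizes).
    m, n, L, k = 30, 80, 5, 4
    Phi = rng.normal(size=(m, n)) / np.sqrt(m)
    X_true = np.zeros((n, L))
    support = rng.choice(n, k, replace=False)
    X_true[support] = rng.normal(size=(k, L))
    sigma2 = 1e-3
    Y = Phi @ X_true + np.sqrt(sigma2) * rng.normal(size=(m, L))

    # Traditional M-SBL (EM form): note the m x m matrix inverse in every iteration,
    # which is exactly the cost an inverse-free variant avoids.
    gamma = np.ones(n)
    for _ in range(200):
        Gamma = np.diag(gamma)
        B = np.linalg.inv(sigma2 * np.eye(m) + Phi @ Gamma @ Phi.T)
        Sigma = Gamma - Gamma @ Phi.T @ B @ Phi @ Gamma   # posterior covariance
        Mu = Gamma @ Phi.T @ B @ Y                        # posterior mean
        gamma = np.sum(Mu**2, axis=1) / L + np.diag(Sigma)

    print(sorted(support), sorted(np.argsort(gamma)[-k:]))   # true vs recovered row support
    ```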

  10. Systems biomarkers as acute diagnostics and chronic monitoring tools for traumatic brain injury

    NASA Astrophysics Data System (ADS)

    Wang, Kevin K. W.; Moghieb, Ahmed; Yang, Zhihui; Zhang, Zhiqun

    2013-05-01

    Traumatic brain injury (TBI) is a significant biomedical problem among military personnel and civilians. There exists an urgent need to develop and refine biological measures of acute brain injury and chronic recovery after brain injury. Such measures, or "biomarkers", can assist clinicians in defining and refining the recovery process and in developing treatment paradigms for the acutely injured to reduce secondary injury processes. Recent biomarker studies in the acute phase of TBI have highlighted the importance and feasibility of identifying clinically useful biomarkers. However, much less is known about the subacute and chronic phases of TBI. We propose here that, for a complex biological problem such as TBI, multiple biomarker types might be needed to capture the wide range of pathological and systemic perturbations following injury, from acute neuronal death, neuroinflammation, neurodegeneration and neuroregeneration to systemic responses. Biomarker types range from brain-specific proteins, microRNAs, genetic polymorphisms, inflammatory cytokines, and autoimmune markers to neuroendocrine hormones. Furthermore, systems biology-driven biomarker integration can help provide a holistic approach to understanding the scenarios and complex pathways involved in brain injury.

  11. [Errors in wound management].

    PubMed

    Filipović, Marinko; Novinscak, Tomislav

    2014-10-01

    Chronic ulcers have adverse effects on patient quality of life and productivity, thus posing a financial burden upon the healthcare system. Chronic wound healing is a complex process resulting from the interaction of the patient's general health status, wound-related factors, medical personnel skill and competence, and therapy-related products. In clinical practice, considerable improvement has been made in the treatment of chronic wounds, which is evident in the reduced rate of the severe forms of chronic wounds in outpatient clinics. However, in spite of all the modern approaches, the efforts invested by medical personnel and the agents available for wound care, numerous problems are still encountered in daily practice. Most frequently, the problems arise from inappropriate education, of young personnel in particular, absence of a multidisciplinary approach, and inadequate communication among the personnel directly involved in wound treatment. To perceive them more clearly, the potential problems or complications in the management of chronic wounds can be classified into the following groups: problems mostly related to the use of wound coverage and other etiology-related specificities of wound treatment; problems related to incompatibility of the agents used in wound treatment; and problems arising from failure to ensure aseptic and antiseptic performance conditions.

  12. Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions

    DTIC Science & Technology

    2015-07-14

    AFRL-OSR-VA-TR-2015-0202; grant FA9550-12-1… (remainder of the report documentation page omitted). The report models team functioning as teams solve complex problems and proposes means to improve team performance under changing or adversarial conditions.

  13. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, which makes them differ significantly from traditional all-chemical satellites in orbit-raising, station-keeping, radiation damage protection, and power budget. The design optimization task for an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem faces big challenges in disciplinary modeling techniques and efficient optimization strategy. To address these challenges, we present a surrogate-assisted MDO framework consisting of several modules, i.e., MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate-assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Considerable effort is then spent on multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem with moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of the proposed framework in coping with all-electric GEO satellite system design optimization problems. The proposed surrogate-assisted MDO framework can also provide valuable references for the design of other all-electric spacecraft systems.
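    A minimal sketch of an adaptive response-surface loop in the spirit of the framework (not the authors' code; the two-variable "expensive" function is a cheap stand-in for the multidisciplinary analysis): fit a quadratic surrogate to sampled points, optimize the surrogate, evaluate the true function at the surrogate optimum, and refit.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(4)

    def expensive_mda(x):
        """Cheap stand-in for the expensive multidisciplinary analysis (e.g. a mass model)."""
        return (x[0] - 0.3)**2 + 2.0 * (x[1] + 0.5)**2 + 0.3 * np.sin(5 * x[0]) * x[1]

    def quad_features(X):
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

    def surrogate(x, beta):
        return (quad_features(np.atleast_2d(x)) @ beta).item()

    bounds = [(-1.0, 1.0), (-1.0, 1.0)]
    X = rng.uniform(-1.0, 1.0, size=(8, 2))            # initial design of experiments
    y = np.array([expensive_mda(x) for x in X])

    for _ in range(10):
        beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)   # fit the response surface
        res = minimize(surrogate, X[np.argmin(y)], args=(beta,), bounds=bounds)
        X = np.vstack([X, res.x])                      # refine the surrogate near its optimum
        y = np.append(y, expensive_mda(res.x))

    print(np.round(X[np.argmin(y)], 3), round(y.min(), 4))   # drifts toward the optimum near (0.3, -0.5)
    ```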

  14. Physical inactivity as a policy problem: applying a concept from policy analysis to a public health issue.

    PubMed

    Rütten, Alfred; Abu-Omar, Karim; Gelius, Peter; Schow, Diana

    2013-03-07

    Despite the recent rapid development of policies to counteract physical inactivity (PI), only a small number of systematic analyses of the evolution of these policies exist. In this article we analyze how PI, as a public health issue, "translates" into a policy-making issue. First, we discuss why PI has become an increasingly important public health issue during the last two decades. We then follow Guy Peters and conceptualize PI as a "policy problem" that has the potential to be linked to policy instruments and policy impact. The analysis indicates that PI is a policy problem that i) is chronic in nature; ii) involves a high degree of political complexity; iii) can be disaggregated into smaller scales; iv) is addressed through interventions that can be difficult to "sell" to the public when their benefits are not highly divisible; v) cannot be solved by government spending alone; vi) must be addressed through a broad scope of activities; and vii) involves interdependencies among both multiple sectors and levels of government. We conclude that the new perspective on PI proposed in this article might be useful and important for i) describing and mapping policies to counteract PI in different contexts; ii) evaluating whether or not existing policy instruments are appropriate to the policy problem of PI, and iii) explaining the factors and processes that underlie policy development and implementation. More research is warranted in all these areas. In particular, we propose to focus on comparative analyses of how the problem of PI is defined and tackled in different contexts, and on the identification of truly effective policy instruments that are designed to "solve" the PI policy problem.

  15. Vision-related problems among the workers engaged in jewellery manufacturing.

    PubMed

    Salve, Urmi Ravindra

    2015-01-01

    The American Optometric Association defines Computer Vision Syndrome (CVS) as a "complex of eye and vision problems related to near work which are experienced during or related to computer use." This happens when the visual demand of the task exceeds the visual ability of the user. Even though these problems were initially attributed to computer-related activities, similar problems have subsequently been reported for any near-point task. Jewellery manufacturing involves precision design and the setting of tiny metal pieces and stones, which requires high visual attention and mental concentration and is often near-point work. It is therefore expected that workers engaged in jewellery manufacturing may also experience CVS-like symptoms. Keeping this in mind, this study was taken up (1) to identify the prevalence of CVS-like symptoms among jewellery manufacturing workers and compare them with those of workers at computer workstations, and (2) to ascertain whether such symptoms lead to any permanent vision-related problems. Case-control study. The study was carried out in the Zaveri Bazaar region and at an IT-enabled organization in Mumbai. It involved the identification of CVS symptoms using a questionnaire from the Eye Strain Journal, ophthalmological check-ups, and measurement of spontaneous eye blink rate. The data obtained from jewellery manufacturing were compared with data from subjects engaged in computer work and with data available in the literature. Comparative inferential statistics were used. Results showed that the visual demands of the tasks carried out in jewellery manufacturing were much higher than those of computer-related work.

  16. Statistical mechanics of complex neural systems and high dimensional data

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-03-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.

  17. Addressing the unmet need for visualizing conditional random fields in biological data

    PubMed Central

    2014-01-01

    Background: The biological world is replete with phenomena that appear to be ideally modeled and analyzed by one archetypal statistical framework - the Graphical Probabilistic Model (GPM). The structure of GPMs is a uniquely good match for biological problems that range from aligning sequences to modeling the genome-to-phenome relationship. The fundamental questions that GPMs address involve making decisions based on a complex web of interacting factors. Unfortunately, while GPMs ideally fit many questions in biology, they are not an easy solution to apply. Building a GPM is not a simple task for an end user. Moreover, applying GPMs is also impeded by the insidious fact that the “complex web of interacting factors” inherent to a problem might be easy to define yet intractable to compute upon. Discussion: We propose that the visualization sciences can contribute to many domains of the bio-sciences by developing tools to address archetypal representation and user interaction issues in GPMs, and in particular a variety of GPM called a Conditional Random Field (CRF). CRFs bring additional power, and additional complexity, because the CRF dependency network can be conditioned on the query data. Conclusions: In this manuscript we examine the shared features of several biological problems that are amenable to modeling with CRFs, highlight the challenges that existing visualization and visual analytics paradigms induce for these data, and document an experimental solution called StickWRLD which, while leaving room for improvement, has been successfully applied in several biological research projects. Software and tutorials are available at http://www.stickwrld.org/ PMID:25000815

  18. Hybrid DG/FV schemes for magnetohydrodynamics and relativistic hydrodynamics

    NASA Astrophysics Data System (ADS)

    Núñez-de la Rosa, Jonatan; Munz, Claus-Dieter

    2018-01-01

    This paper presents a high order hybrid discontinuous Galerkin/finite volume scheme for solving the equations of magnetohydrodynamics (MHD) and of special relativistic hydrodynamics (SRHD) on quadrilateral meshes. In this approach, for the spatial discretization, an arbitrary high order discontinuous Galerkin spectral element (DG) method is combined with a finite volume (FV) scheme in order to simulate complex flow problems involving strong shocks. For the time discretization, a fourth order strong stability preserving Runge-Kutta method is used. In the proposed hybrid scheme, a shock indicator is computed at the beginning of each Runge-Kutta stage in order to flag those elements containing shock waves or discontinuities. The DG solution in these troubled elements at the current time step is then projected onto a subdomain composed of finite volume subcells. The DG operator is subsequently applied to the unflagged elements, which are, in principle, oscillation-free, while the troubled elements are evolved with a robust second/third order FV operator. With this approach we are able to numerically simulate very challenging problems in the context of MHD and SRHD in one and two space dimensions and with very high order polynomials. We perform convergence tests and present a comprehensive one- and two-dimensional test bench for both equation systems, focusing on problems with strong shocks. The presented hybrid approach shows that numerical schemes of very high order of accuracy are able to simulate these complex flow problems in an efficient and robust manner.
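
    As a rough illustration of the flag-and-switch idea only (a toy 1D scalar advection analogue, not the authors' DG spectral element / FV subcell scheme), the sketch below flags cells with a simple jump indicator at every step and advances flagged cells with a robust first-order upwind flux, while the remaining cells use an unlimited Lax-Wendroff update; the indicator threshold and all numbers are invented for the example.

```python
import numpy as np

def advect_hybrid(u, a=1.0, dx=1.0/200, dt=0.4/200, steps=200, tol=0.1):
    """Toy 1D advection with a hybrid high-order / robust-FV update.

    Cells flagged by a jump indicator are advanced with first-order upwind
    (robust near discontinuities); the rest use an unlimited Lax-Wendroff
    update (higher order, but oscillatory at shocks).
    """
    c = a * dt / dx                                   # CFL number (a > 0 assumed)
    for _ in range(steps):
        up = np.roll(u, 1)                            # u_{i-1} (periodic)
        un = np.roll(u, -1)                           # u_{i+1}
        # troubled-cell indicator: large jump to either neighbour
        jump = np.maximum(np.abs(u - up), np.abs(un - u))
        troubled = jump > tol * (np.abs(u).max() + 1e-12)
        # candidate updates
        u_lw = u - 0.5 * c * (un - up) + 0.5 * c**2 * (un - 2*u + up)  # Lax-Wendroff
        u_up = u - c * (u - up)                                        # first-order upwind
        u = np.where(troubled, u_up, u_lw)
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u0 = np.where((x > 0.3) & (x < 0.6), 1.0, 0.0)        # square pulse with two discontinuities
u_final = advect_hybrid(u0.copy())
print("min/max after transport:", u_final.min(), u_final.max())
```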

  19. An investigation of reasoning by analogy in schizophrenia and autism spectrum disorder

    PubMed Central

    Krawczyk, Daniel C.; Kandalaft, Michelle R.; Didehbani, Nyaz; Allen, Tandra T.; McClelland, M. Michelle; Tamminga, Carol A.; Chapman, Sandra B.

    2014-01-01

    Relational reasoning ability relies upon both cognitive and social factors. We compared analogical reasoning performance in healthy controls (HC) to performance in individuals with Autism Spectrum Disorder (ASD) and individuals with schizophrenia (SZ). The experimental task required participants to find correspondences between drawings of scenes. Participants were asked to infer which item within one scene best matched a relational item within the second scene. We varied relational complexity, presence of distraction, and type of objects in the analogies (living or non-living items). We hypothesized that the cognitive differences present in SZ would reduce relational inferences relative to ASD and HC. We also hypothesized that both SZ and ASD would show lower performance on living item problems relative to HC due to lower social function scores. Overall accuracy was higher for HC relative to SZ, consistent with prior research. Across groups, higher relational complexity reduced analogical responding, as did the presence of non-living items. Separate group analyses revealed that the ASD group was less accurate at making relational inferences in problems that involved mainly non-living items and when distractors were present. The SZ group showed differences in problem type similar to the ASD group. Additionally, we found significant correlations between social cognitive ability and analogical reasoning, particularly for the SZ group. These results indicate that differences in cognitive and social abilities impact the ability to infer analogical correspondences, along with the number of relational elements and the types of objects present in the problems. PMID:25191240

  20. A New Approach for Constructing Highly Stable High Order CESE Schemes

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung

    2010-01-01

    A new approach is devised to construct high order CESE schemes which would avoid the common shortcomings of traditional high order schemes, including: (a) susceptibility to computational instabilities; (b) computational inefficiency due to their local implicit nature (i.e., at each mesh point, one needs to solve a system of linear/nonlinear equations involving all the mesh variables associated with this mesh point); (c) use of large and elaborate stencils, which complicates boundary treatments and also makes efficient parallel computing much harder; (d) difficulties in applications involving complex geometries; and (e) use of problem-specific techniques which are needed to overcome stability problems but often cause undesirable side effects. In fact it will be shown that, with the aid of a conceptual leap, one can build from a given 2nd-order CESE scheme its 4th-, 6th-, 8th-,... order versions which have the same stencil and same stability conditions as the 2nd-order scheme, and also retain all other advantages of the latter scheme. A sketch of multidimensional extensions will also be provided.

  1. Neuropsychological and structural brain lesions in multiple sclerosis: a regional analysis.

    PubMed

    Swirsky-Sacchetti, T; Mitchell, D R; Seward, J; Gonzales, C; Lublin, F; Knobler, R; Field, H L

    1992-07-01

    Quantified lesion scores derived from MRI correlate significantly with neuropsychological testing in patients with multiple sclerosis (MS). Variables used to reflect disease severity include total lesion area (TLA), ventricular-brain ratio, and size of the corpus callosum. We used these general measures of cerebral lesion involvement, as well as specific ratings of lesion involvement in the frontal, temporal, and parieto-occipital regions, to quantify the topographic distribution of lesions and their consequent effects upon cognitive function. Lesions were heavily distributed in the parieto-occipital regions bilaterally. Neuropsychological tests were highly related to all generalized measures of cerebral involvement, with TLA being the best predictor of neuropsychological deficit. Mean TLA for the cognitively impaired group was 28.30 cm² versus 7.41 cm² for the cognitively intact group (p < 0.0001). Multiple regression analyses revealed that left frontal lobe involvement best predicted impaired abstract problem solving, memory, and word fluency. Left parieto-occipital lesion involvement best predicted deficits in verbal learning and complex visual-integrative skills. Analysis of regional cerebral lesion load may assist in understanding the particular pattern and course of cognitive deficits in MS.

  2. Exact posterior computation in non-conjugate Gaussian location-scale parameters models

    NASA Astrophysics Data System (ADS)

    Andrade, J. A. A.; Rathie, P. N.

    2017-12-01

    In Bayesian analysis the class of conjugate models allows one to obtain exact posterior distributions; however, this class is quite restrictive in the sense that it involves only a few distributions. In fact, most practical applications involve non-conjugate models, so approximate methods, such as MCMC algorithms, are required. Although these methods can deal with quite complex structures, some practical problems can make their application very time demanding: with heavy-tailed distributions, for example, convergence may be difficult and the Metropolis-Hastings algorithm can become very slow, in addition to the extra work inevitably required to choose efficient candidate generator distributions. In this work, we draw attention to special functions as tools for Bayesian computation, and we propose an alternative method for obtaining the posterior distribution in Gaussian non-conjugate models in exact form. We use complex integration methods based on the H-function in order to obtain the posterior distribution and some of its posterior quantities in an explicit, computable form. Two examples are provided in order to illustrate the theory.
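
    The H-function machinery of the paper is not reproduced here; as a generic contrast, the sketch below evaluates the posterior of a non-conjugate Gaussian location model (known variance, heavy-tailed Student-t prior) by brute-force numerical integration on a grid, with synthetic data and made-up parameter values.

```python
import numpy as np
from scipy import stats

# Non-conjugate location model: y_i ~ Normal(mu, 1) with known variance,
# mu ~ Student-t(df=3).  No conjugate closed form exists, so the posterior
# is evaluated numerically on a grid (a generic illustration only, not the
# paper's H-function approach).
rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=20)           # synthetic data

mu_grid = np.linspace(-10.0, 10.0, 4001)
log_prior = stats.t.logpdf(mu_grid, df=3)
log_lik = np.array([stats.norm.logpdf(y, loc=m, scale=1.0).sum() for m in mu_grid])
log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())              # stabilise before normalising
post /= np.trapz(post, mu_grid)                       # normalise on the grid

post_mean = np.trapz(mu_grid * post, mu_grid)
print("posterior mean of mu ~", round(post_mean, 3))
```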

  3. Lovelock action with nonsmooth boundaries

    NASA Astrophysics Data System (ADS)

    Cano, Pablo A.

    2018-05-01

    We examine the variational problem in Lovelock gravity when the boundary contains timelike and spacelike segments nonsmoothly glued. We show that two kinds of contributions have to be added to the action. The first one is associated with the presence of a boundary in every segment and it depends on intrinsic and extrinsic curvatures. We can think of this contribution as adding a total derivative to the usual surface term of Lovelock gravity. The second one appears in every joint between two segments and it involves the integral along the joint of the Jacobson-Myers entropy density weighted by the Lorentz boost parameter, which relates the orthonormal frames in each segment. We argue that this term can be straightforwardly extended to the case of joints involving null boundaries. As an application, we compute the contribution of these terms to the complexity of global anti-de Sitter space in Lovelock gravity by using the "complexity = action" proposal and we identify possible universal terms for arbitrary values of the Lovelock couplings. We find that they depend on the charge a* controlling the holographic entanglement entropy and on a new constant that we characterize.

  4. Child Involvement in Interparental Conflict and Child Adjustment Problems: A Longitudinal Study of Violent Families

    PubMed Central

    Jouriles, Ernest N.; Rosenfield, David; McDonald, Renee; Mueller, Victoria

    2014-01-01

    This study examined whether child involvement in interparental conflict predicts child externalizing and internalizing problems in violent families. Participants were 119 families (mothers and children) recruited from domestic violence shelters. One child between the ages of 7 and 10 years in each family (50 female, 69 male) completed measures of involvement in their parents’ conflicts, externalizing problems, and internalizing problems. Mothers completed measures of child externalizing and internalizing problems, and physical intimate partner violence. Measures were completed at three assessments, spaced 6 months apart. Results indicated that children’s involvement in their parents’ conflicts was positively associated with child adjustment problems. These associations emerged in between-subjects and within-subjects analyses, and for child externalizing as well as internalizing problems, even after controlling for the influence of physical intimate partner violence. In addition, child involvement in parental conflicts predicted later child reports of externalizing problems, but child reports of externalizing problems did not predict later involvement in parental conflicts. These findings highlight the importance of considering children’s involvement in their parents’ conflicts in theory and clinical work pertaining to high-conflict families. PMID:24249486

  5. Transport of reacting solutes in porous media: Relation between mathematical nature of problem formulation and chemical nature of reactions

    USGS Publications Warehouse

    Rubin, Jacob

    1983-01-01

    Examples involving six broad reaction classes show that the nature of transport-affecting chemistry may have a profound effect on the mathematical character of solute transport problem formulation. Substantive mathematical diversity among such formulations is brought about principally by reaction properties that determine whether (1) the reaction can be regarded as being controlled by local chemical equilibria or whether it must be considered as being controlled by kinetics, (2) the reaction is homogeneous or heterogeneous, (3) the reaction is a surface reaction (adsorption, ion exchange) or one of the reactions of classical chemistry (e.g., precipitation, dissolution, oxidation, reduction, complex formation). These properties, as well as the choice of means to describe them, stipulate, for instance, (1) the type of chemical entities for which a formulation's basic, mass-balance equations should be written; (2) the nature of mathematical transformations needed to change the problem's basic equations into operational ones. These and other influences determine such mathematical features of problem formulations as the nature of the operational transport-equation system (e.g., whether it involves algebraic, partial-differential, or integro-partial-differential simultaneous equations), the type of nonlinearities of such a system, and the character of the boundaries (e.g., whether they are stationary or moving). Exploration of the reasons for the dependence of transport mathematics on transport chemistry suggests that many results of this dependence stem from the basic properties of the reactions' chemical-relation (i.e., equilibrium or rate) equations.

  6. Association between school bullying levels/types and mental health problems among Taiwanese adolescents.

    PubMed

    Yen, Cheng-Fang; Yang, Pinchen; Wang, Peng-Wei; Lin, Huang-Chi; Liu, Tai-Ling; Wu, Yu-Yu; Tang, Tze-Chun

    2014-04-01

    Few studies have compared the risks of mental health problems among the adolescents with different levels and different types of bullying involvement experiences. Bullying involvement in 6,406 adolescents was determined through use of the Chinese version of the School Bullying Experience Questionnaire. Data were collected regarding the mental health problems, including depression, suicidality, insomnia, general anxiety, social phobia, alcohol abuse, inattention, and hyperactivity/impulsivity. The association between experiences of bullying involvement and mental health problems was examined. The risk of mental health problems was compared among those with different levels/types of bullying involvement. The results found that being a victim of any type of bullying and being a perpetrator of passive bullying were significantly associated with all kinds of mental health problems, and being a perpetrator of active bullying was significantly associated with all kinds of mental health problems except for general anxiety. Victims or perpetrators of both passive and active bullying had a greater risk of some dimensions of mental health problems than those involved in only passive or active bullying. Differences in the risk of mental health problems were also found among adolescents involved in different types of bullying. This difference in comorbid mental health problems should be taken into consideration when assessing adolescents involved in different levels/types of bullying. © 2014.

  7. Formative feedback and scaffolding for developing complex problem solving and modelling outcomes

    NASA Astrophysics Data System (ADS)

    Frank, Brian; Simper, Natalie; Kaupp, James

    2018-07-01

    This paper discusses the use and impact of formative feedback and scaffolding to develop outcomes for complex problem solving in a required first-year course in engineering design and practice at a medium-sized research-intensive Canadian university. In 2010, the course began to use team-based, complex, open-ended contextualised problems to develop problem solving, communications, teamwork, modelling, and professional skills. Since then, formative feedback has been incorporated into: task and process-level feedback on scaffolded tasks in-class, formative assignments, and post-assignment review. Development in complex problem solving and modelling has been assessed through analysis of responses from student surveys, direct criterion-referenced assessment of course outcomes from 2013 to 2015, and an external longitudinal study. The findings suggest that students are improving in outcomes related to complex problem solving over the duration of the course. Most notably, the addition of new feedback and scaffolding coincided with improved student performance.

  8. Conscious thought beats deliberation without attention in diagnostic decision-making: at least when you are an expert

    PubMed Central

    Schmidt, Henk G.; Rikers, Remy M. J. P.; Custers, Eugene J. F. M.; Splinter, Ted A. W.; van Saase, Jan L. C. M.

    2010-01-01

    Contrary to what common sense makes us believe, deliberation without attention has recently been suggested to produce better decisions in complex situations than deliberation with attention. Based on differences between cognitive processes of experts and novices, we hypothesized that experts make in fact better decisions after consciously thinking about complex problems whereas novices may benefit from deliberation-without-attention. These hypotheses were confirmed in a study among doctors and medical students. They diagnosed complex and routine problems under three conditions, an immediate-decision condition and two delayed conditions: conscious thought and deliberation-without-attention. Doctors did better with conscious deliberation when problems were complex, whereas reasoning mode did not matter in simple problems. In contrast, deliberation-without-attention improved novices’ decisions, but only in simple problems. Experts benefit from consciously thinking about complex problems; for novices thinking does not help in those cases. PMID:20354726

  9. Conscious thought beats deliberation without attention in diagnostic decision-making: at least when you are an expert.

    PubMed

    Mamede, Sílvia; Schmidt, Henk G; Rikers, Remy M J P; Custers, Eugene J F M; Splinter, Ted A W; van Saase, Jan L C M

    2010-11-01

    Contrary to what common sense makes us believe, deliberation without attention has recently been suggested to produce better decisions in complex situations than deliberation with attention. Based on differences between cognitive processes of experts and novices, we hypothesized that experts make in fact better decisions after consciously thinking about complex problems whereas novices may benefit from deliberation-without-attention. These hypotheses were confirmed in a study among doctors and medical students. They diagnosed complex and routine problems under three conditions, an immediate-decision condition and two delayed conditions: conscious thought and deliberation-without-attention. Doctors did better with conscious deliberation when problems were complex, whereas reasoning mode did not matter in simple problems. In contrast, deliberation-without-attention improved novices' decisions, but only in simple problems. Experts benefit from consciously thinking about complex problems; for novices thinking does not help in those cases.

  10. The Relationship Between Father Involvement and Child Problem Behaviour in Intact Families: A 7-Year Cross-Lagged Study.

    PubMed

    Flouri, Eirini; Midouhas, Emily; Narayanan, Martina K

    2016-07-01

    This study investigated the cross-lagged relationship between father involvement and child problem behaviour across early-to-middle childhood, and tested whether temperament modulated any cross-lagged child behaviour effects on father involvement. It used data from the first four waves of the UK's Millennium Cohort Study, when children (50.3 % male) were aged 9 months, and 3, 5 and 7 years. The sample was 8302 families where both biological parents were co-resident across the four waves. Father involvement (participation in play and physical and educational activities with the child) was measured at ages 3, 5 and 7, as was child problem behaviour (assessed with the Strengths and Difficulties Questionnaire). Key child and family covariates related to father involvement and child problem behaviour were controlled. Little evidence was found that more father involvement predicted less child problem behaviour two years later, with the exception of father involvement at child's age 5 having a significant, but small, effect on peer problems at age 7. There were two child effects. More hyperactive children at age 3 had more involved fathers at age 5, and children with more conduct problems at age 3 had more involved fathers at age 5. Child temperament did not moderate any child behaviour effects on father involvement. Thus, in young, intact UK families, child adjustment appears to predict, rather than be predicted by, father involvement in early childhood. When children showed more problematic behaviours, fathers did not become less involved. In fact, early hyperactivity and conduct problems in children seemed to elicit more involvement from fathers. At school age, father involvement appeared to affect children's social adjustment rather than vice versa.

  11. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1986-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  12. Comparison of multiobjective evolutionary algorithms: empirical results.

    PubMed

    Zitzler, E; Deb, K; Thiele, L

    2000-01-01

    In this paper, we provide a systematic comparison of various evolutionary approaches to multiobjective optimization using six carefully chosen test functions. Each test function involves a particular feature that is known to cause difficulty in the evolutionary optimization process, mainly in converging to the Pareto-optimal front (e.g., multimodality and deception). By investigating these different problem features separately, it is possible to predict the kind of problems to which a certain technique is or is not well suited. However, in contrast to what was suspected beforehand, the experimental results indicate a hierarchy of the algorithms under consideration. Furthermore, the emerging effects are evidence that the suggested test functions provide sufficient complexity to compare multiobjective optimizers. Finally, elitism is shown to be an important factor for improving evolutionary multiobjective search.
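
    The six test functions themselves are not quoted in this record; as an indication of the kind of two-objective benchmark involved, the sketch below evaluates a ZDT1-style function and extracts the non-dominated subset of a random population (the function form and the population settings are illustrative assumptions).

```python
import numpy as np

def zdt1(x):
    """ZDT1-style two-objective test function on [0, 1]^n (minimisation)."""
    f1 = x[0]
    g = 1.0 + 9.0 * x[1:].sum() / (len(x) - 1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.array([f1, f2])

def non_dominated(objs):
    """Indices of points not dominated by any other point (minimisation)."""
    keep = []
    for i, fi in enumerate(objs):
        dominated = any(np.all(fj <= fi) and np.any(fj < fi)
                        for j, fj in enumerate(objs) if j != i)
        if not dominated:
            keep.append(i)
    return keep

rng = np.random.default_rng(1)
pop = rng.random((200, 30))                       # random population, 30 decision variables
objs = np.array([zdt1(ind) for ind in pop])
front = non_dominated(objs)
print(f"{len(front)} of {len(pop)} random points are currently non-dominated")
```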

  13. The role of sleep problems and circadian clock genes in attention-deficit hyperactivity disorder and mood disorders during childhood and adolescence: an update.

    PubMed

    Dueck, Alexander; Berger, Christoph; Wunsch, Katharina; Thome, Johannes; Cohrs, Stefan; Reis, Olaf; Haessler, Frank

    2017-02-01

    A more recent branch of research describes the importance of sleep problems in the development and treatment of mental disorders in children and adolescents, such as attention-deficit hyperactivity disorder (ADHD) and mood disorders (MD). Research about clock genes has continued since 2012 with a focus on metabolic processes within all parts of the mammalian body, but particularly within different cerebral regions. Research has focused on complex regulatory circuits involving clock genes themselves and their influence on circadian rhythms of diverse body functions. Current publications on basic research in human and animal models indicate directions for the treatment of mental disorders targeting circadian rhythms and mechanisms. The most significant lines of research are described in this paper.

  14. Using the Social Web to Supplement Classical Learning

    NASA Astrophysics Data System (ADS)

    Trausan-Matu, Stefan; Posea, Vlad; Rebedea, Traian; Chiru, Costin

    The paper describes a complex e-learning experiment that has involved over 700 students that attended the Human-Computer Interaction course at the “Politehnica” University of Bucharest during the last 4 years. The experiment consisted in using social web technologies like blogs and chat conferences to engage students in collaborative learning. The paper presents the learning scenario, the problems encountered and the tools developed for solving these problems and assisting tutors in evaluating the activity of the students. The results of the experiment and of using the blog and chat analysis tools are also covered. Moreover, we show the benefits of using such a scenario for the learning community formed by the students that attended this course in order to supplement the classical teaching and learning paradigm.

  15. Auto Draw from Excel Input Files

    NASA Technical Reports Server (NTRS)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communication of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated frequently to reflect new input. The problem is that the drawings often lag the updates, leading to confusion about the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. More frequent updates of system diagrams can reduce confusion and errors, and are likely to uncover symmetric problems earlier in the design cycle, thus reducing rework and redesign.

  16. Dealing with misconduct in biomedical research: a review of the problems and the proposed methods for improvement.

    PubMed

    Kumar, Malhar N

    2009-11-01

    The increasing complexity of scientific research has been followed by increasing varieties of research misconduct. Dealing with misconduct involves the processes of detection, reporting, and investigation of misconduct. Each of these steps is associated with numerous problems which need to be addressed. Misconduct investigation should not stop with inquiries and disciplinary actions in specific episodes of misconduct. It is necessary to decrease the personal price paid by those who expose misconduct and to protect the personal and professional interests of honest researchers accused of misconduct unfairly or mistakenly. There is no dearth of suggestions to improve the objectivity and fairness of investigations. What is needed is the willingness to test the various options and implement the most suitable ones.

  17. An Approach to Management of Gas in the Elderly

    PubMed Central

    Hogan, David B.

    1989-01-01

    In this article I shall review the physiology, clinical manifestations, and management of gaseousness in the elderly. While not an infrequent complaint, little scientific study has been done of the causes or management of this problem. The regulation of bowel gas is surprisingly complex. When problems occur, it is usually either because of excessive swallowing of air or because of the intraluminal production of gas by colonic bacteria. Patients present with excessive belching, abdominal pain and bloating, or excessive passage of flatus. Management is determined, in the main, by the results of the history and physical examination. Medications are usually not indicated. Once a malabsorptive state is ruled out, the mainstay of management usually involves either alterations in the patient's diet or avoidance of aerophagia. PMID:21249002

  18. A comparative study of machine learning models for ethnicity classification

    NASA Astrophysics Data System (ADS)

    Trivedi, Advait; Bessie Amali, D. Geraldine

    2017-11-01

    This paper endeavours to adopt a machine learning approach to solve the problem of ethnicity recognition. Ethnicity identification is an important vision problem whose use cases extend to various domains. Despite the complexity involved, ethnicity identification comes naturally to humans. This meta-information can be leveraged to make several decisions, be it in target marketing or security. With the recent development of intelligent systems, a sub-module that efficiently captures ethnicity would be useful in several use cases. Several attempts to identify an ideal learning model to represent a multi-ethnic dataset have been recorded. A comparative study of classifiers such as support vector machines and logistic regression is documented. Experimental results indicate that the logistic regression classifier provides a more accurate classification than the support vector machine.
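
    A minimal version of such a comparison, using synthetic features in place of the paper's (unavailable) face-derived ethnicity dataset and default scikit-learn models, might look like the following sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data standing in for face-derived ethnicity features.
X, y = make_classification(n_samples=600, n_features=40, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM (RBF kernel)":    make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)     # 5-fold cross-validated accuracy
    print(f"{name:>20}: {scores.mean():.3f} +/- {scores.std():.3f}")
```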

  19. Efficient searching in meshfree methods

    NASA Astrophysics Data System (ADS)

    Olliff, James; Alford, Brad; Simkins, Daniel C.

    2018-04-01

    Meshfree methods such as the Reproducing Kernel Particle Method and the Element Free Galerkin method have proven to be excellent choices for problems involving complex geometry, evolving topology, and large deformation, owing to their ability to model the problem domain without the constraints imposed on Finite Element Method (FEM) meshes. However, meshfree methods have an added computational cost over FEM that comes from at least two sources: the increased cost of shape function evaluation and the determination of adjacency or connectivity. The focus of this paper is to formally address the types of adjacency information that arise in various uses of meshfree methods, discuss the available techniques for computing the various adjacency graphs, propose a new search algorithm and data structure, and finally compare the memory and run-time performance of the methods.
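
    The paper's own search algorithm and data structure are not described in this abstract; the sketch below only shows the baseline adjacency task it addresses, i.e. finding which particles fall inside each kernel support radius, using a k-d tree and a brute-force check (the particle cloud and support radius are invented).

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
pts = rng.random((5000, 3))           # particle positions in the unit cube
support = 0.05                        # kernel support radius (same for all particles)

# k-d tree adjacency: for each particle, the indices of all particles
# within its support radius (symmetric here because all radii are equal).
tree = cKDTree(pts)
adjacency = tree.query_ball_point(pts, r=support)

# brute-force check for one particle
i = 42
d = np.linalg.norm(pts - pts[i], axis=1)
brute = set(np.nonzero(d <= support)[0])
assert brute == set(adjacency[i])
print("particle", i, "has", len(adjacency[i]), "neighbours inside its support")
```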

  20. A Genetic Algorithm Tool (splicer) for Complex Scheduling Problems and the Space Station Freedom Resupply Problem

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Valenzuela-Rendon, Manuel

    1993-01-01

    The Space Station Freedom will require the supply of items in a regular fashion. A schedule for the delivery of these items is not easy to design due to the large span of time involved and the possibility of cancellations and changes in shuttle flights. This paper presents the basic concepts of a genetic algorithm model, and also presents the results of an effort to apply genetic algorithms to the design of propellant resupply schedules. As part of this effort, a simple simulator and an encoding by which a genetic algorithm can find near optimal schedules have been developed. Additionally, this paper proposes ways in which robust schedules, i.e., schedules that can tolerate small changes, can be found using genetic algorithms.
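
    A minimal genetic algorithm in the spirit described, but on an invented toy objective (a binary choice of which flights carry a resupply load, with a shortage penalty) rather than the Splicer tool or real mission data, might look like this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
N_FLIGHTS, DEMAND, CAPACITY = 24, 30.0, 2.5     # toy numbers, not real mission data

def cost(schedule):
    """Penalise unmet demand and prefer using few flights -- a stand-in objective."""
    delivered = schedule.sum() * CAPACITY
    shortage = max(0.0, DEMAND - delivered)
    return 10.0 * shortage + schedule.sum()

def evolve(pop_size=60, generations=100, p_mut=0.02):
    pop = rng.integers(0, 2, size=(pop_size, N_FLIGHTS))      # binary encoding
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_FLIGHTS)                   # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(N_FLIGHTS) < p_mut               # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.array(children)
    best = min(pop, key=cost)
    return best, cost(best)

best, best_cost = evolve()
print("flights used:", int(best.sum()), "cost:", best_cost)
```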

  1. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1989-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  2. Preparing new nurses with complexity science and problem-based learning.

    PubMed

    Hodges, Helen F

    2011-01-01

    Successful nurses function effectively with adaptability, improvability, and interconnectedness, and can see emerging and unpredictable complex problems. Preparing new nurses for complexity requires a significant change in the prevalent but dated nursing education models for rising graduates. The science of complexity, coupled with problem-based learning and peer review, contributes a feasible framework for a constructivist learning environment in which to examine real-time systems data; explore uncertainty, inherent patterns, and ambiguity; and develop skills for unstructured problem solving. This article describes a pilot study of a problem-based learning strategy guided by principles of complexity science in a community clinical nursing course. Thirty-five senior nursing students participated during a 3-year period. Assessments included peer review, a final project paper, reflection, and a satisfaction survey. Results included higher than expected levels of student satisfaction, increased breadth of analysis of complex data, acknowledgment of communities as complex adaptive systems, and overall higher-level thinking skills than in previous years. 2011, SLACK Incorporated.

  3. Fundamental mechanisms that influence the estimate of heat transfer to gas turbine blades

    NASA Technical Reports Server (NTRS)

    Graham, R. W.

    1979-01-01

    Estimates of the heat transfer from the gas to stationary (vanes) or rotating blades pose a major uncertainty due to the complexity of the heat transfer processes. The gas flow through these blade rows is three dimensional, with complex secondary viscous flow patterns that interact with the endwalls and blade surfaces. In addition, upstream disturbances, stagnation flow, curvature effects, and flow acceleration complicate the thermal transport mechanisms in the boundary layers. Some of these fundamental heat transfer effects are discussed. The chief purpose of the discussion is to acquaint those in the heat transfer community not directly involved in gas turbines with the seriousness of the problem and to recommend some basic research that would improve the capability for predicting gas-side heat transfer on turbine blades and vanes.

  4. Multiplexed Predictive Control of a Large Commercial Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Richter, Hanz; Singaraju, Anil; Litt, Jonathan S.

    2008-01-01

    Model predictive control is a strategy well-suited to handle the highly complex, nonlinear, uncertain, and constrained dynamics involved in aircraft engine control problems. However, it has thus far been infeasible to implement model predictive control in engine control applications, because of the combination of model complexity and the time allotted for the control update calculation. In this paper, a multiplexed implementation is proposed that dramatically reduces the computational burden of the quadratic programming optimization that must be solved online as part of the model-predictive-control algorithm. Actuator updates are calculated sequentially and cyclically in a multiplexed implementation, as opposed to the simultaneous optimization taking place in conventional model predictive control. Theoretical aspects are discussed based on a nominal model, and actual computational savings are demonstrated using a realistic commercial engine model.
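
    The engine model and QP machinery are beyond an abstract, but the multiplexing idea itself can be illustrated on a toy two-actuator quadratic cost: each control update re-optimises a single actuator in a cyclic sequence, and the result approaches the jointly optimised solution. All numbers below are invented.

```python
import numpy as np

# Toy strictly convex quadratic cost over two actuator commands u = (u1, u2),
# J(u) = 0.5 u^T H u + g^T u, with box limits -- a stand-in for the QP solved
# in MPC, not the engine model from the paper.
H = np.array([[2.0, 0.6], [0.6, 1.0]])
g = np.array([-1.0, -0.5])
lo, hi = -1.0, 1.0

def J(u):
    return 0.5 * u @ H @ u + g @ u

# "Simultaneous" solution: unconstrained minimiser clipped to the box
# (the unconstrained minimiser happens to lie inside the box here).
u_joint = np.clip(np.linalg.solve(H, -g), lo, hi)

# "Multiplexed" updates: cycle through actuators; each control update
# re-optimises a single coordinate while the other is held fixed.
u = np.zeros(2)
for k in range(10):
    i = k % 2                                          # actuator updated at this step
    # exact 1-D minimiser of the quadratic in u[i] with the others fixed, then clipped
    u[i] = np.clip(-(g[i] + H[i].dot(u) - H[i, i] * u[i]) / H[i, i], lo, hi)
    print(f"update {k}: u = {u.round(4)}, J = {J(u):.5f}")

print("joint QP solution:", u_joint.round(4), "J =", round(J(u_joint), 5))
```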

  5. FPGA-based coprocessor for matrix algorithms implementation

    NASA Astrophysics Data System (ADS)

    Amira, Abbes; Bensaali, Faycal

    2003-03-01

    Matrix algorithms are important in many types of applications, including image and signal processing. These areas require enormous computing power. A close examination of the algorithms used in these and related applications reveals that many of the fundamental actions involve matrix operations such as matrix multiplication, which is of O(N³) complexity on a sequential computer and O(N³/p) on a parallel system with p processors. This paper presents an investigation into the design and implementation of different matrix algorithms such as matrix operations, matrix transforms and matrix decompositions using an FPGA-based environment. Solutions for the problem of processing large matrices have been proposed. The proposed system architectures are scalable, modular and require less area and time complexity with reduced latency when compared with existing structures.

  6. From Paper to PDA: Design and Evaluation of a Clinical Ward Instruction on a Mobile Device

    NASA Astrophysics Data System (ADS)

    Kanstrup, Anne Marie; Stage, Jan

    Mobile devices with small screens and minimal facilities for interaction are increasingly being used in complex human activities for accessing and processing information, while the user is moving. This paper presents a case study of the design and evaluation of a mobile system, which involved transformation of complex text and tables to digital format on a PDA. The application domain was an emergency medical ward, and the user group was junior registrars. We designed a PDA-based system for accessing information, focusing on the ward instruction, implemented a prototype and evaluated it for usability and utility. The evaluation results indicate significant problems in the interaction with the system as well as the extent to which the system is useful for junior registrars in their daily work.

  7. Outer synchronization of complex networks with internal delay and coupling delay via aperiodically intermittent pinning control

    NASA Astrophysics Data System (ADS)

    Zhang, Chuan; Wang, Xingyuan; Wang, Chunpeng; Xia, Zhiqiu

    This paper concerns the outer synchronization problem between two complex delayed networks via the method of aperiodically intermittent pinning control. In contrast to previous works, internal delay and coupling delay are both involved in this model, and the designed intermittent controllers can be aperiodic. The main work in this paper can be summarized as follows. First, two cases of aperiodically intermittent control, with constant gain and with adaptive gain, are implemented; intermittent control and pinning control are combined to further reduce control consumption. Then, based on Lyapunov stability theory, synchronization protocols are derived rigorously. In particular, the designed controllers are simple and valid in moving from theory to practice. Finally, numerical examples put the proposed control methods to the test.

  8. Analysing policy delivery in the United Kingdom: the case of street crime and anti-social behaviour.

    PubMed

    Smith, Martin; Richards, David; Geddes, Andrew; Mathers, Helen

    2011-01-01

    For all governments, the principle of how and whether policies are implemented as intended is fundamental. The aim of this paper is to examine the difficulties for governments in delivering policy goals when they do not directly control the processes of implementation. This paper examines two case studies – anti-social behaviour and street crime – and demonstrates the difficulties faced by policy-makers in translating policy into practice when the policy problems are complex and implementation involves many actors.

  9. Automated image processing of Landsat II digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    Digital image processing of Landsat data from a 230 sq km area was examined as a possible means of generating soil cover information for use in the watershed runoff prediction of Kern County, California. The soil cover information included data on brush, grass, pasture lands, and forests. A classification accuracy of 94% for the Landsat-based soil cover survey suggested that the technique could be applied to the watershed runoff estimate. However, problems involving the survey of complex mountainous environments may require further attention.

  10. Intelligent tutoring systems as tools for investigating individual differences in learning

    NASA Technical Reports Server (NTRS)

    Shute, Valerie J.

    1987-01-01

    The ultimate goal of this research is to build an improved model-based selection and classification system for the United States Air Force. Researchers are developing innovative approaches to ability testing. The Learning Abilities Measurement Program (LAMP) examines individual differences in learning abilities, seeking answers to the questions of why some people learn more and better than others and whether there are basic cognitive processes applicable across tasks and domains that are predictive of successful performance (or whether there are more complex problem solving behaviors involved).

  11. Early-career experts essential for planetary sustainability

    USGS Publications Warehouse

    Lim, Michelle; Lynch, Abigail J.; Fernández-Llamazares, Alvaro; Balint, Lenke; Basher, Zeenatul; Chan, Ivis; Jaureguiberry, Pedro; Mohamed, A.A.A.; Mwampamba, Tuyeni H.; Palomo, Ignacio; Pliscoff, Patricio; Salimov, R.A.; Samakov, Aibek; Selomane, Odirilwe; Shrestha, Uttam B.; Sidorovich, Anna A.

    2017-01-01

    Early-career experts can play a fundamental role in achieving planetary sustainability by bridging generational divides and developing novel solutions to complex problems. We argue that intergenerational partnerships and interdisciplinary collaboration among early-career experts will enable emerging sustainability leaders to contribute fully to a sustainable future. We review 16 international, interdisciplinary, and sustainability-focused early-career capacity building programs. We conclude that such programs are vital to developing sustainability leaders of the future and that decision-making for sustainability is likely to be best served by strong institutional cultures that promote intergenerational learning and involvement.

  12. Frontiers in Fluid Mechanics: A Collection of Research Papers Written in Commemoration of the 65th Birthday of Stanley Corrsin.

    DTIC Science & Technology

    1985-04-30

    Two cases are discussed: non-premixed and premixed combustion. The chemistry of combustion in the gas phase involves complex systems of reaction steps with numerous components. In order to keep the problem tractable, only a greatly simplified and global description of chemistry is employed.

  13. Seasonal Affective Disorder

    PubMed Central

    Rohan, Kelly J.

    2005-01-01

    Seasonal affective disorder (SAD), characterized by fall/winter major depression with spring/summer remission, is a prevalent mental health problem. SAD etiology is not certain, but available models focus on neurotransmitters, hormones, circadian rhythm dysregulation, genetic polymorphisms, and psychological factors. Light therapy is established as the best available treatment for SAD. Alternative and/or supplementary approaches involving medications, cognitive-behavioral therapy, and exercise are currently being developed and evaluated. Given the complexity of the disorder, interdisciplinary research stands to make a significant contribution to advancing our understanding of SAD conceptualization and treatment. PMID:21179639

  14. Explicitly solvable complex Chebyshev approximation problems related to sine polynomials

    NASA Technical Reports Server (NTRS)

    Freund, Roland

    1989-01-01

    Explicitly solvable real Chebyshev approximation problems on the unit interval are typically characterized by simple error curves. A similar principle is presented for complex approximation problems with error curves induced by sine polynomials. As an application, some new explicit formulae for complex best approximations are derived.
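
    For reference, the underlying minimax problem (stated here in standard notation, not the paper's sine-polynomial-specific setup) is

```latex
\min_{p \in \mathcal{P}_n} \; \max_{z \in E} \; \bigl| f(z) - p(z) \bigr| ,
```

    where \mathcal{P}_n denotes polynomials of degree at most n, E is a compact subset of the complex plane, and f is the function being approximated.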

  15. Addressing Complex Challenges through Adaptive Leadership: A Promising Approach to Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Nelson, Tenneisha; Squires, Vicki

    2017-01-01

    Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…

  16. Nanoaggregation of inclusion complexes of glibenclamide with cyclodextrins.

    PubMed

    Lucio, David; Irache, Juan Manuel; Font, María; Martínez-Ohárriz, María Cristina

    2017-03-15

    Glibenclamide is a sulfonylurea used for the oral treatment of type II diabetes mellitus. The drug shows low bioavailability as a consequence of its low solubility. In order to solve this problem, interaction with cyclodextrins (CDs) has been proposed. This study provides an explanation of the processes involved in the formation of GB-βCD complexes, which have been interpreted in different ways by several authors. Among native cyclodextrins, βCD presents the most appropriate cavity to host glibenclamide molecules, showing A_L solubility diagrams (K1:1 ≈ 1700 M⁻¹). However, [Formula: see text] solubility profiles were found for βCD derivatives, highlighting the coexistence of several phenomena involved in the drug solubility enhancement. At low CD concentration, the formation of inclusion complexes can be studied and the stability constants can be calculated (K1:1 ≈ 1400 M⁻¹); at high CD concentration, the enhancement of GB solubility is mainly attributed to the formation of nanoaggregates of CD and GB-CD complexes (sizes between 100 and 300 nm). The inclusion mode into βCD occurs through the cyclohexyl ring of GB, which adopts a semi-folded conformation that maximizes the hydrogen bond network. As a consequence of all these phenomena, a 150-fold enhancement of drug solubility has been achieved using β-cyclodextrin derivatives. Thus, their use has proven to be an interesting tool to improve the oral administration of glibenclamide in accordance with dosage bulk and dose/solubility ratio requirements. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Bhopal: lessons learned

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenly, G.D. Jr.

    1986-03-01

    The risk assessment lesson learned from the Bhopal tragedy is both simple and complex. Practical planning for toxic material releases must start with an understanding of what the risks and possible consequences are. Additionally, plans must be formulated to ensure immediate decisive actions tailored to site specific scenarios, and the possible impacts projected on both the plant and surrounding communities. Most importantly, the planning process must include the communities that could be affected. Such planning will ultimately provide significant financial savings and provide for good public relations, and this makes good business sense in both developed and developing countries. Paraphrasing the adage "a penny saved is a penny earned," a penny spent on emergency preparedness is dollars earned through public awareness. The complex aspect of these simple concepts is overcoming human inertia, i.e., overcoming the "it can't happen here" syndrome in both government and private industry. A world center of excellence (ITRAC), acting as a center for education, research, and development in the area of emergency planning and response, will be the conduit for needed technology transfer to national centers of excellence in emergency planning and response. These national emergency planning and response centers (NARACS), managed by private industry for governments, will be catalysts to action in formulating effective plans involving potentially affected communities and plant management. The ITRAC/NARAC proposal is a simple concept involving complex ideas to solve the simple problem of being prepared for the Bhopal-like emergency which, as experience has demonstrated, will have complex consequences for the unprepared.

  18. Relapsing polychondritis and airway involvement.

    PubMed

    Ernst, Armin; Rafeq, Samaan; Boiselle, Phillip; Sung, Arthur; Reddy, Chakravarthy; Michaud, Gaetane; Majid, Adnan; Herth, Felix J F; Trentham, David

    2009-04-01

    To assess the prevalence and characteristics of airway involvement in relapsing polychondritis (RP). Retrospective chart review and data analysis of RP patients seen in the Rheumatology Clinic and the Complex Airway Center at Beth Israel Deaconess Medical Center from January 2004 through February 2008. RP was diagnosed in 145 patients. Thirty-one patients had airway involvement, a prevalence of 21%. Twenty-two patients were women (70%), and they were between 11 and 61 years of age (median age, 42 years) at the time of first symptoms. Airway symptoms were the first manifestation of disease in 17 patients (54%). Dyspnea was the most common symptom in 20 patients (64%), followed by cough, stridor, and hoarseness. Airway problems included the following: subglottic stenosis (n = 8; 26%); focal and diffuse malacia (n = 15; 48%); and focal stenosis in different areas of the bronchial tree in the rest of the patients. Twelve patients (40%) required and underwent intervention including balloon dilatation, stent placement, tracheotomy, or a combination of the above with good success. The majority of patients experienced improvement in airway symptoms after intervention. One patient died during the follow-up period from the progression of airway disease. The rest of the patients continue to undergo periodic evaluation and intervention. In this largest cohort described in the English language literature, we found symptomatic airway involvement in RP to be common and at times severe. The nature of airway problems is diverse, with tracheomalacia being the most common. Airway intervention is frequently required and in experienced hands results in symptom improvement.

  19. X-linked juvenile retinoschisis: mutations at the retinoschisis and Norrie disease gene loci?

    PubMed

    Hiraoka, M; Rossi, F; Trese, M T; Shastry, B S

    2001-01-01

    Juvenile retinoschisis (RS) and Norrie disease (ND) are X-linked recessive retinal disorders. Both disorders, in the majority of cases, are monogenic and are caused by mutations in the RS and ND genes, respectively. Here we report the identification of a family in which mutations in both the RS and ND genes are segregating with RS pathology. Although the mutations identified in this report were not functionally characterized with regard to their pathogenicity, it is likely that both of them are involved in RS pathology in the family analyzed. This suggests the complexity and digenic nature of monogenic human disorders in some cases. If this proves to be a widespread problem, it will complicate the strategies used to identify the genes involved in diseases and to develop methods for intervention.

  20. Modified allocation capacitated planning model in blood supply chain management

    NASA Astrophysics Data System (ADS)

    Mansur, A.; Vanany, I.; Arvitrida, N. I.

    2018-04-01

    Blood supply chain management (BSCM) is a complex management process that involves many cooperating stakeholders. BSCM comprises four echelon processes: blood collection or procurement, production, inventory, and distribution. This research develops an optimization model for blood distribution planning; the efficiency of decentralization and centralization policies in a blood distribution chain is compared by optimizing the amount of blood delivered from a blood center to each blood bank. The model is built on the allocation problem of a capacitated planning model. In the first stage, transportation capacity and cost are considered to create an initial capacitated planning model; inventory holding and shortage costs are then added. These additional inventory cost parameters make the model more realistic and accurate.
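
    A single-period toy version of such an allocation model (one blood centre, three blood banks, transport cost plus a shortage penalty, and a centre capacity limit; all numbers invented) can be written as a small linear program, for example with scipy:

```python
import numpy as np
from scipy.optimize import linprog

# Toy single-period allocation: decide shipments x_1..x_3 from one blood centre
# to three blood banks, with slack variables s_1..s_3 for unmet demand.
transport = np.array([2.0, 3.0, 4.0])     # cost per unit shipped to each bank
shortage = np.array([20.0, 20.0, 20.0])   # penalty per unit of unmet demand
demand = np.array([40.0, 25.0, 35.0])
centre_capacity = 80.0                    # total units available at the centre

c = np.concatenate([transport, shortage])             # minimise total cost
# shipped + unmet >= demand   ->   -(x_j + s_j) <= -demand_j
A_ub = np.hstack([-np.eye(3), -np.eye(3)])
b_ub = -demand
# total shipped <= centre capacity
A_ub = np.vstack([A_ub, np.concatenate([np.ones(3), np.zeros(3)])])
b_ub = np.concatenate([b_ub, [centre_capacity]])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 6, method="highs")
x, s = res.x[:3], res.x[3:]
print("shipped:", x.round(1), "unmet:", s.round(1), "total cost:", round(res.fun, 1))
```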

  1. Understanding the relationship between followers and leaders.

    PubMed

    Kean, Susanne; Haycock-Stuart, Elaine

    2011-12-01

    Contemporary healthcare policies tend to imply that successful leadership can be attributed to a single leader. Such an understanding of leadership ignores the significant contribution followers make to successful leadership and their influence on leaders. In reality, followers rarely simply follow leaders. Following is a complex process that depends on the context and involves followers making judgements about prospective leaders while deciding whether or not to follow them. This interdependence is ignored all too often or misunderstood by those who see leadership as something that can resolve the problems of the NHS. Using data from a study of leadership in community nursing in which the authors were involved, they argue that senior staff who ignore followers and their contribution to leadership do so at the peril of their organisations.

  2. An analysis of whether a working-age ward-based liaison psychiatry service requires the input of a liaison psychiatrist.

    PubMed

    Guthrie, Elspeth A; McMeekin, Aaron T; Khan, Sylvia; Makin, Sally; Shaw, Ben; Longson, Damien

    2017-06-01

    Aims and method: This article presents a 12-month case series to determine the fraction of ward referrals of adults of working age who needed a liaison psychiatrist in a busy tertiary referral teaching hospital. Results: The service received 344 referrals resulting in 1259 face-to-face contacts. Depression accounted for the most face-to-face contacts. We deemed the involvement of a liaison psychiatrist necessary in 241 (70.1%) referrals, with medication management as the most common reason. Clinical implications: A substantial amount of liaison ward work involves the treatment and management of severe and complex mental health problems. Our analysis suggests that in the majority of cases the input of a liaison psychiatrist is required.

  3. Fully probabilistic control design in an adaptive critic framework.

    PubMed

    Herzallah, Randa; Kárný, Miroslav

    2011-12-01

    An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description from the desired one. Practical exploitation of fully probabilistic design control theory continues to be hindered by the computational complexity of numerically solving the associated stochastic dynamic programming problem, in particular the very hard multivariate integration and the approximate interpolation of the multivariate functions involved. This paper proposes a new fully probabilistic control algorithm that uses adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is the main contribution of this paper. Copyright © 2011 Elsevier Ltd. All rights reserved.
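
    The abstract names the objective but not the formula; schematically, FPD selects the randomized control law that minimizes the Kullback-Leibler divergence of the closed-loop joint density from an "ideal" (desired) closed-loop density. The notation below is a common convention in the FPD literature, not taken from this paper:

```latex
D\!\left(f \,\middle\|\, {}^{I}\!f\right)
  = \int f(d)\,\ln\!\frac{f(d)}{{}^{I}\!f(d)}\,\mathrm{d}d
```

    where d collects the closed-loop data (states and inputs over the control horizon), f is the closed-loop joint density induced by the chosen control law, and {}^{I}f is its desired counterpart; the adaptive-critic scheme in the paper approximates the value function that dynamic programming on this objective would otherwise require.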

  4. Spiced-up ANFO mixture leads to super blasts for casting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chironis, N.P.

    1984-05-01

    There is one problem common to many coal operators in the mountainous regions of western Pennsylvania. As coal seams nearer the crop lines of their mine sites are removed, the overburden heights and stripping ratios increase to about 20-to-1, the range where coal becomes uneconomical to mine. Faced with this situation, a mine operator usually pursues one of four options: 1. Drive a drift mine, which means switching to underground operations with all the complexity and costs involved; 2. Purchase a larger dragline, which involves huge capital expenditures; 3. Bring in an augering machine to auger the exposed seams, a technique effective only for a very limited distance into the highwalls; 4. Discontinue operations, the route most operators take.

  5. Vis-NIR spectrometric determination of Brix and sucrose in sugar production samples using kernel partial least squares with interval selection based on the successive projections algorithm.

    PubMed

    de Almeida, Valber Elias; de Araújo Gomes, Adriano; de Sousa Fernandes, David Douglas; Goicoechea, Héctor Casimiro; Galvão, Roberto Kawakami Harrop; Araújo, Mario Cesar Ugulino

    2018-05-01

    This paper proposes a new variable selection method for nonlinear multivariate calibration, combining the Successive Projections Algorithm for interval selection (iSPA) with the Kernel Partial Least Squares (Kernel-PLS) modelling technique. The proposed iSPA-Kernel-PLS algorithm is employed in a case study involving a Vis-NIR spectrometric dataset with complex nonlinear features. The analytical problem consists of determining Brix and sucrose content in samples from a sugar production system, on the basis of transflectance spectra. As compared to full-spectrum Kernel-PLS, the iSPA-Kernel-PLS models involve a smaller number of variables and display statistically significant superiority in terms of accuracy and/or bias in the predictions. Published by Elsevier B.V.
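
    As a rough sketch of the kernel-PLS component of the approach (the iSPA interval-selection step is omitted and all data and parameters are placeholders), linear PLS can be trained on an RBF kernel matrix computed from the spectra:

```python
# Minimal kernel-PLS sketch (placeholder data; iSPA interval selection omitted).
# Spectra are mapped through an RBF kernel and regressed on with linear PLS,
# a common way to let PLS capture nonlinear spectrum-property relations.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X_train = rng.normal(size=(60, 200))      # 60 training spectra, 200 wavelengths
y_train = rng.normal(size=60)             # e.g. Brix or sucrose reference values
X_test  = rng.normal(size=(20, 200))

gamma = 1e-3                              # kernel width (would be tuned by cross-validation)
K_train = rbf_kernel(X_train, X_train, gamma=gamma)
K_test  = rbf_kernel(X_test,  X_train, gamma=gamma)

pls = PLSRegression(n_components=5)       # number of latent variables (tuned by cross-validation)
pls.fit(K_train, y_train)
y_pred = pls.predict(K_test).ravel()
print(y_pred[:5])
```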

  6. Simulated annealing algorithm for solving chambering student-case assignment problem

    NASA Astrophysics Data System (ADS)

    Ghazali, Saadiah; Abdul-Rahman, Syariza

    2015-12-01

    The project assignment problem is a popular practical problem that arises nowadays. The challenge of solving it grows with the complexity of preferences, the existence of real-world constraints, and the problem size. This study focuses on solving a chambering student-case assignment problem, which is classified as a project assignment problem, by using a simulated annealing algorithm. The project assignment problem is a hard combinatorial optimization problem, and solving it with a metaheuristic approach is advantageous because a good solution can be returned in a reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. In the setting considered, law graduates must complete chambering before they are qualified to practise as legal counsel, so assigning chambering students to cases is critically needed, especially when many preferences are involved. Hence, this study presents a preliminary study of the proposed project assignment problem. The objective is to minimize the total completion time for all students in solving the given cases. A minimum-cost greedy heuristic is employed to construct a feasible initial solution, and the search then proceeds with a simulated annealing algorithm to further improve solution quality. Analysis of the results shows that the proposed simulated annealing algorithm greatly improves the solution constructed by the minimum-cost greedy heuristic. Hence, this research demonstrates the advantages of solving the project assignment problem with metaheuristic techniques.
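
    The abstract does not specify the neighbourhood moves or cooling schedule; the sketch below shows the general greedy-initialisation-plus-simulated-annealing structure it describes, applied to a toy student-case assignment instance (the data, the makespan-style reading of "total completion time", and the annealing parameters are all assumptions):

```python
# Illustrative simulated annealing for a toy student-case assignment problem.
# Cost = completion time of the last-finishing student; a move reassigns one case.
import math
import random

random.seed(1)
n_students, n_cases = 4, 12
# case_time[s][c]: assumed time student s needs to complete case c
case_time = [[random.randint(1, 10) for _ in range(n_cases)] for _ in range(n_students)]

def total_time(assign):                    # assign[c] = student handling case c
    load = [0] * n_students
    for c, s in enumerate(assign):
        load[s] += case_time[s][c]
    return max(load)                       # makespan-style completion time (assumption)

# Greedy initial solution: give each case to the student who would finish it soonest
assign, load = [], [0] * n_students
for c in range(n_cases):
    s = min(range(n_students), key=lambda s: load[s] + case_time[s][c])
    assign.append(s)
    load[s] += case_time[s][c]

best, best_cost = assign[:], total_time(assign)
cur, cur_cost = assign[:], best_cost
T, cooling = 10.0, 0.995                   # assumed initial temperature and cooling rate
for _ in range(5000):
    cand = cur[:]
    cand[random.randrange(n_cases)] = random.randrange(n_students)  # random reassignment move
    cand_cost = total_time(cand)
    if cand_cost < cur_cost or random.random() < math.exp((cur_cost - cand_cost) / T):
        cur, cur_cost = cand, cand_cost
        if cur_cost < best_cost:
            best, best_cost = cur[:], cur_cost
    T *= cooling

print("greedy cost:", total_time(assign), "annealed cost:", best_cost)
```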

  7. When and Why Do Neonatal and Pediatric Critical Care Physicians Consult Palliative Care?

    PubMed

    Richards, Claire A; Starks, Helene; O'Connor, M Rebecca; Bourget, Erica; Lindhorst, Taryn; Hays, Ross; Doorenbos, Ardith Z

    2018-06-01

    Parents of children admitted to neonatal and pediatric intensive care units (ICUs) are at increased risk of experiencing acute and post-traumatic stress disorder. The integration of palliative care may improve child and family outcomes, yet there remains a lack of information about indicators for specialty-level palliative care involvement in this setting. To describe neonatal and pediatric critical care physician perspectives on indicators for when and why to involve palliative care consultants. Semistructured interviews were conducted with 22 attending physicians from neonatal, pediatric, and cardiothoracic ICUs in a single quaternary care pediatric hospital. Transcribed interviews were analyzed using content and thematic analyses. We identified 2 themes related to the indicators for involving palliative care consultants: (1) palliative care expertise including support and bridging communication and (2) organizational factors influencing communication including competing priorities and fragmentation of care. Palliative care was most beneficial for families at risk of experiencing communication problems that resulted from organizational factors, including those with long lengths of stay and medical complexity. The ability of palliative care consultants to bridge communication was limited by some of these same organizational factors. Physicians valued the involvement of palliative care consultants when they improved efficiency and promoted harmony. Given the increasing number of children with complex chronic conditions, it is important to support the capacity of ICU clinical teams to provide primary palliative care. We suggest comprehensive system changes and critical care physician training to include topics related to chronic illness and disability.

  8. Possibilities of the particle finite element method for fluid-soil-structure interaction problems

    NASA Astrophysics Data System (ADS)

    Oñate, Eugenio; Celigueta, Miguel Angel; Idelsohn, Sergio R.; Salazar, Fernando; Suárez, Benjamín

    2011-09-01

    We present some developments in the particle finite element method (PFEM) for the analysis of complex coupled problems in mechanics involving fluid-soil-structure interaction (FSSI). The PFEM uses an updated Lagrangian description to model the motion of nodes (particles) in both the fluid and the solid domains (the latter including soil/rock and structures). A mesh connects the particles (nodes), defining the discretized domain where the governing equations for each of the constituent materials are solved as in the standard FEM. Stabilization for dealing with the incompressible continuum is introduced via the finite calculus method. An incremental iterative scheme for the solution of the nonlinear transient coupled FSSI problem is described. The procedure for modelling frictional contact conditions and material erosion at fluid-solid and solid-solid interfaces is also described. We present several examples of the application of the PFEM to FSSI problems, such as the motion of rocks by water streams, the erosion of a river bed adjacent to a bridge foundation, the stability of breakwaters and constructions under sea waves, and the study of landslides.

  9. Evolution, opportunity and challenges of transboundary water and energy problems in Central Asia.

    PubMed

    Guo, Lidan; Zhou, Haiwei; Xia, Ziqiang; Huang, Feng

    2016-01-01

    Central Asia is one of the regions suffering the most prominent transboundary water and energy problems in the world. Effective transboundary water-energy resource management and cooperation are closely related to socioeconomic development and stability across Central Asia. Like Central Asia, Northwest China has an arid climate, is experiencing a water shortage, and now faces an imbalance between the supply of and demand for water and energy resources. These issues in Northwest China and Central Asia pose severe challenges to the implementation of the Silk Road Economic Belt strategy. Based on an analysis of water and energy distribution characteristics in Central Asia and the demand characteristics of the different countries, the complexity of local transboundary water problems is explored by reviewing the corresponding historical problems of the countries involved, correlated energy issues, and the evolution of inter-country water-energy cooperation. Drawing on the experiences and lessons of the five countries, the contradictions, opportunities, challenges and strategies for transboundary water-energy cooperation between China and Central Asia are discussed in the context of the Silk Road Economic Belt construction and current cooperation conditions.

  10. Using Technology to Facilitate and Enhance Project-based Learning in Mathematical Physics

    NASA Astrophysics Data System (ADS)

    Duda, Gintaras

    2011-04-01

    Problem-based and project-based learning are two pedagogical techniques that have several clear advantages over traditional instructional methods: 1) both techniques are active and student centered, 2) students confront real-world and/or highly complex problems, and 3) such exercises model the way science and engineering are done professionally. This talk will present an experiment in project/problem-based learning in a mathematical physics course. The group project in the course involved modeling a zombie outbreak of the type seen in AMC's "The Walking Dead." Students researched, devised, and solved their mathematical models for the spread of zombie-like infection. Students used technology in all stages; in fact, since analytical solutions to the models were often impossible, technology was a necessary and critical component of the challenge. This talk will explore the use of technology in general in problem and project-based learning and will detail some specific examples of how technology was used to enhance student learning in this course. A larger issue of how students use the Internet to learn will also be explored.
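
    The students' actual models are not given in the abstract; a minimal sketch of the kind of compartmental outbreak model such a project might solve numerically (a susceptible-zombie-removed system loosely in the spirit of Munz et al., with invented parameters) is:

```python
# Minimal susceptible-zombie-removed (SZR) outbreak sketch (illustrative
# parameters; not the students' actual models from the course).
import numpy as np
from scipy.integrate import solve_ivp

beta, zeta, alpha = 0.0095, 0.0001, 0.005   # transmission, resurrection, kill rates (assumed)

def szr(t, y):
    S, Z, R = y
    dS = -beta * S * Z                              # humans bitten and turned
    dZ = beta * S * Z + zeta * R - alpha * S * Z    # new zombies, resurrections, kills
    dR = alpha * S * Z - zeta * R                   # destroyed zombies (may resurrect)
    return [dS, dZ, dR]

sol = solve_ivp(szr, (0.0, 30.0), [500.0, 1.0, 0.0], dense_output=True)
t = np.linspace(0, 30, 7)
print(np.round(sol.sol(t), 1))                      # S, Z, R populations over time
```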

  11. Scheduling multirobot operations in manufacturing by truncated Petri nets

    NASA Astrophysics Data System (ADS)

    Chen, Qin; Luh, J. Y.

    1995-08-01

    Scheduling of operational sequences in manufacturing processes is one of the important problems in automation. Methods of applying Petri nets to model and analyze the problem with constraints on precedence relations, multiple resource allocation, etc., are available in the literature. Searching for an optimum schedule can be implemented by combining the branch-and-bound technique with the execution of the timed Petri net. The process usually produces a large Petri net that is practically unmanageable. This disadvantage, however, can be handled by a truncation technique that divides the original large Petri net into several smaller subnets. The complexity involved in analyzing each subnet individually is greatly reduced. However, when the locally optimum schedules of the resulting subnets are combined, they may not yield an overall optimum schedule for the original Petri net. To circumvent this problem, algorithms are developed based on the concepts of Petri net execution and a modified branch-and-bound process. The developed technique is applied to a multi-robot task scheduling problem in a manufacturing work cell.

  12. EMILiO: a fast algorithm for genome-scale strain design.

    PubMed

    Yang, Laurence; Cluett, William R; Mahadevan, Radhakrishnan

    2011-05-01

    Systems-level design of cell metabolism is becoming increasingly important for renewable production of fuels, chemicals, and drugs. Computational models are improving in the accuracy and scope of predictions, but are also growing in complexity. Consequently, efficient and scalable algorithms are increasingly important for strain design. Previous algorithms helped to consolidate the utility of computational modeling in this field. To meet intensifying demands for high-performance strains, both the number and variety of genetic manipulations involved in strain construction are increasing. Existing algorithms have experienced combinatorial increases in computational complexity when applied toward the design of such complex strains. Here, we present EMILiO, a new algorithm that increases the scope of strain design to include reactions with individually optimized fluxes. Unlike existing approaches that would experience an explosion in complexity to solve this problem, we efficiently generated numerous alternate strain designs producing succinate, l-glutamate and l-serine. This was enabled by successive linear programming, a technique new to the area of computational strain design. Copyright © 2011 Elsevier Inc. All rights reserved.
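
    EMILiO's successive linear programming procedure is not detailed in the abstract; the flux-balance LP that such strain-design methods repeatedly solve can, however, be sketched on a toy network (the stoichiometry, bounds and biomass objective below are invented, and this is not EMILiO itself):

```python
# Toy flux-balance analysis LP: maximize a "biomass" flux subject to
# steady-state stoichiometry S v = 0 and flux bounds (illustrative network,
# not EMILiO's successive-LP strain-design procedure).
import numpy as np
from scipy.optimize import linprog

# Columns: uptake, v1, v2, biomass; rows: internal metabolites A, B
S = np.array([[ 1.0, -1.0, -1.0,  0.0],
              [ 0.0,  1.0,  1.0, -1.0]])
bounds = [(0, 10), (0, 5), (0, 5), (0, None)]   # a gene knockout would set a bound to (0, 0)

c = np.array([0.0, 0.0, 0.0, -1.0])             # maximize biomass = minimize -biomass
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, "biomass:", -res.fun)
```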

  13. Midbond basis functions for weakly bound complexes

    NASA Astrophysics Data System (ADS)

    Shaw, Robert A.; Hill, J. Grant

    2018-06-01

    Weakly bound systems present a difficult problem for conventional atom-centred basis sets due to large separations, necessitating the use of large, computationally expensive bases. This can be remedied by placing a small number of functions in the region between molecules in the complex. We present compact sets of optimised midbond functions for a range of complexes involving noble gases, alkali metals and small molecules for use in high-accuracy coupled-cluster calculations, along with a more robust procedure for their optimisation. It is shown that excellent results are possible with double-zeta quality orbital basis sets when a few midbond functions are added, improving both the interaction energy and the equilibrium bond lengths of a series of noble gas dimers by 47% and 8%, respectively. When used in conjunction with explicitly correlated methods, near complete basis set limit accuracy is readily achievable at a fraction of the cost that using a large basis would entail. General purpose auxiliary sets are developed to allow explicitly correlated midbond function studies to be carried out, making it feasible to perform very high accuracy calculations on weakly bound complexes.

  14. Parasites, ecosystems and sustainability: an ecological and complex systems perspective.

    PubMed

    Horwitz, Pierre; Wilcox, Bruce A

    2005-06-01

    Host-parasite relationships can be conceptualised either narrowly, where the parasite is metabolically dependent on the host, or more broadly, as suggested by an ecological-evolutionary and complex systems perspective. In this view, host-parasite relationships are part of a larger set of ecological and co-evolutionary interdependencies and a complex adaptive system. These interdependencies affect not just the hosts, vectors and parasites (the immediate agents), but also those indirectly or consequentially affected by the relationship. Host-parasite relationships can also be viewed as systems embedded within larger systems represented by ecological communities and ecosystems. So defined, it can be argued that host-parasite relationships may often benefit their hosts and contribute significantly to the structuring of ecological communities. The broader, complex adaptive system view also contributes to understanding the phenomenon of disease emergence, the ecological and evolutionary mechanisms involved, and the role of parasitology in the research and management of ecosystems in light of the apparently growing problem of emerging infectious diseases in wildlife and humans. An expanded set of principles for integrated parasite management is suggested by this perspective.

  15. Development of Six Sigma methodology for CNC milling process improvements

    NASA Astrophysics Data System (ADS)

    Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab

    2017-10-01

    Quality and productivity play an important role in any organization, especially in manufacturing sectors seeking greater profit, which leads to the success of a company. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves identifying problems in the production of the “Khufi” product and proposing an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem identified relates to the function of the product: the parts cannot be assembled properly because the product dimensions are out of specification. Six Sigma was used as the methodology to study and improve on the problems identified. Six Sigma is a highly statistical and data-driven approach to solving complex business problems. It uses a methodical five-phase approach of define, measure, analyse, improve and control (DMAIC) to help understand the process and the variables that affect it so that the process can be optimized. Finally, the root cause of and solution to the “Khufi” production problem were identified and implemented, after which the product successfully met the fitting specification.
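
    The statistical tools used in the measure/analyse phases are not reported in the abstract; a typical Six Sigma style check on a dimension that must stay within specification is a process-capability calculation such as the sketch below (the specification limits and measurements are invented):

```python
# Illustrative process-capability (Cp, Cpk) check for a machined dimension,
# a typical "measure/analyse" step in DMAIC (made-up spec limits and data).
import numpy as np

lsl, usl = 24.95, 25.05                  # assumed lower/upper specification limits (mm)
dims = np.array([25.01, 25.03, 24.98, 25.06, 25.02, 25.04, 24.99, 25.05,
                 25.07, 25.00, 25.03, 25.02])   # sampled WIP measurements (mm, invented)

mu, sigma = dims.mean(), dims.std(ddof=1)
cp  = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f"mean={mu:.3f}  sigma={sigma:.4f}  Cp={cp:.2f}  Cpk={cpk:.2f}")
# A Cpk well below ~1.33 signals that out-of-spec parts (like the assembly
# failures described for the "Khufi" product) are to be expected, so the
# process needs centring and variance reduction.
```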

  16. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.

  17. Practical problems in aggregating expert opinions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Booker, J.M.; Picard, R.R.; Meyer, M.A.

    1993-11-01

    Expert opinion is data given by a qualified person in response to a technical question. In these analyses, expert opinion provides information where other data are either sparse or non-existent. Improvements in forecasting result from the advantageous addition of expert opinion to observed data in many areas, such as meteorology and econometrics. More generally, analyses of large, complex systems often involve experts on various components of the system supplying input to a decision process; applications include such wide-ranging areas as nuclear reactor safety, management science, and seismology. For large or complex applications, no single expert may be knowledgeable enough about the entire application. In other problems, decision makers may find it comforting that a consensus or aggregation of opinions is usually better than a single opinion. Many risk and reliability studies require a single estimate for modeling, analysis, reporting, and decision making purposes. For problems with large uncertainties, the strategy of combining as diverse a set of experts as possible hedges against underestimation of that uncertainty. Decision makers are frequently faced with the task of selecting the experts and combining their opinions. However, the aggregation is often the responsibility of an analyst. Whether the decision maker or the analyst does the aggregation, the input for it, such as providing weights for experts or estimating other parameters, is imperfect owing to a lack of omniscience. Aggregation methods for expert opinions have existed for over thirty years; yet many of the difficulties with their use remain unresolved. The bulk of these problem areas are summarized in the sections that follow: sensitivities of results to assumptions, weights for experts, correlation of experts, and handling uncertainties. The purpose of this paper is to discuss the sources of these problems and describe their effects on aggregation.
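
    The paper surveys aggregation difficulties rather than prescribing a single method; one of the simplest schemes in this area, a weighted linear opinion pool over expert probability estimates, can be sketched as follows (the weights and estimates are invented, and choosing the weights is itself one of the unresolved problems discussed):

```python
# Simple weighted linear opinion pool for aggregating expert probability
# estimates of an event (illustrative numbers; assigning expert weights is
# itself one of the difficulties the paper discusses).
import numpy as np

expert_probs = np.array([0.02, 0.10, 0.05, 0.08])   # each expert's P(failure), assumed
weights      = np.array([0.4, 0.2, 0.3, 0.1])       # analyst-assigned credibility weights, assumed

pooled = np.dot(weights, expert_probs) / weights.sum()
print(f"pooled estimate: {pooled:.3f}")
```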

  18. Hierarchical calibration and validation for modeling bench-scale solvent-based carbon capture. Part 1: Non-reactive physical mass transfer across the wetted wall column

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chao; Xu, Zhijie; Lai, Canhai

    A hierarchical model calibration and validation is proposed for quantifying the confidence level of mass transfer predictions from a computational fluid dynamics (CFD) model, in which solvent-based carbon dioxide (CO2) capture is simulated and the simulation results are compared to parallel bench-scale experimental data. Two unit problems of increasing complexity are proposed to break down the complex physical/chemical processes of solvent-based CO2 capture into relatively simpler problems that separate the effects of physical transport and chemical reaction. This paper focuses on the calibration and validation of the first unit problem, i.e. CO2 mass transfer across a falling ethanolamine (MEA) film in the absence of chemical reaction. The problem is investigated both experimentally and numerically using nitrous oxide (N2O) as a surrogate for CO2. To capture the motion of the gas-liquid interface, a volume-of-fluid method is employed together with a one-fluid formulation to compute the mass transfer between the two phases. Parallel bench-scale experiments are designed and conducted to validate and calibrate the CFD models using a general Bayesian calibration. Two important transport parameters, i.e. Henry's constant and gas diffusivity, are calibrated to produce posterior distributions, which will be used as input for the second unit problem addressing the chemical absorption of CO2 across the MEA falling film, where both mass transfer and chemical reaction are involved.
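
    The paper's Bayesian calibration is performed against the CFD model; as a bare-bones illustration of the calibration machinery only, the sketch below runs random-walk Metropolis for two transport parameters against synthetic flux data, using a crude penetration-theory stand-in for the forward model (all of it invented, not the paper's setup):

```python
# Bare-bones random-walk Metropolis calibration of two transport parameters
# (Henry's constant H, diffusivity D) against synthetic flux data.
# The forward model is a crude penetration-theory stand-in, NOT the paper's CFD model.
import numpy as np

rng = np.random.default_rng(0)

def forward(H, D, t):
    # absorbed flux ~ (p/H) * sqrt(D / (pi t)); pressure p folded into the units
    return (1.0 / H) * np.sqrt(D / (np.pi * t))

t_obs = np.linspace(1.0, 10.0, 8)
H_true, D_true, noise = 2.0, 1.5e-9, 2e-6
y_obs = forward(H_true, D_true, t_obs) + rng.normal(0, noise, t_obs.size)

def log_post(theta):
    H, D = theta
    if H <= 0 or D <= 0:
        return -np.inf                       # flat priors on positive values
    resid = y_obs - forward(H, D, t_obs)
    return -0.5 * np.sum((resid / noise) ** 2)

theta = np.array([1.0, 1.0e-9])              # initial guess
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.05, 5e-11])   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[5000:])              # drop burn-in
print("posterior means:", post.mean(axis=0))
# Note: with this toy forward model only the combination sqrt(D)/H is well
# identified, so the chain explores a ridge; the real CFD model constrains both.
```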

  19. The role of adaptive management as an operational approach for resource management agencies

    USGS Publications Warehouse

    Johnson, B.L.

    1999-01-01

    In making resource management decisions, agencies use a variety of approaches that involve different levels of political concern, historical precedence, data analyses, and evaluation. Traditional decision-making approaches have often failed to achieve objectives for complex problems in large systems, such as the Everglades or the Colorado River. I contend that adaptive management is the best approach available to agencies for addressing this type of complex problem, although its success has been limited thus far. Traditional decision-making approaches have been fairly successful at addressing relatively straightforward problems in small, replicated systems, such as management of trout in small streams or pulp production in forests. However, this success may be jeopardized as more users place increasing demands on these systems. Adaptive management has received little attention from agencies for addressing problems in small-scale systems, but I suggest that it may be a useful approach for creating a holistic view of common problems and developing guidelines that can then be used in simpler, more traditional approaches to management. Although adaptive management may be more expensive to initiate than traditional approaches, it may be less expensive in the long run if it leads to more effective management. The overall goal of adaptive management is not to maintain an optimal condition of the resource, but to develop an optimal management capacity. This is accomplished by maintaining ecological resilience that allows the system to react to inevitable stresses, and generating flexibility in institutions and stakeholders that allows managers to react when conditions change. The result is that, rather than managing for a single, optimal state, we manage within a range of acceptable outcomes while avoiding catastrophes and irreversible negative effects. Copyright © 1999 by The Resilience Alliance.

  20. Paternal ADHD symptoms and child conduct problems: is father involvement always beneficial?

    PubMed

    Romirowsky, A M; Chronis-Tuscano, A

    2014-09-01

    Maternal psychopathology robustly predicts poor developmental and treatment outcomes for children with attention-deficit/hyperactivity disorder (ADHD). Despite the high heritability of ADHD, few studies have examined associations between paternal ADHD symptoms and child adjustment, and none have also considered degree of paternal involvement in childrearing. Identification of modifiable risk factors for child conduct problems is particularly important in this population given the serious adverse outcomes resulting from this comorbidity. This cross-sectional study examined the extent to which paternal involvement in childrearing moderated the association between paternal ADHD symptoms and child conduct problems among 37 children with ADHD and their biological fathers. Neither paternal ADHD symptoms nor involvement was independently associated with child conduct problems. However, the interaction between paternal ADHD symptoms and involvement was significant, such that paternal ADHD symptoms were positively associated with child conduct problems only when fathers were highly involved in childrearing. The presence of adult ADHD symptoms may determine whether father involvement in childrearing has a positive or detrimental influence on comorbid child conduct problems.
