Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement
ERIC Educational Resources Information Center
Zheng, Robert; Cook, Anne
2012-01-01
The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…
ERIC Educational Resources Information Center
Cerruti, Carlo; Schlaug, Gottfried
2009-01-01
The remote associates test (RAT) is a complex verbal task with associations to both creative thought and general intelligence. RAT problems require not only lateral associations and the internal production of many words but a convergent focus on a single answer. Complex problem-solving of this sort may thus require both substantial verbal…
Ecosystem services and cooperative fisheries research to address a complex fishery problem
The St. Louis River represents a complex fishery management problem. Current fishery management goals have to be developed taking into account bi-state commercial, subsistence and recreational fisheries which are valued for different characteristics by a wide range of anglers, as...
1977-12-01
exponentials encountered are complex and they are approximately at harmonic frequencies. Moreover, the real parts of the complex exponentials are much...functions as a basis for expanding the current distribution on an antenna by the method of moments results in a regularized ill-posed problem with respect...to the current distribution on the antenna structure. However, the problem is not regularized with respect to charge because the charge distribution
ERIC Educational Resources Information Center
Greiff, Samuel; Wustenberg, Sascha; Molnar, Gyongyver; Fischer, Andreas; Funke, Joachim; Csapo, Beno
2013-01-01
Innovative assessments of cross-curricular competencies such as complex problem solving (CPS) have currently received considerable attention in large-scale educational studies. This study investigated the nature of CPS by applying a state-of-the-art approach to assess CPS in high school. We analyzed whether two processes derived from cognitive…
Managing Complex Problems in Rangeland Ecosystems
USDA-ARS?s Scientific Manuscript database
Management of rangelands, and natural resources in general, has become increasingly complex. There is an atmosphere of increasing expectations for conservation efforts associated with a variety of issues from water quality to endangered species. We argue that many current issues are complex by their...
Chinese Algebra: Using Historical Problems to Think about Current Curricula
ERIC Educational Resources Information Center
Tillema, Erik
2005-01-01
The Chinese used the idea of generating equivalent expressions to solve problems. Problems from a historical Chinese text are studied to understand the ways in which these ideas can lead into algebraic calculations and help students learn algebra. The texts unify algebraic problem solving through complex algebraic thought and afford…
The Effect of a Complex (3-Week) Therapy on the Hip and Knee Joints in Obese Patients
ERIC Educational Resources Information Center
Tóvári, Anett; Hermann, Mária; Tóvári, Ferenc; Prisztóka, Gyöngyvér; Kránicz, János
2015-01-01
Currently, overweight and obesity are the most widespread lifestyle problems, having a significant impact on everyday life and, thus, on the conduct of life. Further contributory problems may develop in patients with weight problems: deformities of the joints and skeleton (coxarthrosis and gonarthrosis), circulatory problems, and arrhythmia. Overweight…
On Complex Water Conflicts: Role of Enabling Conditions for Pragmatic Resolution
NASA Astrophysics Data System (ADS)
Islam, S.; Choudhury, E.
2016-12-01
Many of our current and emerging water problems are interconnected and cross boundaries, domains, scales, and sectors. These boundary-crossing water problems are neither static nor linear; they are often interconnected nonlinearly with other problems through feedback. The solution space for these complex problems - involving interdependent variables, processes, actors, and institutions - cannot be pre-stated. We need to recognize the disconnect among values, interests, and tools as well as among problems, policies, and politics. Scientific and technological solutions are desired for efficiency and reliability, but need to be politically feasible and actionable. Governing and managing complex water problems require difficult tradeoffs in exploring and sharing benefits and burdens through carefully crafted negotiation processes. The crafting of such negotiation processes, we argue, constitutes a pragmatic approach to negotiation - one that is based on the identification of enabling conditions, as opposed to mechanistic causal explanations, and rooted in contextual conditions to specify and ensure the principles of equity and sustainability. We will use two case studies to demonstrate the efficacy of the proposed principled pragmatic approach to address complex water problems.
NASA Astrophysics Data System (ADS)
Zittersteijn, Michiel; Schildknecht, Thomas; Vananti, Alessandro; Dolado Perez, Juan Carlos; Martinot, Vincent
2016-07-01
Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention. This problem is also known as the Multiple Target Tracking (MTT) problem. The complexity of the MTT problem is defined by its dimension S. Current research tends to focus on the S = 2 MTT problem. The reason for this is that for S = 2 the problem has a P-complexity. However, with S = 2 the decision to associate a set of observations is based on the minimum amount of information, in ambiguous situations (e.g. satellite clusters) this will lead to incorrect associations. The S > 2 MTT problem is an NP-hard combinatorial optimization problem. In previous work an Elitist Genetic Algorithm (EGA) was proposed as a method to approximately solve this problem. It was shown that the EGA is able to find a good approximate solution with a polynomial time complexity. The EGA relies on solving the Lambert problem in order to perform the necessary orbit determinations. This means that the algorithm is restricted to orbits that are described by Keplerian motion. The work presented in this paper focuses on the impact that this restriction has on the algorithm performance.
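The elitist strategy this abstract relies on can be sketched generically. The sketch below is illustrative only: the authors' EGA encodes observation-to-object associations and scores candidates via Lambert-problem orbit determinations, both of which are omitted here, so the fitness and operators are placeholders supplied by the caller.

```python
import random

def elitist_ga(fitness, random_individual, mutate, crossover,
               pop_size=50, elites=5, generations=100):
    """Generic elitist genetic algorithm skeleton (illustrative sketch).

    The `elites` best individuals are copied unchanged into the next
    generation, so the best fitness found never decreases; the rest of the
    population is refilled by crossover and mutation of above-median parents.
    """
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        nxt = pop[:elites]                                # elitism
        while len(nxt) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)  # truncation selection
            nxt.append(mutate(crossover(a, b)))
        pop = nxt
    return max(pop, key=fitness)
```

Elitism is what makes the best-so-far solution monotone across generations; the polynomial time complexity reported in the abstract comes from bounding population size and generation count, not from this skeleton.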
MANAGING ELECTRONIC DATA TRANSFER IN ENVIRONMENTAL CLEANUPS
The use of computers and electronic information poses a complex problem for potential litigation in space law. The problem currently manifests itself in at least two ways. First, the Environmental Protection Agency (EPA) enforcement of Comprehensive Environmental Response, Compen...
A Study of Complex Deep Learning Networks on High Performance, Neuromorphic, and Quantum Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas E; Schuman, Catherine D; Young, Steven R
Current Deep Learning models use highly optimized convolutional neural networks (CNN) trained on large graphical processing unit (GPU)-based computers with a fairly simple layered network topology, i.e., highly connected layers, without intra-layer connections. Complex topologies have been proposed, but are intractable to train on current systems. Building the topologies of the deep learning network requires hand tuning, and implementing the network in hardware is expensive in both cost and power. In this paper, we evaluate deep learning models using three different computing architectures to address these problems: quantum computing to train complex topologies, high performance computing (HPC) to automatically determine network topology, and neuromorphic computing for a low-power hardware implementation. Due to input size limitations of current quantum computers we use the MNIST dataset for our evaluation. The results show the possibility of using the three architectures in tandem to explore complex deep learning networks that are untrainable using a von Neumann architecture. We show that a quantum computer can find high quality values of intra-layer connections and weights, while yielding a tractable time result as the complexity of the network increases; a high performance computer can find optimal layer-based topologies; and a neuromorphic computer can represent the complex topology and weights derived from the other architectures in low power memristive hardware. This represents a new capability that is not feasible with current von Neumann architectures. It potentially enables the ability to solve very complicated problems unsolvable with current computing technologies.
Toward Group Problem Solving Guidelines for 21st Century Teams
ERIC Educational Resources Information Center
Ranieri, Kathryn L.
2004-01-01
Effective problem-solving skills are critical in dealing with ambiguous and often complex issues in the present-day leaner and globally diverse organizations. Yet respected, well-established problem-solving models may be misaligned within the current work environment, particularly within a team context. Models learned from a more bureaucratic,…
Complex space monofilar approximation of diffraction currents on a conducting half plane
NASA Technical Reports Server (NTRS)
Lindell, I. V.
1987-01-01
A simple approximation of the diffraction surface currents on a conducting half plane, due to an incoming plane wave, is obtained with a line current (monofilar) in complex space. Compared to an approximating current at the edge, the diffraction pattern is seen to improve by an order of magnitude for a minimal increase in computation effort. Thus, the inconvenient Fresnel integral functions can be avoided for quick calculations of diffracted fields, and the accuracy is good in directions other than along the half plane. The method can be applied to general problems involving planar metal edges.
Analytical approaches to optimizing system "Semiconductor converter-electric drive complex"
NASA Astrophysics Data System (ADS)
Kormilicin, N. V.; Zhuravlev, A. M.; Khayatov, E. S.
2018-03-01
In the electric drives of the machine-building industry, the problem of optimizing the drive in terms of mass-size indicators is acute. The article offers analytical methods that ensure the minimization of the mass of a multiphase semiconductor converter. In multiphase electric drives, the form of the phase current that makes the best use of the active materials of the "semiconductor converter-electric drive" complex differs from the sinusoidal form. It is shown that under certain restrictions on the phase current form, it is possible to obtain an analytical solution. In particular, if one assumes the shape of the phase current to be rectangular, the optimal shape of the control actions will depend on the width of the interpolar gap. In the general case, the proposed algorithm can be used to solve the problem under consideration by numerical methods.
ERIC Educational Resources Information Center
Ohlsson, Stellan
2012-01-01
The research paradigm invented by Allen Newell and Herbert A. Simon in the late 1950s dominated the study of problem solving for more than three decades. But in the early 1990s, problem solving ceased to drive research on complex cognition. As part of this decline, Newell and Simon's most innovative research practices--especially their method for…
Translating concepts of complexity to the field of ergonomics.
Walker, Guy H; Stanton, Neville A; Salmon, Paul M; Jenkins, Daniel P; Rafferty, Laura
2010-10-01
Since 1958 more than 80 journal papers from the mainstream ergonomics literature have used either the words 'complex' or 'complexity' in their titles. Of those, more than 90% have been published in only the past 20 years. This observation communicates something interesting about the way in which contemporary ergonomics problems are being understood. The study of complexity itself derives from non-linear mathematics but many of its core concepts have found analogies in numerous non-mathematical domains. Set against this cross-disciplinary background, the current paper aims to provide a similar initial mapping to the field of ergonomics. In it, the ergonomics problem space, complexity metrics and powerful concepts such as emergence raise complexity to the status of an important contingency factor in achieving a match between ergonomics problems and ergonomics methods. The concept of relative predictive efficiency is used to illustrate how this match could be achieved in practice. What is clear overall is that a major source of, and solution to, complexity are the humans in systems. Understanding complexity on its own terms offers the potential to leverage disproportionate effects from ergonomics interventions and to tighten up the often loose usage of the term in the titles of ergonomics papers. STATEMENT OF RELEVANCE: This paper reviews and discusses concepts from the study of complexity and maps them to ergonomics problems and methods. It concludes that humans are a major source of and solution to complexity in systems and that complexity is a powerful contingency factor, which should be considered to ensure that ergonomics approaches match the true nature of ergonomics problems.
USDA-ARS?s Scientific Manuscript database
Present-day environmental problems of Dryland East Asia are serious, and future prospects look especially disconcerting owing to current trends in population growth and economic development. Land degradation and desertification, invasive species, biodiversity losses, toxic waste and air pollution, a...
Cape, John; Morris, Elena; Burd, Mary; Buszewicz, Marta
2008-01-01
Background: How GPs understand mental health problems determines their treatment choices; however, measures describing GPs' thinking about such problems are not currently available. Aim: To develop a measure of the complexity of GP explanations of common mental health problems and to pilot its reliability and validity. Design of study: A qualitative development of the measure, followed by inter-rater reliability and validation pilot studies. Setting: General practices in North London. Method: Vignettes of simulated consultations with patients with mental health problems were videotaped, and an anchored measure of complexity of psychosocial explanation in response to these vignettes was developed. Six GPs, four psychologists, and two lay people viewed the vignettes. Their responses were rated for complexity, both using the anchored measure and independently by two experts in primary care mental health. In a second reliability and revalidation study, responses of 50 GPs to two vignettes were rated for complexity. The GPs also completed a questionnaire to determine their interest and training in mental health, and they completed the Depression Attitudes Questionnaire. Results: Inter-rater reliability of the measure of complexity of explanation in both pilot studies was satisfactory (intraclass correlation coefficient = 0.78 and 0.72). The measure correlated with expert opinion as to what constitutes a complex explanation, and the responses of psychologists, GPs, and lay people differed in measured complexity. GPs with higher complexity scores had greater interest, more training in mental health, and more positive attitudes to depression. Conclusion: Results suggest that the complexity of GPs' psychosocial explanations about common mental health problems can be reliably and validly assessed by this new standardised measure. PMID:18505616
Network Access Control List Situation Awareness
ERIC Educational Resources Information Center
Reifers, Andrew
2010-01-01
Network security is a large and complex problem being addressed by multiple communities. Nevertheless, current theories in networking security appear to overestimate network administrators' ability to understand network access control lists (NACLs), providing few context specific user analyses. Consequently, the current research generally seems to…
Improved Intelligence Warning in an Age of Complexity
2015-05-21
at, and applying complexity science to this problem, which is represented by a multidiscipline study of large networks comprised of interdependent...For analysts and policy makers, complexity science offers methods to improve this understanding. As said by Ms. Irene Sanders, director of the... science to improve intelligence warning. The initial section describes how policy makers and national security leaders understand the current
Fast solver for large scale eddy current non-destructive evaluation problems
NASA Astrophysics Data System (ADS)
Lei, Naiguang
Eddy current testing plays a very important role in the non-destructive evaluation of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material property or defects in the test specimen, the induced eddy current paths are perturbed and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Due to the complexity of the test specimen and the inspection environments, theoretical simulation models are extremely valuable for studying the basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validation of automated defect detection systems, since they generate defect signatures that are expensive to replicate experimentally. In general, modelling methods can be classified into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, these are generally not obtainable, largely due to the complex sample and defect geometries, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, because large scale problems are very time consuming, accelerations/fast solvers are needed to enhance numerical models. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. Validation of the accuracy of this model is demonstrated via comparison with experimental measurements of steam generator tube wall defects.
These simulations generating two-dimension raster scan data typically takes one to two days on a dedicated eight-core PC. A novel direct integral solver for eddy current problems and GPU-based implementation is also investigated in this research to reduce the computational time.
Neil R. Honeycutt
1995-01-01
The urban and wildland interface (mix) problem exists in many communities in the United States. To effectively deal with these complex issues, cooperative approaches should be used to solve regional problems. This panel discussed the unique programs currently at work in Alameda and Contra Costa Counties in northern California. These programs were designed after the...
Designing and Validating Assessments of Complex Thinking in Science
ERIC Educational Resources Information Center
Ryoo, Kihyun; Linn, Marcia C.
2015-01-01
Typical assessment systems often measure isolated ideas rather than the coherent understanding valued in current science classrooms. Such assessments may motivate students to memorize, rather than to use new ideas to solve complex problems. To meet the requirements of the Next Generation Science Standards, instruction needs to emphasize sustained…
Capturing the complexity of first opinion small animal consultations using direct observation
Robinson, N. J.; Brennan, M. L.; Cobb, M.; Dean, R. S.
2015-01-01
Various methods are currently being used to capture data from small animal consultations. The aim of this study was to develop a tool to record detailed data from consultations by direct observation. A second aim was to investigate the complexity of the consultation by examining the number of problems discussed per patient. A data collection tool was developed and used during direct observation of small animal consultations in eight practices. Data were recorded on consultation type, patient signalment and number of problems discussed. During 16 weeks of data collection, 1901 patients were presented. Up to eight problems were discussed for some patients; more problems were discussed during preventive medicine consultations than during first consultations (P<0.001) or revisits (P<0.001). Fewer problems were discussed for rabbits than cats (P<0.001) or dogs (P<0.001). Age was positively correlated with discussion of specific health problems and negatively correlated with discussion of preventive medicine. Consultations are complex with multiple problems frequently discussed, suggesting comorbidity may be common. Future research utilising practice data should consider how much of this complexity needs to be captured, and use appropriate methods accordingly. The findings here have implications for directing research and education as well as application in veterinary practice. PMID:25262057
Jia, Xiuqin; Liang, Peipeng; Shi, Lin; Wang, Defeng; Li, Kuncheng
2015-01-01
In neuroimaging studies, increased task complexity can lead to increased activation in task-specific regions or to activation of additional regions. How the brain adapts to increased rule complexity during inductive reasoning remains unclear. In the current study, three types of problems were created: simple rule induction (i.e., SI, with rule complexity of 1), complex rule induction (i.e., CI, with rule complexity of 2), and perceptual control. Our findings revealed that increased activations accompany increased rule complexity in the right dorsal lateral prefrontal cortex (DLPFC) and medial posterior parietal cortex (precuneus). A cognitive model predicted both the behavioral and brain imaging results. The current findings suggest that neural activity in frontal and parietal regions is modulated by rule complexity, which may shed light on the neural mechanisms of inductive reasoning. Copyright © 2014. Published by Elsevier Ltd.
The Complexity of Bit Retrieval
Elser, Veit
2018-09-20
Bit retrieval is the problem of reconstructing a periodic binary sequence from its periodic autocorrelation, with applications in cryptography and x-ray crystallography. After defining the problem, with and without noise, we describe and compare various algorithms for solving it. A geometrical constraint satisfaction algorithm, relaxed-reflect-reflect, is currently the best algorithm for noisy bit retrieval.
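The forward map that bit retrieval inverts is easy to state. Below is a minimal sketch of it for ±1 sequences (a common convention for this problem); the relaxed-reflect-reflect solver itself is not reproduced here.

```python
import numpy as np

def periodic_autocorrelation(s):
    """Periodic autocorrelation a[k] = sum_i s[i] * s[(i + k) mod n]
    of a +/-1 sequence s of length n. Bit retrieval asks to invert this map:
    recover s given a."""
    s = np.asarray(s)
    n = len(s)
    return np.array([int(np.dot(s, np.roll(s, -k))) for k in range(n)])

# The autocorrelation is invariant under cyclic shifts (and reversal) of s,
# so any solver can only recover s up to those symmetries.
seq = np.array([1, -1, 1, 1, -1, -1, -1, 1])
assert np.array_equal(periodic_autocorrelation(seq),
                      periodic_autocorrelation(np.roll(seq, 3)))
```

The shift/reversal invariance shown by the assertion is what makes the inverse problem ambiguous by construction, independent of any noise.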
Bully/Victim Problems among Preschool Children: A Review of Current Research Evidence
ERIC Educational Resources Information Center
Vlachou, Maria; Andreou, Eleni; Botsoglou, Kafenia; Didaskalou, Eleni
2011-01-01
Bullying in schools has been identified as a serious and complex worldwide problem associated with negative short- and long-term effects on children's psychosocial adjustment (Smith 1999; Ttofi and Farrington, "Aggressive Behav" 34(4):352-368, 2008). Entering kindergarten is a crucial developmental step in many children's lives mainly because it…
Diagnosing forest vegetation for air pollution injury
Keith F. Jensen
1989-01-01
The purpose of this Note is to help you become more technically informed about air pollution when serious problems need to be diagnosed by pollution specialists. (Except for ozone, most of the information discussed does not attempt to describe possible air pollution damage caused by long distance transport. This complex problem is currently under intense study.)
Efficient Network Coding-Based Loss Recovery for Reliable Multicast in Wireless Networks
NASA Astrophysics Data System (ADS)
Chi, Kaikai; Jiang, Xiaohong; Ye, Baoliu; Horiguchi, Susumu
Recently, network coding has been applied to the loss recovery of reliable multicast in wireless networks [19], where multiple lost packets are XOR-ed together as one packet and forwarded via single retransmission, resulting in a significant reduction of bandwidth consumption. In this paper, we first prove that maximizing the number of lost packets for XOR-ing, which is the key part of the available network coding-based reliable multicast schemes, is actually a complex NP-complete problem. To address this limitation, we then propose an efficient heuristic algorithm for finding an approximately optimal solution of this optimization problem. Furthermore, we show that the packet coding principle of maximizing the number of lost packets for XOR-ing sometimes cannot fully exploit the potential coding opportunities, and we then further propose new heuristic-based schemes with a new coding principle. Simulation results demonstrate that the heuristic-based schemes have very low computational complexity and can achieve almost the same transmission efficiency as the current coding-based high-complexity schemes. Furthermore, the heuristic-based schemes with the new coding principle not only have very low complexity, but also slightly outperform the current high-complexity ones.
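The decodability constraint behind XOR retransmission (a receiver can recover a lost packet from an XOR of several packets only if it already holds all the others) can be illustrated with a first-fit grouping heuristic. This is a simplified stand-in for, not a reproduction of, the authors' algorithm, and all names are illustrative.

```python
def greedy_xor_groups(loss_map):
    """First-fit heuristic for grouping lost packets into XOR retransmissions.

    loss_map: dict mapping receiver id -> set of lost packet ids.
    A valid XOR group may contain at most one lost packet per receiver,
    otherwise that receiver cannot decode. Maximizing group sizes is
    NP-complete, so this sketch just places each lost packet into the first
    group that stays valid, opening a new group when none does.
    """
    lost = sorted({p for pkts in loss_map.values() for p in pkts})
    groups = []
    for p in lost:
        placed = False
        for g in groups:
            # p may join g only if no receiver lost both p and a member of g
            if all(len(pkts & (g | {p})) <= 1 for pkts in loss_map.values()):
                g.add(p)
                placed = True
                break
        if not placed:
            groups.append({p})
    return groups
```

For example, if receiver r1 lost packets 1 and 2 while r2 lost packet 3, packets 1 and 3 can share one XOR retransmission but 2 needs its own, giving two transmissions instead of three.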
Donnelly, Lane F; Basta, Kathryne C; Dykes, Anne M; Zhang, Wei; Shook, Joan E
2018-01-01
At a pediatric health system, the Daily Operational Brief (DOB) was updated in 2015 after three years of operation. Quality and safety metrics, the patient volume and staffing assessment, and the readiness assessment are all presented. In addition, in the problem-solving accountability system, problematic issues are categorized as Quick Hits or Complex Issues. Walk-the-Wall, a biweekly meeting attended by hospital senior administrative leadership and quality and safety leaders, is conducted to chart current progress on Complex Issues. The DOB provides a daily standardized approach to evaluate readiness to provide care to current patients and improvement in the care to be provided for future patients. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
Modeling Security Bridge Certificate Authority Architecture
NASA Astrophysics Data System (ADS)
Ren, Yizhi; Li, Mingchu; Sakurai, Kouichi
Current Public Key Infrastructures suffer from a scaling problem, and some may have security problems, even given the topological simplification of bridge certification authorities. This paper analyzes the security problems in the Bridge Certificate Authority (BCA) model using the concept of “impersonation risk,” and proposes a new modified BCA model, which enhances security but is a bit more complex in certification path building and implementation than the existing one.
NASA Astrophysics Data System (ADS)
Steen-Eibensteiner, Janice Lee
2006-07-01
A strong science knowledge base and problem solving skills have always been highly valued for employment in the science industry. Skills currently needed for employment include being able to problem solve (Overtoom, 2000). Academia also recognizes the need for effectively teaching students to apply problem solving skills in clinical settings. This thesis investigates how students solve complex science problems in an academic setting in order to inform the development of problem solving skills for the workplace. Students' use of problem solving skills in the form of learned concepts and procedural knowledge was studied as students completed a problem that might come up in real life. Students were taking a community college sophomore biology course, Human Anatomy & Physiology II. The problem topic was negative feedback inhibition of the thyroid and parathyroid glands. The research questions answered were (1) How well do community college students use a complex of conceptual knowledge when solving a complex science problem? (2) What conceptual knowledge are community college students using correctly, incorrectly, or not using when solving a complex science problem? (3) What problem solving procedural knowledge are community college students using successfully, unsuccessfully, or not using when solving a complex science problem? From the whole class, the high academic level participants performed at a mean of 72% correct on the chapter test questions, a low-average to fair grade of C-. The middle and low academic participants both failed (F) the test questions (37% and 30% respectively); 29% (9/31) of the students showed only a fair performance while 71% (22/31) failed. From the subset sample population of 2 students each from the high, middle, and low academic levels selected from the whole class, 35% (8/23) of the concepts were used effectively, 22% (5/23) marginally, and 43% (10/23) poorly.
Only one concept was used incorrectly by 3/6 of the students and identified as a misconception. One of 21 (5%) problem-solving pathway characteristics was used effectively, 7 (33%) marginally, and 13 (62%) poorly. Very few (0 to 4) problem-solving pathway characteristics were used unsuccessfully; most were simply not used.
Computation and visualization of geometric partial differential equations
NASA Astrophysics Data System (ADS)
Tiee, Christopher L.
The chief goal of this work is to explore a modern framework for the study and approximation of partial differential equations, recast common partial differential equations into this framework, and prove theorems about such equations and their approximations. A central motivation is to recognize and respect the essential geometric nature of such problems, and to take it into consideration when approximating. The hope is that this process will lead to the discovery of more refined algorithms and processes, which can be applied to new problems. In the first part, we introduce our quantities of interest and reformulate traditional boundary value problems in the modern framework. We see how Hilbert complexes capture and abstract the most important properties of such boundary value problems, leading to generalizations of important classical results such as the Hodge decomposition theorem. They also provide the proper setting for numerical approximations. We also provide an abstract framework for evolution problems in these spaces: Bochner spaces. We next turn to approximation. We build layers of abstraction, progressing from functions, to differential forms, and finally, to Hilbert complexes. We explore finite element exterior calculus (FEEC), which allows us to approximate solutions involving differential forms, and analyze the approximation error. In the second part, we prove our central results. We first prove an extension of current error estimates for the elliptic problem in Hilbert complexes. This extension handles solutions with nonzero harmonic part. Next, we consider evolution problems in Hilbert complexes and prove abstract error estimates. We apply these estimates to the problem for Riemannian hypersurfaces in R^{n+1}, generalizing current results for open subsets of R^n. Finally, we apply some of the concepts to a nonlinear problem, the Ricci flow on surfaces, and use tools from nonlinear analysis to help develop and analyze the equations.
In the appendices, we detail some additional motivation and a source for further examples: canonical geometries that are realized as steady-state solutions to parabolic equations similar to that of Ricci flow. An eventual goal is to compute such solutions using the methods of the previous chapters.
Advances in the Theory of Complex Networks
NASA Astrophysics Data System (ADS)
Peruani, Fernando
An exhaustive and comprehensive review of the theory of complex networks would nowadays be a titanic task, and it would result in a lengthy work containing plenty of technical details of arguable relevance. Instead, this chapter addresses very briefly the ABC of complex network theory, visiting only the hallmarks of its theoretical foundations, before focusing on two of the most interesting and promising current research problems: the study of dynamical processes on transportation networks and the identification of communities in complex networks.
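The community-identification problem mentioned above can be illustrated with a minimal, self-contained sketch. Label propagation is one simple community-detection heuristic, chosen here purely for illustration (the chapter does not prescribe a specific algorithm); the graph, labels, and iteration limit below are all invented for the example.

```python
def label_propagation(adj, n_iter=10):
    """Detect communities by asynchronous label propagation: each node
    repeatedly adopts the most common label among its neighbours, so
    densely connected groups converge to a shared label."""
    labels = {v: v for v in adj}          # start: every node is its own label
    for _ in range(n_iter):
        changed = False
        for v in adj:
            counts = {}
            for u in adj[v]:
                counts[labels[u]] = counts.get(labels[u], 0) + 1
            new = max(counts, key=counts.get)   # ties: first label seen wins
            if new != labels[v]:
                labels[v], changed = new, True
        if not changed:
            break
    return labels

# toy network: two 4-cliques joined by a single bridge edge (3, 4)
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7), (3, 4)]
adj = {v: [] for v in range(8)}
for a, b in edges:
    adj[a].append(b)
    adj[b].append(a)

labels = label_propagation(adj)
communities = {}
for v, lab in labels.items():
    communities.setdefault(lab, set()).add(v)
print(sorted(sorted(c) for c in communities.values()))  # [[0, 1, 2, 3], [4, 5, 6, 7]]
```

Note that label propagation is a heuristic: on less clear-cut graphs it can merge or split groups depending on update order.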
The Future of Teaching Research in the Social Sciences
ERIC Educational Resources Information Center
Wagner, C.
2009-01-01
Current literature on teaching research methodology in the social sciences highlights the changing nature of our world in terms of its complexity and diversity, and points to how this affects the way in which we search for answers to related problems (Brew 2003, 3; Tashakkori and Teddlie 2003, 74). New ways of approaching research problems that…
ERIC Educational Resources Information Center
Tickles, Virginia C.; Li, Yadong; Walters, Wilbur L.
2013-01-01
Much criticism exists concerning a lack of focus on real-world problem-solving in the science, technology, engineering and mathematics (STEM) infrastructures. Many of these critics say that current educational infrastructures are incapable of preparing future scientists and engineers to solve the complex and multidisciplinary problems this society…
ERIC Educational Resources Information Center
Ford, Julian D.; Grasso, Damion J.; Levine, Joan; Tennen, Howard
2018-01-01
This pilot randomized clinical trial tested an emotion regulation enhancement to cognitive behavior therapy (CBT) with 29 college student problem drinkers with histories of complex trauma and current clinically significant traumatic stress symptoms. Participants received eight face-to-face sessions of manualized Internet-supported CBT for problem…
NASA Astrophysics Data System (ADS)
Goma, Sergio R.
2015-03-01
Mobile technologies are now ubiquitous, and the complexity of problems is continuously increasing. In the context of the advancement of engineering, this paper explores possible reasons for a saturation in technology evolution - namely, the ability to solve problems based on previous results and the ability to express solutions in a more efficient way - concluding that `thinking outside of the brain' - as in solving engineering problems that are expressed in a virtual medium due to their complexity - would benefit from mobile technology augmentation. This could be the necessary evolutionary step that provides the efficiency required to solve new complex problems (addressing the `running out of time' issue) and removes the barrier to communicating results (addressing the human `perception/expression imbalance' issue). Some consequences are discussed; in this context, artificial intelligence becomes an automation aid rather than a necessary next evolutionary step. The paper concludes that research in modeling as a problem-solving aid and in data visualization as a perception aid, both augmented with mobile technologies, could be the path to an evolutionary step in advancing engineering.
Ordinal optimization and its application to complex deterministic problems
NASA Astrophysics Data System (ADS)
Yang, Mike Shang-Yu
1998-10-01
We present in this thesis a new perspective to approach a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve high levels of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
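The selection idea behind Goal Softening and Ordinal Comparison can be sketched numerically. The sketch below is not the thesis's turbine-blade study; it uses a synthetic performance landscape and an arbitrary noise level to show that the observed top-s set reliably intersects the true top-g set even under heavy evaluation noise.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 1000                                          # candidate designs
true_perf = np.sort(rng.uniform(0, 1, N))[::-1]   # hidden true performance, descending
observed = true_perf + rng.normal(0, 0.3, N)      # crude-model evaluation, heavy noise

g, s = 50, 50                       # "good enough" set size and selected set size
true_good = set(range(g))           # by construction, indices 0..g-1 are the true top-g
selected = set(np.argsort(observed)[::-1][:s])    # observed top-s

overlap = len(true_good & selected)
print(f"{overlap} of the {g} truly good designs captured despite the noise")
```

The point of goal softening is visible here: picking out the single true best design from these noisy evaluations is nearly hopeless, but a selected set of modest size almost surely contains several genuinely good designs.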
Discrete square root filtering - A survey of current techniques.
NASA Technical Reports Server (NTRS)
Kaminski, P. G.; Bryson, A. E., Jr.; Schmidt, S. F.
1971-01-01
Current techniques in square root filtering are surveyed and related by applying a duality association. Four efficient square root implementations are suggested, and compared with three common conventional implementations in terms of computational complexity and precision. It is shown that the square root computational burden should not exceed the conventional by more than 50% in most practical problems. An examination of numerical conditioning predicts that the square root approach can yield twice the effective precision of the conventional filter in ill-conditioned problems. This prediction is verified in two examples.
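The numerical advantage surveyed above comes from propagating a Cholesky-style factor S of the covariance (P = S Sᵀ) rather than P itself. The sketch below shows only the time-update step via a QR factorization on an ill-conditioned toy covariance; the matrices and noise levels are invented for illustration, and this is not the full filter from the survey.

```python
import numpy as np

def srcf_time_update(S, F, Q_sqrt):
    """Square-root covariance time update.

    Instead of forming P_pred = F P F^T + Q directly, propagate a factor S
    (with P = S S^T) via a QR factorization of the stacked matrix
    [S^T F^T; Q_sqrt^T].  The triangular factor R satisfies
    R^T R = F P F^T + Q, so R^T is the predicted square-root factor.
    """
    M = np.vstack([S.T @ F.T, Q_sqrt.T])
    _, R = np.linalg.qr(M)
    return R.T

F = np.array([[1.0, 0.1], [0.0, 1.0]])
S = np.linalg.cholesky(np.diag([1e-8, 1.0]))     # ill-conditioned covariance factor
Q_sqrt = np.linalg.cholesky(1e-6 * np.eye(2))

S_pred = srcf_time_update(S, F, Q_sqrt)
P_sqrt = S_pred @ S_pred.T                       # covariance implied by the factor
P_conv = F @ (S @ S.T) @ F.T + 1e-6 * np.eye(2)  # conventional Riccati update
print(np.max(np.abs(P_sqrt - P_conv)))
```

The two updates agree to machine precision here; the precision claim in the abstract concerns ill-conditioned cases where squaring the condition number (by forming P explicitly) loses digits that the factored form retains.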
Unusual Voltage-Gated Sodium Currents as Targets for Pain.
Barbosa, C; Cummins, T R
2016-01-01
Pain is a serious health problem that impacts the lives of many individuals. Hyperexcitability of peripheral sensory neurons contributes to both acute and chronic pain syndromes. Because voltage-gated sodium currents are crucial to the transmission of electrical signals in peripheral sensory neurons, the channels that underlie these currents are attractive targets for pain therapeutics. Sodium currents and channels in peripheral sensory neurons are complex. Multiple channel isoforms contribute to the macroscopic currents in nociceptive sensory neurons. These different isoforms exhibit substantial variations in their kinetics and pharmacology. Furthermore, sodium current complexity is enhanced by an array of interacting proteins that can substantially modify the properties of voltage-gated sodium channels. Resurgent sodium currents, atypical currents that can enhance recovery from inactivation and neuronal firing, are increasingly being recognized as playing potentially important roles in sensory neuron hyperexcitability and pain sensations. Here we discuss unusual sodium channels and currents that have been identified in nociceptive sensory neurons and describe what is known about the molecular determinants of the complex sodium currents in these neurons. Finally, we provide an overview of therapeutic strategies to target voltage-gated sodium currents in nociceptive neurons.
The Association between Motivation, Affect, and Self-regulated Learning When Solving Problems.
Baars, Martine; Wijnia, Lisette; Paas, Fred
2017-01-01
Self-regulated learning (SRL) skills are essential for learning during school years, particularly in complex problem-solving domains, such as biology and math. Although many studies have focused on the cognitive resources that are needed for learning to solve problems in a self-regulated way, affective and motivational resources have received much less research attention. The current study investigated the relation between affect (i.e., Positive Affect and Negative Affect Scale), motivation (i.e., autonomous and controlled motivation), mental effort, SRL skills, and problem-solving performance when learning to solve biology problems in a self-regulated online learning environment. In the learning phase, secondary education students studied video-modeling examples of how to solve hereditary problems and solved hereditary problems that they chose themselves from a set of problems with five complexity levels. In the posttest, students solved hereditary problems, self-assessed their performance, and chose a next problem from the set but did not solve it. The results from this study showed that negative affect, inaccurate self-assessments during the posttest, and higher perceptions of mental effort during the posttest were negatively associated with problem-solving performance after learning in a self-regulated way.
Behind the Veil of Conduct Disorder: Challenging Current Assumptions in Search of Strengths
ERIC Educational Resources Information Center
Foltz, Robert
2008-01-01
Advances in neuroscience are providing fresh insights about emotional and behavioral problems of children and youth. However, the flood of brain-related articles is a mixed blessing. Some popular authors on the brain, as well as certain researchers, take a very narrow view of attributing complex social problems to brain disorder. The effect is to…
Theoretical Foundations of Software Technology.
1983-02-14
major research interests are software testing, artificial intelligence, pattern recognition, and computer graphics. Dr. Chandrasekaran is currently…produce PASCAL language code for the problems. Because of its relationship to many issues in Artificial Intelligence, we also investigated problems of…analysis to concurrent-process software re-… are not "intelligent" enough to discover these by themselves, …more complex control flow models. The PAF
An Overview of Computational Aeroacoustic Modeling at NASA Langley
NASA Technical Reports Server (NTRS)
Lockard, David P.
2001-01-01
The use of computational techniques in the area of acoustics is known as computational aeroacoustics and has shown great promise in recent years. Although an ultimate goal is to use computational simulations as a virtual wind tunnel, the problem is so complex that blind applications of traditional algorithms are typically unable to produce acceptable results. The phenomena of interest are inherently unsteady and cover a wide range of frequencies and amplitudes. Nonetheless, with appropriate simplifications and special care to resolve specific phenomena, currently available methods can be used to solve important acoustic problems. These simulations can be used to complement experiments, and often give much more detailed information than can be obtained in a wind tunnel. The use of acoustic analogy methods to inexpensively determine far-field acoustics from near-field unsteadiness has greatly reduced the computational requirements. A few examples of current applications of computational aeroacoustics at NASA Langley are given. There remains a large class of problems that require more accurate and efficient methods. Research to develop more advanced methods that can handle the geometric complexity of realistic problems using block-structured and unstructured grids is highlighted.
2017-01-01
This work focuses on the design of transmitting coils in weakly coupled magnetic induction communication systems. We propose several optimization methods that reduce the active, reactive and apparent power consumption of the coil. These problems are formulated as minimization problems, in which the power consumed by the transmitting coil is minimized, under the constraint of providing a required magnetic field at the receiver location. We develop efficient numeric and analytic methods to solve the resulting problems, which are of high dimension, and in certain cases non-convex. For the objective of minimal reactive power, an analytic solution for the optimal current distribution in flat disc transmitting coils is provided. This problem is extended to general three-dimensional coils, for which we develop an expression for the optimal current distribution. Considering the objective of minimal apparent power, a method is developed to reduce the computational complexity of the problem by transforming it to an equivalent problem of lower dimension, allowing a quick and accurate numeric solution. These results are verified experimentally by testing a number of coil geometries. The results obtained allow reduced power consumption and increased performance in magnetic induction communication systems. Specifically, for wideband systems, an optimal design of the transmitter coil reduces the peak instantaneous power provided by the transmitter circuitry, and thus reduces its size, complexity and cost. PMID:28192463
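The kind of constrained minimization described above - minimal power subject to a required field at the receiver - has a closed form in its simplest (quadratic objective, linear constraint) version. The sketch below is a toy stand-in: the resistance matrix, coupling matrix A, and field targets b are random placeholders, not the coil models from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy discretization: 20 current loops; a diagonal resistance matrix R;
# A maps loop currents to the field at 3 receiver constraint points (assumed)
n_loops, n_constraints = 20, 3
R = np.diag(rng.uniform(0.5, 2.0, n_loops))      # loop resistances (placeholder)
A = rng.normal(size=(n_constraints, n_loops))    # field coupling matrix (placeholder)
b = np.array([1.0, 0.5, 0.2])                    # required field values (placeholder)

# minimize active power I^T R I subject to A I = b.
# The Lagrange conditions give  I* = R^{-1} A^T (A R^{-1} A^T)^{-1} b.
Rinv_At = np.linalg.solve(R, A.T)
I_opt = Rinv_At @ np.linalg.solve(A @ Rinv_At, b)

print("constraint residual:", np.linalg.norm(A @ I_opt - b))
print("active power:", I_opt @ R @ I_opt)
```

The paper's actual problems (reactive and apparent power, non-convex cases, three-dimensional coils) are harder than this quadratic toy, but the structure - power objective under a field equality constraint - is the same.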
Problem decomposition by mutual information and force-based clustering
NASA Astrophysics Data System (ADS)
Otero, Richard Edward
The scale of engineering problems has sharply increased over the last twenty years. Larger coupled systems, increasing complexity, and limited resources create a need for methods that automatically decompose problems into manageable sub-problems by discovering and leveraging problem structure. The ability to learn the coupling (inter-dependence) structure and reorganize the original problem could lead to large reductions in the time to analyze complex problems. Such decomposition methods could also provide engineering insight on the fundamental physics driving problem solution. This work advances the current state of the art in engineering decomposition through the application of techniques originally developed within computer science and information theory. The work describes the current state of automatic problem decomposition in engineering and utilizes several promising ideas to advance the state of the practice. Mutual information is a novel metric for data dependence and works on both continuous and discrete data. Mutual information can measure both the linear and non-linear dependence between variables without the limitations of linear dependence measured through covariance. Mutual information is also able to handle data that does not have derivative information, unlike other metrics that require it. The value of mutual information to engineering design work is demonstrated on a planetary entry problem. This study utilizes a novel tool developed in this work for planetary entry system synthesis. A graphical method, force-based clustering, is used to discover related sub-graph structure as a function of problem structure and links ranked by their mutual information. This method does not require the stochastic use of neural networks and could be used with any link ranking method currently utilized in the field. Application of this method is demonstrated on a large, coupled low-thrust trajectory problem.
Mutual information also serves as the basis for an alternative global optimizer, called MIMIC, which is unrelated to Genetic Algorithms. Advancing current practice, this work demonstrates the use of MIMIC as a global method that explicitly models problem structure with mutual information, providing an alternate method for globally searching multi-modal domains. By leveraging discovered problem inter-dependencies, MIMIC may be appropriate for highly coupled problems or those with large function evaluation cost. This work introduces a useful addition to the MIMIC algorithm that enables its use on continuous input variables. By leveraging automatic decision tree generation methods from Machine Learning and a set of randomly generated test problems, decision trees for choosing which method to apply are also created, quantifying decomposition performance over a large region of the design space.
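The advantage of mutual information over covariance claimed above is easy to demonstrate with a small histogram-based estimator. This is a generic sketch (bin count and test variables chosen arbitrarily), not the tool developed in the thesis.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)          # marginal of Y, shape (1, bins)
    nz = pxy > 0                                 # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50_000)
y = x**2 + 0.05 * rng.normal(size=x.size)        # nonlinear coupling, near-zero covariance
z = rng.uniform(-1, 1, x.size)                   # independent variable

print("corr(x, y):", np.corrcoef(x, y)[0, 1])    # near zero despite strong dependence
print("MI(x, y):  ", mutual_information(x, y))   # clearly positive
print("MI(x, z):  ", mutual_information(x, z))   # near zero
```

Because y is a symmetric function of x, the linear correlation vanishes while the mutual information does not - exactly the failure mode of covariance-based coupling detection the text describes.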
A new approach to identify, classify and count drug-related events
Bürkle, Thomas; Müller, Fabian; Patapovas, Andrius; Sonst, Anja; Pfistermeister, Barbara; Plank-Kiegele, Bettina; Dormann, Harald; Maas, Renke
2013-01-01
Aims: The incidence of clinical events related to medication errors and/or adverse drug reactions reported in the literature varies by a degree that cannot solely be explained by the clinical setting, the varying scrutiny of investigators or varying definitions of drug-related events. Our hypothesis was that the individual complexity of many clinical cases may pose relevant limitations for current definitions and algorithms used to identify, classify and count adverse drug-related events. Methods: Based on clinical cases derived from an observational study, we identified and classified common clinical problems that cannot be adequately characterized by the currently used definitions and algorithms. Results: It appears that some key models currently used to describe the relation of medication errors (MEs), adverse drug reactions (ADRs) and adverse drug events (ADEs) can easily be misinterpreted or contain logical inconsistencies that limit their accurate use to all but the simplest clinical cases. A key limitation of current models is the inability to deal with complex interactions such as one drug causing two clinically distinct side effects or multiple drugs contributing to a single clinical event. Using a large set of clinical cases, we developed a revised model of the interdependence between MEs, ADEs and ADRs and extended current event definitions for cases in which multiple medications cause multiple types of problems. We propose algorithms that may help to improve the identification, classification and counting of drug-related events. Conclusions: The new model may help to overcome some of the limitations that complex clinical cases pose to current paper- or software-based drug therapy safety. PMID:24007453
Atmospheric processes over complex terrain
NASA Astrophysics Data System (ADS)
Banta, Robert M.; Berri, G.; Blumen, William; Carruthers, David J.; Dalu, G. A.; Durran, Dale R.; Egger, Joseph; Garratt, J. R.; Hanna, Steven R.; Hunt, J. C. R.
1990-06-01
A workshop on atmospheric processes over complex terrain, sponsored by the American Meteorological Society, was convened in Park City, Utah from 24 to 28 October 1988. The overall objective of the workshop was one of interaction and synthesis--interaction among atmospheric scientists carrying out research on a variety of orographic flow problems, and a synthesis of their results and points of view into an assessment of the current status of topical research problems. The final day of the workshop was devoted to an open discussion on the research directions that could be anticipated in the next decade because of new and planned instrumentation and observational networks, the recent emphasis on development of mesoscale numerical models, and continual theoretical investigations of thermally forced flows, orographic waves, and stratified turbulence. This monograph represents an outgrowth of the Park City Workshop. The authors have contributed chapters based on their lecture material. Workshop discussions indicated interest in both the remote sensing and predictability of orographic flows. These chapters were solicited following the workshop in order to provide a more balanced view of current progress and future directions in research on atmospheric processes over complex terrain.
Command and Control in a Complex World
2012-05-22
definition of command and control does not adequately address changes introduced through technology trends, our understanding of the global operating...processes. ...the problem is actually solved. There are no definitive, objective solutions to wicked problems. For a complete definition of wicked problems, see
Creativity: Creativity in Complex Military Systems
2017-05-25
generation later in the problem-solving process. The design process is an alternative problem-solving framework individuals or groups use to orient...the potential of their formations. SUBJECT TERMS: Creativity, Divergent Thinking, Design, Systems Thinking, Operational Art
Storage and treatment of SNF of Alfa class nuclear submarines: current status and problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ignatiev, Sviatoslav; Zabudko, Alexey; Pankratov, Dmitry
Available in abstract form only. Full text of publication follows: The current status and main problems associated with storage, defueling and subsequent treatment of spent nuclear fuel (SNF) of Nuclear Submarines (NS) with heavy liquid metal cooled reactors are considered. In the final analysis these solutions could be realized in the form of separate projects to be funded through national, bilateral and multilateral funding in the framework of the international collaboration of the Russian Federation on complex utilization of NS and rehabilitation of contaminated objects located in the North-West region of Russia. (authors)
Computational problems and signal processing in SETI
NASA Technical Reports Server (NTRS)
Deans, Stanley R.; Cullers, D. K.; Stauduhar, Richard
1991-01-01
The Search for Extraterrestrial Intelligence (SETI), currently being planned at NASA, will require that an enormous amount of data (on the order of 10^11 distinct signal paths for a typical observation) be analyzed in real time by special-purpose hardware. Even though the SETI system design is not based on maximum entropy and Bayesian methods (partly due to the real-time processing constraint), it is expected that enough data will be saved to be able to apply these and other methods off line where computational complexity is not an overriding issue. Interesting computational problems that relate directly to the system design for processing such an enormous amount of data have emerged. Some of these problems are discussed, along with the current status on their solution.
On Chaotic and Hyperchaotic Complex Nonlinear Dynamical Systems
NASA Astrophysics Data System (ADS)
Mahmoud, Gamal M.
Dynamical systems described by real and complex variables are currently one of the most popular areas of scientific research. These systems play an important role in several fields of physics, engineering, and computer sciences, for example, laser systems, control (or chaos suppression), secure communications, and information science. Dynamical basic properties, chaos (hyperchaos) synchronization, chaos control, and generating hyperchaotic behavior of these systems are briefly summarized. The main advantage of introducing complex variables is the reduction of phase space dimensions by a half. They are also used to describe and simulate the physics of detuned lasers and thermal convection of liquid flows, where the electric field and the atomic polarization amplitudes are both complex. Clearly, if the variables of the system are complex, the equations involve twice as many variables and control parameters, thus making it that much harder for a hostile agent to intercept and decipher the coded message. Chaotic and hyperchaotic complex systems are stated as examples. Finally, there are many open problems in the study of chaotic and hyperchaotic complex nonlinear dynamical systems, which need further investigation. Some of these open problems are given.
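The dimension-halving point above - one complex state variable standing in for two real ones - can be checked directly, since a standard RK4 integrator works unchanged on complex states. The damped-oscillator parameters below are arbitrary, and this is only a demonstration of the complex-variable bookkeeping, not one of the chaotic systems discussed in the chapter.

```python
import cmath

def rk4(f, y0, t0, t1, n):
    """Classical 4th-order Runge-Kutta; works unchanged for complex states."""
    h = (t1 - t0) / n
    y, t = y0, t0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h/2, y + h/2 * k1)
        k3 = f(t + h/2, y + h/2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h/6 * (k1 + 2*k2 + 2*k3 + k4)
        t += h
    return y

# One complex equation  dz/dt = (i*omega - mu) z  replaces the
# two-dimensional real system  x' = -omega*y - mu*x,  y' = omega*x - mu*y.
omega, mu = 2.0, 0.1                      # illustrative parameters
f = lambda t, z: (1j*omega - mu) * z

z_num = rk4(f, 1.0 + 0.0j, 0.0, 5.0, 2000)
z_exact = cmath.exp((1j*omega - mu) * 5.0)
print(abs(z_num - z_exact))               # RK4 error vs. the exact solution
```

The single complex trajectory z(t) carries exactly the information of the real pair (x, y) = (Re z, Im z), which is the phase-space reduction the abstract refers to.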
High-frequency CAD-based scattering model: SERMAT
NASA Astrophysics Data System (ADS)
Goupil, D.; Boutillier, M.
1991-09-01
Specifications for an industrial radar cross section (RCS) calculation code are given: it must be able to exchange data with many computer aided design (CAD) systems, it must be fast, and it must have powerful graphic tools. Classical physical optics (PO) and equivalent currents (EC) techniques have long proven their efficiency on simple objects. Difficult geometric problems occur when objects with very complex shapes have to be computed. Only a specific geometric code can solve these problems. We have established that, once these problems have been solved: (1) PO and EC give good results on complex objects of large size compared to wavelength; and (2) the implementation of these objects in a software package (SERMAT) allows fast and sufficiently precise RCS calculations to meet industry requirements in the domain of stealth.
NASA Technical Reports Server (NTRS)
Richmond, J. H.
1974-01-01
Piecewise-sinusoidal expansion functions and Galerkin's method are employed to formulate a solution for an arbitrary thin-wire configuration in a homogeneous conducting medium. The analysis is performed in the real or complex frequency domain. In antenna problems, the solution determines the current distribution, impedance, radiation efficiency, gain and far-field patterns. In scattering problems, the solution determines the absorption cross section, scattering cross section and the polarization scattering matrix. The electromagnetic theory is presented for thin wires and the forward-scattering theorem is developed for an arbitrary target in a homogeneous conducting medium.
NASA Astrophysics Data System (ADS)
Steinberg, Marc
2011-06-01
This paper presents a selective survey of theoretical and experimental progress in the development of biologically inspired approaches for complex surveillance and reconnaissance problems with multiple, heterogeneous autonomous systems. The focus is on approaches that may address ISR problems that can quickly become mathematically intractable or otherwise impractical to implement using traditional optimization techniques as the size and complexity of the problem increase. These problems require dealing with complex spatiotemporal objectives and constraints at a variety of levels from motion planning to task allocation. There is also a need to ensure solutions are reliable and robust to uncertainty and communications limitations. First, the paper will provide a short introduction to the current state of relevant biological research as it relates to collective animal behavior. Second, the paper will describe research on largely decentralized, reactive, or swarm approaches that have been inspired by biological phenomena such as schools of fish, flocks of birds, ant colonies, and insect swarms. Next, the paper will discuss approaches towards more complex organizational and cooperative mechanisms in team and coalition behaviors in order to provide mission coverage of large, complex areas. Relevant team behavior may be derived from recent advances in the understanding of the social and cooperative behaviors used for collaboration by tens of animals with higher-level cognitive abilities such as mammals and birds. Finally, the paper will briefly discuss challenges involved in user interaction with these types of systems.
NASA Astrophysics Data System (ADS)
Alpers, Andreas; Gritzmann, Peter
2018-03-01
We consider the problem of reconstructing the paths of a set of points over time, where, at each of a finite set of moments in time the current positions of points in space are only accessible through some small number of their x-rays. This particular particle tracking problem, with applications, e.g. in plasma physics, is the basic problem in dynamic discrete tomography. We introduce and analyze various different algorithmic models. In particular, we determine the computational complexity of the problem (and various of its relatives) and derive algorithms that can be used in practice. As a byproduct we provide new results on constrained variants of min-cost flow and matching problems.
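At its smallest scale, the frame-to-frame step of such a particle tracking problem reduces to a min-cost matching. The brute-force sketch below (squared displacement as the cost, three hand-made points) is a stand-in for the min-cost flow and matching algorithms actually analyzed in the paper, and is only practical for a handful of particles.

```python
from itertools import permutations
import math

def match_frames(prev_pts, curr_pts):
    """Match points between consecutive frames by minimizing total squared
    displacement -- a tiny brute-force version of the min-cost matching
    step that appears in dynamic discrete tomography."""
    best, best_cost = None, math.inf
    for perm in permutations(range(len(curr_pts))):
        cost = sum((prev_pts[i][0] - curr_pts[j][0])**2 +
                   (prev_pts[i][1] - curr_pts[j][1])**2
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

frame0 = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
frame1 = [(1.1, 0.1), (0.1, 1.0), (0.1, -0.1)]   # same particles, shuffled and jittered

assignment, cost = match_frames(frame0, frame1)
print(assignment)   # particle i in frame0 -> index assignment[i] in frame1
```

The real setting is harder than this toy in two ways the abstract emphasizes: positions are only observed through a few X-ray projections rather than directly, and practical instances require polynomial algorithms (min-cost flow, matching) rather than enumeration over permutations.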
Design concepts for the development of cooperative problem-solving systems
NASA Technical Reports Server (NTRS)
Smith, Philip J.; Mccoy, Elaine; Layton, Chuck; Bihari, Tom
1992-01-01
There are many problem-solving tasks that are too complex to fully automate given the current state of technology. Nevertheless, significant improvements in overall system performance could result from the introduction of well-designed computer aids. We have been studying the development of cognitive tools for one such problem-solving task: enroute flight path planning for commercial airlines. Our goal was two-fold. First, we developed specific system designs to help with this important practical problem. Second, we used this context to explore general design concepts to guide the development of cooperative problem-solving systems. These design concepts are described.
Kramers-Kronig relations in Laser Intensity Modulation Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuncer, Enis
2006-01-01
In this short paper, the Kramers-Kronig relations for the Laser Intensity Modulation Method (LIMM) are presented to check the self-consistency of experimentally obtained complex current densities. The numerical procedure yields well defined, precise estimates for the real and the imaginary parts of the LIMM current density calculated from its imaginary and real parts, respectively. The procedure also determines an accurate high frequency real current value which appears to be an intrinsic material parameter similar to that of the dielectric permittivity at optical frequencies. Note that the problem considered here couples two different material properties, thermal and electrical; consequently the validity of the Kramers-Kronig relation indicates that the problem is invariant and linear.
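A self-consistency check of the kind described - recovering the real part of a response function from its imaginary part - can be sketched with a crude principal-value quadrature. The Debye relaxation model, grid, and tolerance below are illustrative choices, not the LIMM data treatment of the paper.

```python
import numpy as np

def kk_real_from_imag(omega_eval, x, chi_imag):
    """Kramers-Kronig: estimate chi'(w) from samples of chi''(x) via
    chi'(w) = (2/pi) P∫_0^∞ x*chi''(x) / (x^2 - w^2) dx.
    The grid here straddles each w symmetrically, so the singular
    contributions cancel in the principal-value sense."""
    out = []
    dx = x[1] - x[0]
    for w in omega_eval:
        integrand = x * chi_imag / (x**2 - w**2)
        integrand[np.abs(x - w) < dx/4] = 0.0   # guard only for off-grid w
        out.append(2/np.pi * np.sum(integrand) * dx)
    return np.array(out)

# Debye relaxation: chi(w) = 1/(1 + i*w), so chi' = 1/(1+w^2), chi'' = w/(1+w^2)
x = np.arange(0.005, 200.0, 0.01)
chi_imag = x / (1 + x**2)

w_eval = np.array([0.5, 1.0, 2.0])
chi_real_kk = kk_real_from_imag(w_eval, x, chi_imag)
chi_real_true = 1 / (1 + w_eval**2)
print(np.abs(chi_real_kk - chi_real_true))   # small; limited by grid and truncation
```

For a measured spectrum that is causal and linear, the reconstructed real part should agree with the measured one to within the quadrature error, which is the self-consistency test the abstract describes.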
NASA Technical Reports Server (NTRS)
Nosenchuck, D. M.; Littman, M. G.
1986-01-01
The Navier-Stokes computer (NSC) has been developed for solving problems in fluid mechanics involving complex flow simulations that require more speed and capacity than provided by current and proposed Class VI supercomputers. The machine is a parallel processing supercomputer with several new architectural elements which can be programmed to address a wide range of problems meeting the following criteria: (1) the problem is numerically intensive, and (2) the code makes use of long vectors. A simulation of two-dimensional nonsteady viscous flows is presented to illustrate the architecture, programming, and some of the capabilities of the NSC.
Combustion and fires in low gravity
NASA Technical Reports Server (NTRS)
Friedman, Robert
1994-01-01
Fire safety always receives priority attention in NASA mission designs and operations, with emphasis on fire prevention and material acceptance standards. Recently, interest in spacecraft fire-safety research and development has increased because improved understanding of the significant differences between low-gravity and normal-gravity combustion suggests that present fire-safety techniques may be inadequate or, at best, non-optimal; and the complex and permanent orbital operations in Space Station Freedom demand a higher level of safety standards and practices. This presentation outlines current practices and problems in fire prevention and detection for spacecraft, specifically the Space Station Freedom's fire protection. Also addressed are current practices and problems in fire extinguishment for spacecraft.
Redundant interferometric calibration as a complex optimization problem
NASA Astrophysics Data System (ADS)
Grobler, T. L.; Bernardi, G.; Kenyon, J. S.; Parsons, A. R.; Smirnov, O. M.
2018-05-01
Observations of the redshifted 21 cm line from the epoch of reionization have recently motivated the construction of low-frequency radio arrays with highly redundant configurations. These configurations provide an alternative calibration strategy - `redundant calibration' - and boost sensitivity on specific spatial scales. In this paper, we formulate calibration of redundant interferometric arrays as a complex optimization problem. We solve this optimization problem via the Levenberg-Marquardt algorithm. This calibration approach is more robust to initial conditions than current algorithms and, by leveraging an approximate matrix inversion, allows for further optimization and an efficient implementation (`redundant STEFCAL'). We also investigated using the preconditioned conjugate gradient method as an alternative to the approximate matrix inverse, but found that its computational performance is not competitive with respect to `redundant STEFCAL'. The efficient implementation of this new algorithm is made publicly available.
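A toy version of formulating redundant calibration as a complex optimization problem solved with Levenberg-Marquardt can be sketched as follows. This is an illustrative reconstruction, not the authors' `redundant STEFCAL` implementation: it models a hypothetical 4-element redundant east-west array, stacks real and imaginary residuals (Levenberg-Marquardt requires real-valued residuals), and pins antenna 0's gain to break the overall amplitude/phase degeneracy.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# 4-antenna east-west array: baselines with equal separation are redundant,
# so they share one true sky visibility y_k per separation k = 1, 2, 3.
baselines = [(i, j) for i in range(4) for j in range(4) if i < j]
group = {bl: bl[1] - bl[0] for bl in baselines}

# Simulated "true" complex gains and per-group sky visibilities.
g_true = 1.0 + 0.1 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
y_true = rng.standard_normal(3) + 1j * rng.standard_normal(3)
data = {bl: g_true[bl[0]] * np.conj(g_true[bl[1]]) * y_true[group[bl] - 1]
        for bl in baselines}

def unpack(x):
    # Antenna 0's gain is pinned to 1 to fix the amplitude/phase degeneracy.
    g = np.concatenate(([1.0 + 0j], x[0:3] + 1j * x[3:6]))
    y = x[6:9] + 1j * x[9:12]
    return g, y

def residuals(x):
    g, y = unpack(x)
    r = np.array([data[bl] - g[bl[0]] * np.conj(g[bl[1]]) * y[group[bl] - 1]
                  for bl in baselines])
    return np.concatenate([r.real, r.imag])  # LM needs real residuals

# Standard starting point: unit gains, per-group mean of measured visibilities.
y0 = np.array([np.mean([data[bl] for bl in baselines if group[bl] == k])
               for k in (1, 2, 3)])
x0 = np.concatenate([np.ones(3), np.zeros(3), y0.real, y0.imag])

sol = least_squares(residuals, x0, method="lm")
g_fit, y_fit = unpack(sol.x)
print("final cost:", sol.cost)
```

Remaining degeneracies (e.g. the phase tilt of a linear array) mean the fitted gains need not equal the true ones individually; what the solver guarantees is that gains and sky visibilities jointly reproduce every measured baseline.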
Mental health services conceptualised as complex adaptive systems: what can be learned?
Ellis, Louise A; Churruca, Kate; Braithwaite, Jeffrey
2017-01-01
Despite many attempts at promoting systems integration, seamless care, and partnerships among service providers and users, mental health services internationally continue to be fragmented and piecemeal. We exploit recent ideas from complexity science to conceptualise mental health services as complex adaptive systems (CASs). The core features of CASs are described and Australia's headspace initiative is used as an example of the kinds of problems currently being faced. We argue that adopting a CAS lens can transform services, creating more connected care for service users with mental health conditions.
DiBello, Angelo M.; Neighbors, Clayton; Rodriguez, Lindsey M.; Lindgren, Kristen
2013-01-01
Previous research has shown that both alcohol use and jealousy are related to negative relationship outcomes. Little work, however, has examined direct associations between alcohol use and jealousy. The current study aimed to build upon existing research examining alcohol use and jealousy. More specifically, findings from current jealousy literature indicate that jealousy is a multifaceted construct with both maladaptive and adaptive aspects. The current study examined the association between maladaptive and adaptive feelings of jealousy and alcohol-related problems in the context of drinking to cope. Given the relationship between coping motives and alcohol-related problems, our primary interest was in predicting alcohol-related problems, but alcohol consumption was also investigated. Undergraduate students at a large Northwestern university (N = 657) in the US participated in the study. They completed measures of jealousy, drinking to cope, alcohol use, and alcohol-related problems. Analyses examined associations between jealousy subscales, alcohol use, drinking to cope, and drinking problems. Results indicated that drinking to cope mediated the association between some, but not all, aspects of jealousy and problems with alcohol use. In particular, the more negative or maladaptive aspects of jealousy were related to drinking to cope and drinking problems, while the more adaptive aspects were not, suggesting a more complex view of jealousy than previously understood. PMID:24138965
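A mediation claim like "drinking to cope mediated the association" is typically tested with the product-of-coefficients method, which can be sketched with ordinary least squares. The data below are simulated (the study's real data are not available), and the variable names and effect sizes are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 657  # sample size matching the study; the data themselves are simulated

# Simulated chain: jealousy -> drinking to cope (path a) -> problems (path b),
# plus a small direct effect of jealousy on problems (path c').
jealousy = rng.standard_normal(n)
coping = 0.5 * jealousy + rng.standard_normal(n)
problems = 0.6 * coping + 0.1 * jealousy + rng.standard_normal(n)

def ols_slopes(y, predictors):
    """OLS slopes (intercept dropped) via the normal equations."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols_slopes(coping, [jealousy])[0]                  # jealousy -> coping
b, c_prime = ols_slopes(problems, [coping, jealousy])  # coping -> problems, direct
indirect = a * b   # mediated (indirect) effect, product-of-coefficients method
print(f"a={a:.2f}  b={b:.2f}  c'={c_prime:.2f}  indirect={indirect:.2f}")
```

A nonzero indirect effect alongside a small direct effect is the pattern reported for the maladaptive jealousy subscales; in practice the indirect effect's significance would be assessed with a bootstrap confidence interval.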
Wang, Shinmin; Gathercole, Susan E
2013-05-01
The current study investigated the cause of the reported problems in working memory in children with reading difficulties. Verbal and visuospatial simple and complex span tasks, and digit span and reaction time tasks performed singly and in combination, were administered to 46 children with single-word reading difficulties and 45 typically developing children matched for age and nonverbal ability. Children with reading difficulties had pervasive deficits in the simple and complex span tasks and a poorer ability to coordinate two cognitively demanding tasks. These findings indicate that working memory problems in children with reading difficulties may reflect a core deficit in the central executive. Copyright © 2012 Elsevier Inc. All rights reserved.
ARPA surveillance technology for detection of targets hidden in foliage
NASA Astrophysics Data System (ADS)
Hoff, Lawrence E.; Stotts, Larry B.
1994-02-01
The processing of large quantities of synthetic aperture radar data in real time is a complex problem. Even the image formation process taxes today's most advanced computers. The use of complex algorithms with multiple channels adds another dimension to the computational problem. The Advanced Research Projects Agency (ARPA) currently plans to use the Paragon parallel processor for this task. The Paragon is small enough to allow its use in a sensor aircraft. Candidate algorithms will be implemented on the Paragon and evaluated for real-time processing. In this paper ARPA technology developments for detecting targets hidden in foliage are reviewed, and examples of signal processing techniques applied to field-collected data are presented.
Artificial intelligence approaches to astronomical observation scheduling
NASA Technical Reports Server (NTRS)
Johnston, Mark D.; Miller, Glenn
1988-01-01
Automated scheduling will play an increasing role in future ground- and space-based observatory operations. Due to the complexity of the problem, artificial intelligence technology currently offers the greatest potential for the development of scheduling tools with sufficient power and flexibility to handle realistic scheduling situations. Summarized here are the main features of the observatory scheduling problem, how artificial intelligence (AI) techniques can be applied, and recent progress in AI scheduling for Hubble Space Telescope.
Planning Following Stroke: A Relational Complexity Approach Using the Tower of London
Andrews, Glenda; Halford, Graeme S.; Chappell, Mark; Maujean, Annick; Shum, David H. K.
2014-01-01
Planning on the 4-disk version of the Tower of London (TOL4) was examined in stroke patients and unimpaired controls. Overall TOL4 solution scores indicated impaired planning in the frontal stroke but not non-frontal stroke patients. Consistent with the claim that processing the relations between current states, intermediate states, and goal states is a key process in planning, the domain-general relational complexity metric was a good indicator of the experienced difficulty of TOL4 problems. The relational complexity metric shared variance with task-specific metrics of moves to solution and search depth. Frontal stroke patients showed impaired planning compared to controls on problems at all three complexity levels, but at only two of the three levels of moves to solution, search depth and goal ambiguity. Non-frontal stroke patients showed impaired planning only on the most difficult quaternary-relational and high search depth problems. An independent measure of relational processing (viz., Latin square task) predicted TOL4 solution scores after controlling for stroke status and location, and executive processing (Trail Making Test). The findings suggest that planning involves a domain-general capacity for relational processing that depends on the frontal brain regions. PMID:25566042
ERIC Educational Resources Information Center
Tomczyk, Lukasz; Szotkowski, Rene; Fabis, Artur; Wasinski, Arkadiusz; Chudý, Štefan; Neumeister, Pavel
2017-01-01
The paper presents the complex problems of preparation of pedagogy students to work as teachers in the context of their readiness to use ICT in the didactic process. The complexity of this subject matter has been proved by the current, ongoing, discussion about the direction of the expected transformations of contemporary schools and the…
McDermott, Jason E.; Wang, Jing; Mitchell, Hugh; Webb-Robertson, Bobbie-Jo; Hafen, Ryan; Ramey, John; Rodland, Karin D.
2012-01-01
Introduction: The advent of high throughput technologies capable of comprehensive analysis of genes, transcripts, proteins and other significant biological molecules has provided an unprecedented opportunity for the identification of molecular markers of disease processes. However, it has simultaneously complicated the problem of extracting meaningful molecular signatures of biological processes from these complex datasets. The process of biomarker discovery and characterization provides opportunities for more sophisticated approaches to integrating purely statistical and expert knowledge-based approaches. Areas covered: In this review we will present examples of current practices for biomarker discovery from complex omic datasets and the challenges that have been encountered in deriving valid and useful signatures of disease. We will then present a high-level review of data-driven (statistical) and knowledge-based methods applied to biomarker discovery, highlighting some current efforts to combine the two distinct approaches. Expert opinion: Effective, reproducible and objective tools for combining data-driven and knowledge-based approaches to identify predictive signatures of disease are key to future success in the biomarker field. We will describe our recommendations for possible approaches to this problem including metrics for the evaluation of biomarkers. PMID:23335946
NASA Technical Reports Server (NTRS)
Boyalakuntla, Kishore; Soni, Bharat K.; Thornburg, Hugh J.; Yu, Robert
1996-01-01
During the past decade, computational simulation of fluid flow around complex configurations has progressed significantly and many notable successes have been reported; however, unsteady, time-dependent solutions are not easily obtainable. The present effort involves unsteady, time-dependent simulation of temporally deforming geometries. Grid generation for a complex configuration can be a time-consuming process, and temporally varying geometries necessitate the regeneration of such grids at every time step. Traditional grid generation techniques have been tried and shown to be inadequate for such simulations. Non-Uniform Rational B-spline (NURBS) based techniques provide a compact and accurate representation of the geometry. This definition can be coupled with a distribution mesh for user-defined spacing. The present method greatly reduces CPU requirements for time-dependent remeshing, facilitating the simulation of more complex unsteady problems. A thrust-vectoring nozzle was chosen to demonstrate the capability, as it is of current interest in the aerospace industry for better maneuverability of fighter aircraft in close combat and in post-stall regimes. This effort is the first step toward multidisciplinary design optimization, which involves coupling aerodynamic, heat transfer, and structural analysis techniques. Applications include simulation of temporally deforming bodies and aeroelastic problems.
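The "compact and accurate representation" claim for NURBS can be illustrated with the simplest rational case. This sketch is generic, not the authors' code: a quadratic rational Bezier curve (a NURBS curve with no interior knots) reproduces a quarter circle exactly, which no polynomial spline can do.

```python
import numpy as np
from math import comb, sqrt

def rational_bezier(ctrl, weights, u):
    """Evaluate a rational Bezier curve C(u) = sum(B_i w_i P_i) / sum(B_i w_i),
    the single-segment special case of a NURBS curve, at u in [0, 1]."""
    n = len(ctrl) - 1
    # Bernstein basis functions B_i(u)
    basis = np.array([comb(n, i) * u**i * (1 - u) ** (n - i) for i in range(n + 1)])
    wb = basis * weights
    return (wb @ ctrl) / wb.sum()

# Exact quarter circle: three control points on the unit square's corner,
# with middle weight sqrt(2)/2. Only the *rational* form achieves this.
ctrl = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
weights = np.array([1.0, sqrt(2) / 2, 1.0])
for u in (0.0, 0.25, 0.5, 0.75, 1.0):
    x, y = rational_bezier(ctrl, weights, u)
    print(u, x, y, np.hypot(x, y))  # radius is exactly 1 at every u
```

Because the geometry is carried by a handful of control points and weights rather than a dense mesh, a deforming body can be re-gridded each time step by moving control points and re-evaluating, which is the economy the abstract describes.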
Potential Flow Theory and Operation Guide for the Panel Code PMARC. Version 14
NASA Technical Reports Server (NTRS)
Ashby, Dale L.
1999-01-01
The theoretical basis for PMARC, a low-order panel code for modeling complex three-dimensional bodies in potential flow, is outlined. PMARC can be run on a wide variety of computer platforms, including desktop machines, workstations, and supercomputers. Execution times for PMARC vary tremendously depending on the computer resources used, but typically range from several minutes for simple or moderately complex cases to several hours for very large, complex cases. Several of the advanced features currently included in the code, such as internal flow modeling, boundary layer analysis, and time-dependent flow analysis, including problems involving relative motion, are discussed in some detail. The code is written in Fortran 77, using adjustable-size arrays so that it can be easily redimensioned to match problem requirements and computer hardware constraints. An overview of the program input is presented. A detailed description of the input parameters is provided in the appendices. PMARC results for several test cases are presented along with analytic or experimental data, where available. The input files for these test cases are given in the appendices. PMARC currently supports plotfile output formats for several commercially available graphics packages: Plot3D, Tecplot, and PmarcViewer.
NASA Technical Reports Server (NTRS)
Smith, Philip J.
1995-01-01
There are many problem-solving tasks that are too complex to fully automate given the current state of technology. Nevertheless, significant improvements in overall system performance could result from the introduction of well-designed computer aids. We have been studying the development of cognitive tools for one such problem-solving task, enroute flight path planning for commercial airlines. Our goal has been two-fold. First, we have been developing specific system designs to help with this important practical problem. Second, we have been using this context to explore general design concepts to guide in the development of cooperative problem-solving systems. These design concepts are described below, along with illustrations of their application.
ERIC Educational Resources Information Center
Cobarrubias, Juan, Ed.
Papers on Canadian language policy presented at an international conference include: "Language Policy in Canada: Current Issues" (Juan Cobarrubias); "Multiculturalism and Language Policy in Canada" (Jim Cummins, Harold Troper); "Defining Language Policy in a Nationalistic Milieu and in a Complex Industrialized…
Current Research in Land Use Impact Assessment
There is a continuing debate on how to best evaluate land use impacts within the LCA framework. While this problem is spatially and temporally complex, recent advances in tool development are providing options to allow a GIS-based analysis of various ecosystem services given the...
Making science high impact to inform decision-making: Using boundary objects for aquatic research
The St. Louis River represents a complex natural resource management problem. Current ecosystem management decisions must address extensive sediment remediation and habitat restoration goals for the lower river and associated port, as well as recreational users who value differen...
The JPL functional requirements tool
NASA Technical Reports Server (NTRS)
Giffin, Geoff; Skinner, Judith; Stoller, Richard
1987-01-01
Planetary spacecraft are complex vehicles which are built according to many thousands of requirements. Problems encountered in documenting and maintaining these requirements led to the current attempt to reduce or eliminate them with a computer-automated database, the Functional Requirements Tool. The tool, developed at JPL and in use on several JPL projects, is described. The organization and functionality of the Tool are presented, together with an explanation of the database inputs, their relationships, and their use. Methods of interfacing with external documents, representation of tables and figures, and methods of approval and change processing are discussed. The options available for disseminating information from the Tool are identified. The implementation of the Requirements Tool is outlined, and its operation is summarized. The conclusion drawn from this work is that the Requirements Tool is a useful addition to the system engineer's tool kit, that it is not currently available elsewhere, and that a clear development path exists to expand its capabilities to serve larger and more complex projects.
Etiology, Treatment and Prevention of Obesity in Childhood and Adolescence: A Decade in Review
Spruijt-Metz, Donna
2010-01-01
Childhood obesity has become an epidemic on a worldwide scale. This article gives an overview of the progress made in childhood and adolescent obesity research in the last decade, with a particular emphasis on the transdisciplinary and complex nature of the problem. The following topics are addressed: 1) current definitions of childhood and adolescent overweight and obesity; 2) demography of childhood and adolescent obesity both in the US and globally; 3) current topics in the physiology of fat and obesity; 4) psychosocial correlates of childhood and adolescent overweight and obesity; 5) the three major obesity-related behaviors, i.e. dietary intake, physical activity and sleep; 6) genetic components of childhood and adolescent obesity; 7) environment and childhood and adolescent obesity; and 8) progress in interventions to prevent and treat childhood obesity. The article concludes with recommendations for future research, including the need for large-scale, high dose and long-term interventions that take into account the complex nature of the problem. PMID:21625328
Gosselin, Pierre; Bélanger, Diane; Lapaige, Véronique; Labbé, Yolaine
2011-01-01
This paper presents a public health narrative on Quebec’s new climatic conditions and human health, and describes the transdisciplinary nature of the climate change adaptation research currently being adopted in Quebec, characterized by the three phases of problem identification, problem investigation, and problem transformation. A transdisciplinary approach is essential for dealing with complex ill-defined problems concerning human–environment interactions (for example, climate change), for allowing joint research, collective leadership, complex collaborations, and significant exchanges among scientists, decision makers, and knowledge users. Such an approach is widely supported in theory but has proved to be extremely difficult to implement in practice, and those who attempt it have met with heavy resistance, succeeding when they find the occasional opportunity within institutional or social contexts. In this paper we narrate the ongoing struggle involved in tackling the negative effects of climate change in multi-actor contexts at local and regional levels, a struggle that began in a quiet way in 1998. The paper will describe how public health adaptation research is supporting transdisciplinary action and implementation while also preparing for the future, and how this interaction to tackle a life-world problem (adaptation of the Quebec public health sector to climate change) in multi-actors contexts has progressively been established during the last 13 years. The first of the two sections introduces the social context of a Quebec undergoing climate changes. Current climatic conditions and expected changes will be described, and attendant health risks for the Quebec population. The second section addresses the scientific, institutional and normative dimensions of the problem. 
It corresponds to a “public health narrative” presented in three phases: (1) problem identification (1998–2002) beginning in northern Quebec; (2) problem investigation (2002–2006) in which the issues are successively explored, understood, and conceptualized for all of Quebec, and (3) problem transformation (2006–2009), which discusses major interactions among the stakeholders and the presentation of an Action Plan by a central actor, the Quebec government, in alliance with other stakeholders. In conclusion, we underline the importance, in the current context, of providing for a sustained transdisciplinary adaptation to climatic change. This paper should be helpful for (1) public health professionals confronted with establishing a transdisciplinary approach to a real-world problem other than climate change, (2) professionals in other sectors (such as public safety, built environment) confronted with climate change, who wish to implement transdisciplinary adaptive interventions and/or research, and (3) knowledge users (public and private actors; nongovernment organizations; citizens) from elsewhere in multi-contexts/environments/sectors who wish to promote complex collaborations (with us or not), collective leadership, and “transfrontier knowledge-to-action” for implementing climate change-related adaptation measures. PMID:21966228
[Problems of work world and its impact on health. Current financial crisis].
Tomasina, Fernando
2012-06-01
Health and work are complex, multifaceted processes that take many forms, and the two are linked and mutually influential. It follows that the world of work is extremely complex and heterogeneous: in it, "old" or traditional risks coexist with "modern" risks derived from new models of work organization and the incorporation of new technologies. Unemployment, precarious work relationships, and the outsourcing of work risks are results of neoliberal strategies. Some negative outcomes of the health-sickness process, derived from transformations in the world of work and from the current global economic crisis, are already evident in present working conditions. Finally, the need to rebuild policies that address this work-related situation is suggested.
The Proposal of the Model for Developing Dispatch System for Nationwide One-Day Integrative Planning
NASA Astrophysics Data System (ADS)
Kim, Hyun Soo; Choi, Hyung Rim; Park, Byung Kwon; Jung, Jae Un; Lee, Jin Wook
Dispatch planning for container trucks belongs to the class of pickup-and-delivery problems, which are highly complex because of the many real-world constraints they must satisfy. At present such plans are produced through the control system, so an automated planning system is required from the standpoint of nationwide integrative planning. The purpose of this study is therefore to propose a model for developing an automated dispatch system based on a constraint satisfaction formulation and a meta-heuristic algorithm. In further work, a practical system will be developed and evaluated against various results. The proposed model extends earlier studies by incorporating constraints they did not consider, which increases the complexity of the problem. It is also suggested that real-time monitoring of vehicles and cargo, based on information technology, should be added.
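The proposed combination of a constraint satisfaction formulation with a meta-heuristic search can be sketched on a toy instance. Everything below is illustrative, not the paper's system: four hypothetical pickup-and-delivery jobs, the precedence constraint that each pickup come before its delivery, and simulated annealing as a stand-in meta-heuristic.

```python
import math
import random

random.seed(0)

# Toy instance: job j = (pickup_xy, delivery_xy). One truck visits all stops.
jobs = [((0, 0), (5, 5)), ((1, 4), (6, 0)), ((3, 3), (0, 6)), ((7, 2), (2, 7))]
stops = [(j, k) for j in range(len(jobs)) for k in (0, 1)]  # k: 0=pickup, 1=delivery

def feasible(route):
    """Constraint check: a delivery may not precede its own pickup."""
    seen = set()
    for j, k in route:
        if k == 1 and (j, 0) not in seen:
            return False
        seen.add((j, k))
    return True

def cost(route):
    pts = [jobs[j][k] for j, k in route]
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

# Simulated annealing over feasible routes: swap two stops, reject the move
# if the precedence constraint breaks, accept worse moves with Boltzmann
# probability, and cool the temperature geometrically.
route = sorted(stops)          # pickup-then-delivery per job: a feasible start
best, T = list(route), 10.0
for _ in range(20000):
    a, b = random.sample(range(len(route)), 2)
    cand = list(route)
    cand[a], cand[b] = cand[b], cand[a]
    if not feasible(cand):
        continue
    d = cost(cand) - cost(route)
    if d < 0 or random.random() < math.exp(-d / T):
        route = cand
        if cost(route) < cost(best):
            best = list(route)
    T = max(0.01, T * 0.9995)
print(round(cost(sorted(stops)), 2), "->", round(cost(best), 2))
```

A production system would add time windows, vehicle capacities, and multiple trucks as further constraints, but the division of labor is the same: the constraint checker defines feasibility, and the meta-heuristic searches only within it.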
Aiding the search: Examining individual differences in multiply-constrained problem solving.
Ellis, Derek M; Brewer, Gene A
2018-07-01
Understanding and resolving complex problems is of vital importance in daily life. Problems can be defined by the limitations they place on the problem solver. Multiply-constrained problems are traditionally examined with the compound remote associates task (CRAT). Performance on the CRAT is partially dependent on an individual's working memory capacity (WMC). These findings suggest that executive processes are critical for problem solving and that there are reliable individual differences in multiply-constrained problem solving abilities. The goals of the current study are to replicate and further elucidate the relation between WMC and CRAT performance. To achieve these goals, we manipulated preexposure to CRAT solutions and measured WMC with complex-span tasks. In Experiment 1, we report evidence that preexposure to CRAT solutions improved problem solving accuracy, WMC was correlated with problem solving accuracy, and that WMC did not moderate the effect of preexposure on problem solving accuracy. In Experiment 2, we preexposed participants to correct and incorrect solutions. We replicated Experiment 1 and found that WMC moderates the effect of exposure to CRAT solutions such that high WMC participants benefit more from preexposure to correct solutions than low WMC (although low WMC participants have preexposure benefits as well). Broadly, these results are consistent with theories of working memory and problem solving that suggest a mediating role of attention control processes. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Lemons, John
1995-03-01
Problems of sustainable development and environmental protection pose a challenge to humanity unprecedented in scope and complexity. Whether and how the problems are resolved have significant implications for human and ecological well-being. In this paper, I discuss briefly recent international recommendations to promote sustainable development and environmental protection. I then offer a perspective on the roles and prospects of the university in promoting sustainable development and environmental protection.
Best Practices In Overset Grid Generation
NASA Technical Reports Server (NTRS)
Chan, William M.; Gomez, Reynaldo J., III; Rogers, Stuart E.; Buning, Pieter G.; Kwak, Dochan (Technical Monitor)
2002-01-01
Grid generation for overset grids on complex geometry can be divided into four main steps: geometry processing, surface grid generation, volume grid generation and domain connectivity. For each of these steps, the procedures currently practiced by experienced users are described. Typical problems encountered are also highlighted and discussed. Most of the guidelines are derived from experience on a variety of problems including space launch and return vehicles, subsonic transports with propulsion and high lift devices, supersonic vehicles, rotorcraft vehicles, and turbomachinery.
Young, Kristie L; Salmon, Paul M
2015-01-01
Distracted driving is acknowledged universally as a large and growing road safety problem. Compounding the problem is that distracted driving is a complex, multifaceted issue influenced by a multitude of factors, organisations and individuals. As such, management of the problem is not straightforward. Numerous countermeasures have been developed and implemented across the globe. The vast majority of these measures have derived from the traditional reductionist, driver-centric approach to distraction and have failed to fully reflect the complex mix of actors and components that give rise to drivers becoming distracted. An alternative approach that is gaining momentum in road safety is the systems approach, which considers all components of the system and their interactions as an integrated whole. In this paper, we review the current knowledge base on driver distraction and argue that the systems approach is not currently being realised in practice. Adopting a more holistic, systems approach to distracted driving will not only improve existing knowledge and interventions from the traditional approach, but will enhance our understanding and management of distraction by considering the complex relationships and interactions of the multiple actors and the myriad sources, enablers and interventions that make up the distracted driving system. It is only by recognising and understanding how all of the system components work together to enable distraction to occur, that we can start to work on solutions to help mitigate the occurrence and consequences of distracted driving. Copyright © 2014 Elsevier Ltd. All rights reserved.
Institutional Problems and Solutions of General Education in Chinese Universities
ERIC Educational Resources Information Center
Meng, Weiqing; Huang, Wei
2018-01-01
Embedding general education in the Chinese university education system is a considerably complex systemic project, and a lack of institutional arrangements beneficial to general education has always been a key barrier in implementation. Currently, the main institutional restricting factors for university general education include substantial…
Teacher Competency: A Public Farce!
ERIC Educational Resources Information Center
Weitman, Catheryn J.
The current popularity of teacher testing allows for content, criterion, and construct validity to be assessed, as pertaining to achievement levels on basic knowledge examinations. Teacher competency is a complex issue that is inaccurately confused with or identified as measures derived from academic testing. The problems in addressing the…
Direct Maximization of Protein Identifications from Tandem Mass Spectra*
Spivak, Marina; Weston, Jason; Tomazela, Daniela; MacCoss, Michael J.; Noble, William Stafford
2012-01-01
The goal of many shotgun proteomics experiments is to determine the protein complement of a complex biological mixture. For many mixtures, most methodological approaches fall significantly short of this goal. Existing solutions to this problem typically subdivide the task into two stages: first identifying a collection of peptides with a low false discovery rate and then inferring from the peptides a corresponding set of proteins. In contrast, we formulate the protein identification problem as a single optimization problem, which we solve using machine learning methods. This approach is motivated by the observation that the peptide and protein level tasks are cooperative, and the solution to each can be improved by using information about the solution to the other. The resulting algorithm directly controls the relevant error rate, can incorporate a wide variety of evidence and, for complex samples, provides 18–34% more protein identifications than current state-of-the-art approaches. PMID:22052992
Finding order in complexity: themes from the career of Dr. Robert F. Wagner
NASA Astrophysics Data System (ADS)
Myers, Kyle J.
2009-02-01
Over the course of his long and productive career, Dr. Robert F. Wagner built a framework for the evaluation of imaging systems based on a task-based, decision theoretic approach. His most recent contributions involved the consideration of the random effects associated with multiple readers of medical images and the logical extension of this work to the problem of the evaluation of multiple competing classifiers in statistical pattern recognition. This contemporary work expanded on familiar themes from Bob's many SPIE presentations in earlier years. It was driven by the need for practical solutions to current problems facing FDA's Center for Devices and Radiological Health and the medical imaging community regarding the assessment of new computer-aided diagnosis tools and Bob's unique ability to unify concepts across a range of disciplines as he gave order to increasingly complex problems in our field.
Genetic Diversity in the Paramecium aurelia Species Complex
Catania, Francesco; Wurmser, François; Potekhin, Alexey A.; Przyboś, Ewa; Lynch, Michael
2009-01-01
Current understanding of the population genetics of free-living unicellular eukaryotes is limited, and the amount of genetic variability in these organisms is still a matter of debate. We characterized—reproductively and genetically—worldwide samples of multiple Paramecium species belonging to a cryptic species complex, Paramecium aurelia, whose species have been shown to be reproductively isolated. We found that levels of genetic diversity both in the nucleus and in the mitochondrion are substantial within groups of reproductively compatible P. aurelia strains but drop considerably when strains are partitioned according to their phylogenetic groupings. Our study reveals the existence of discrepancies between the mating behavior of a number of P. aurelia strains and their multilocus genetic profile, a controversial finding that has major consequences for both the current methods of species assignment and the species problem in the P. aurelia complex. PMID:19023087
SEU System Analysis: Not Just the Sum of All Parts
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; Label, Kenneth
2014-01-01
Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component level partitioning and then either: the most dominant SEU cross-sections (SEUs) are used in system error rate calculations; or the partition SEUs are summed to eventually obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect the following: cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved with our current scheme of SEU analysis for complex systems; and to provide alternative methods for improvement.
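The overestimation described above is easy to see numerically: summing partition cross-sections ignores system-level derating factors such as logic masking or duty cycle. A toy sketch with invented numbers (no real device data is assumed):

```python
# Toy illustration: summing partition SEU cross-sections overestimates the
# system error rate when system-level derating (e.g. logic masking, duty
# cycle) is ignored. All numbers here are invented for illustration.

flux = 1.0e3  # particles / (cm^2 * day), hypothetical environment

partitions = [
    # (cross-section in cm^2/device, derating factor: fraction of upsets
    #  that actually propagate to a system-level error)
    (4.0e-7, 0.10),
    (1.5e-7, 0.50),
    (2.5e-7, 0.05),
]

naive_rate = flux * sum(sigma for sigma, _ in partitions)
derated_rate = flux * sum(sigma * d for sigma, d in partitions)

print(f"naive:   {naive_rate:.2e} errors/day")
print(f"derated: {derated_rate:.2e} errors/day")
print(f"overestimate factor: {naive_rate / derated_rate:.1f}")
```

With these hypothetical derating factors the summed rate overstates the system rate by more than a factor of six, which is the kind of gap that drives overdesign.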
Putting the puzzle together: the role of ‘problem definition’ in complex clinical judgement
Cristancho, Sayra; Lingard, Lorelei; Forbes, Thomas; Ott, Michael; Novick, Richard
2017-01-01
CONTEXT We teach judgement in pieces; that is, we talk about each aspect separately (patient, plan, resources, technique, etc.). We also let trainees figure out how to put the pieces together. In complex situations, this might be problematic. Using data from a drawing-based study on surgeons’ experiences with complex situations, we explore the notion of ‘problem definition’ in real-world clinical judgement using the theoretical lens of systems engineering. METHODS ‘Emergence’, the sensitising concept for analysis, is rooted in two key systems premises: that person and context are inseparable and that what emerges is an act of choice. Via a ‘gallery walk’ we used these premises to perform analysis on individual drawings as well as cross-comparisons of multiple drawings. Our focus was to understand similarities and differences among the vantage points used by multiple surgeons. RESULTS In this paper we challenge two assumptions from current models of clinical judgement: that experts hold a fixed and static definition of the problem and that consequently the focus of the expert’s work is on solving the problem. Each situation described by our participants revealed different but complementary perspectives of what a surgical problem might come to be: from concerns about ensuring standard of care, to balancing personal emotions versus care choices, to coordinating resources, and to maintaining control while in the midst of personality clashes. CONCLUSION We suggest that it is only at the situation and system level, not at the individual level, that we are able to appreciate the nuances of defining the problem when experts make judgements during real-world complex situations. PMID:27943366
Karl Tennant
1989-01-01
Diverse problems confront the forest manager when planting bottomland hardwoods. Bottomland vegetation types and sites are complex and differ markedly from uplands. There are different and more numerous hardwood species that grow faster in denser stands. Sites are subject to varying intensities and duration of flooding and the action of overflow river currents that...
48 CFR 19.804-1 - Agency evaluation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... items or work similar in nature and complexity to that specified in the business plan; (c) Problems... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Contracting With the Small Business Administration (the 8(a... support of the 8(a) Program, the agency should evaluate— (a) Its current and future plans to acquire the...
Assessing the Complexity of Students' Knowledge in Chemistry
ERIC Educational Resources Information Center
Bernholt, Sascha; Parchmann, Ilka
2011-01-01
Current reforms in the education policy of various countries are intended to produce a paradigm shift in the educational system towards an outcome orientation. After implementing educational standards as normative objectives, the development of test procedures that adequately reflect these targets and standards is a central problem. This paper…
Governance and Funding of Higher Education in Germany.
ERIC Educational Resources Information Center
Hufner, Klaus
2003-01-01
Describes the complex functioning of decision making in relation to legal, administrative, planning, and financial matters in Germany, examining the current increase in privatization of higher education and the ensuing legal and financial problems, and discussing the introduction of new funding schemes based on performance indicators which augur…
Peri-viable birth: legal considerations.
Sayeed, Sadath A
2014-02-01
Peri-viable birth raises an array of complex moral and legal concerns. This article discusses the problem with defining viability, touches on its relationship to abortion jurisprudence, and analyzes a few interesting normative implications of current medical practice at the time of peri-viable birth. Copyright © 2014 Elsevier Inc. All rights reserved.
Critical Thinking Assessment across Four Sustainability-Related Experiential Learning Settings
ERIC Educational Resources Information Center
Heinrich, William F.; Habron, Geoffrey B.; Johnson, Heather L.; Goralnik, Lissy
2015-01-01
Today's complex societal problems require both critical thinking and an engaged citizenry. Current practices in higher education, such as service learning, suggest that experiential learning can serve as a vehicle to encourage students to become engaged citizens. However, critical thinking is not necessarily a part of every experiential learning…
Bunch Splitting Simulations for the JLEIC Ion Collider Ring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Satogata, Todd J.; Gamage, Randika
2016-05-01
We describe the bunch splitting strategies for the proposed JLEIC ion collider ring at Jefferson Lab. This complex requires an unprecedented 9:6832 bunch splitting, performed in several stages. We outline the problem and current results, optimized with ESME including general parameterization of 1:2 bunch splitting for JLEIC parameters.
NASA Technical Reports Server (NTRS)
Vakil, Sanjay S.; Hansman, R. John
2000-01-01
Autoflight systems in the current generation of aircraft have been implicated in several recent incidents and accidents. A contributory aspect to these incidents may be the manner in which aircraft transition between differing behaviours or 'modes.' The current state of aircraft automation was investigated and the incremental development of the autoflight system was tracked through a set of aircraft to gain insight into how these systems developed. This process appears to have resulted in a system without a consistent global representation. In order to evaluate and examine autoflight systems, a 'Hybrid Automation Representation' (HAR) was developed. This representation was used to examine several specific problems known to exist in aircraft systems. Cyclomatic complexity is an analysis tool from computer science which counts the number of linearly independent paths through a program graph. This approach was extended to examine autoflight mode transitions modelled with the HAR. A survey was conducted of pilots to identify those autoflight mode transitions which airline pilots find difficult. The transitions identified in this survey were analyzed using cyclomatic complexity to gain insight into the apparent complexity of the autoflight system from the perspective of the pilot. Mode transitions which had been identified as complex by pilots were found to have a high cyclomatic complexity. Further examination was made into a set of specific problems identified in aircraft: the lack of a consistent representation of automation, concern regarding appropriate feedback from the automation, and the implications of physical limitations on the autoflight systems. Mode transitions involved in changing to and leveling at a new altitude were identified across multiple aircraft by numerous pilots. Where possible, evaluation and verification of the behaviour of these autoflight mode transitions was investigated via aircraft-specific high fidelity simulators. 
Three solution approaches to concerns regarding autoflight systems, and mode transitions in particular, are presented in this thesis. The first is to use training to modify pilot behaviours, or procedures to work around known problems. The second approach is to mitigate problems by enhancing feedback. The third approach is to modify the process by which automation is designed. The Operator Directed Process forces the consideration and creation of an automation model early in the design process for use as the basis of the software specification and training.
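The cyclomatic complexity measure used in the study above is computable directly from a transition graph: M = E − N + 2P, where E is the number of edges, N the number of nodes, and P the number of connected components. A minimal sketch follows; the mode-transition graph below is a hypothetical illustration, not taken from any specific autoflight system:

```python
# Cyclomatic complexity M = E - N + 2P for a mode-transition graph.
# The example graph is hypothetical, not drawn from a real autoflight system.

def cyclomatic_complexity(edges):
    nodes = {n for e in edges for n in e}
    # Count weakly connected components with a union-find over the nodes.
    parent = {n: n for n in nodes}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n
    for a, b in edges:
        parent[find(a)] = find(b)
    components = len({find(n) for n in nodes})
    return len(edges) - len(nodes) + 2 * components

# Hypothetical altitude-change mode transitions.
edges = [
    ("ALT_HOLD", "VS"), ("VS", "ALT_CAPTURE"),
    ("ALT_CAPTURE", "ALT_HOLD"), ("VS", "FLCH"),
    ("FLCH", "ALT_CAPTURE"),
]
print(cyclomatic_complexity(edges))  # 5 edges - 4 nodes + 2*1 = 3
```

A transition set with more interconnections (more edges relative to nodes) scores higher, matching the intuition that densely cross-linked mode logic is harder for pilots to track.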
NASA Astrophysics Data System (ADS)
Mandrà, Salvatore; Giacomo Guerreschi, Gian; Aspuru-Guzik, Alán
2016-07-01
We present an exact quantum algorithm for solving the Exact Satisfiability problem, which belongs to the important NP-complete complexity class. The algorithm is based on an intuitive approach that can be divided into two parts: the first step consists in the identification and efficient characterization of a restricted subspace that contains all the valid assignments of the Exact Satisfiability problem, while the second part performs a quantum search in that restricted subspace. The quantum algorithm can be used either to find a valid assignment (or to certify that no solution exists) or to count the total number of valid assignments. The worst-case query complexities are bounded by O(√(2^(n−M′))) and O(2^(n−M′)), respectively, where n is the number of variables and M′ the number of linearly independent clauses. Remarkably, the proposed quantum algorithm turns out to be faster than any known exact classical algorithm for solving dense formulas of Exact Satisfiability. As a concrete application, we provide the worst-case complexity for the Hamiltonian cycle problem obtained after mapping it to a suitable Occupation problem. Specifically, we show that the time complexity for the proposed quantum algorithm is bounded by O(2^(n/4)) for 3-regular undirected graphs, where n is the number of nodes. The same worst-case complexity holds for (3,3)-regular bipartite graphs. As a reference, the current best classical algorithm has a (worst-case) running time bounded by O(2^(31n/96)). Finally, when compared to heuristic techniques for Exact Satisfiability problems, the proposed quantum algorithm is faster than the classical WalkSAT and Adiabatic Quantum Optimization for random instances with a density of constraints close to the satisfiability threshold, the regime in which instances are typically the hardest to solve.
The proposed quantum algorithm can be straightforwardly extended to the generalized version of the Exact Satisfiability known as Occupation problem. The general version of the algorithm is presented and analyzed.
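The quoted worst-case bounds for the Hamiltonian cycle mapping can be compared numerically. This is pure arithmetic on the stated exponents, nothing about the algorithms themselves:

```python
# Compare the quoted worst-case bounds for Hamiltonian cycle on 3-regular
# graphs: quantum O(2^(n/4)) versus the best classical O(2^(31n/96)).

def quantum_bound(n):
    return 2 ** (n / 4)

def classical_bound(n):
    return 2 ** (31 * n / 96)

n = 96  # number of nodes, chosen so both exponents are integers
print(quantum_bound(n))                       # 2^24
print(classical_bound(n))                     # 2^31
print(classical_bound(n) / quantum_bound(n))  # 2^7 = 128.0
```

The classical exponent 31n/96 exceeds the quantum exponent n/4 = 24n/96, so the separation grows exponentially with n (a factor of 2^(7n/96)).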
Atomic switch networks as complex adaptive systems
NASA Astrophysics Data System (ADS)
Scharnhorst, Kelsey S.; Carbajal, Juan P.; Aguilera, Renato C.; Sandouk, Eric J.; Aono, Masakazu; Stieg, Adam Z.; Gimzewski, James K.
2018-03-01
Complexity is an increasingly crucial aspect of societal, environmental and biological phenomena. Using a dense unorganized network of synthetic synapses it is shown that a complex adaptive system can be physically created on a microchip built especially for complex problems. These neuro-inspired atomic switch networks (ASNs) are a dynamic system with inherent and distributed memory, recurrent pathways, and up to a billion interacting elements. We demonstrate key parameters describing self-organized behavior such as non-linearity, power law dynamics, and multistate switching regimes. Device dynamics are then investigated using a feedback loop which provides control over current and voltage power-law behavior. Wide ranging prospective applications include understanding and eventually predicting future events that display complex emergent behavior in the critical regime.
Research on aviation fuel instability
NASA Technical Reports Server (NTRS)
Baker, C. E.; Bittker, D. A.; Cohen, S. M.; Seng, G. T.
1984-01-01
The problems associated with aircraft fuel instability are discussed. What is currently known about the problem is reviewed and a research program to identify those areas where more research is needed is discussed. The term fuel instability generally refers to the gums, sediments, or deposits which can form as a result of a set of complex chemical reactions when a fuel is stored for a long period at ambient conditions or when the fuel is thermally stressed inside the fuel system of an aircraft.
Teaching NMR spectra analysis with nmr.cheminfo.org.
Patiny, Luc; Bolaños, Alejandro; Castillo, Andrés M; Bernal, Andrés; Wist, Julien
2018-06-01
Teaching spectra analysis and structure elucidation requires students to get trained on real problems. This involves solving exercises of increasing complexity and, when necessary, using computational tools. Although desktop software packages exist for this purpose, the nmr.cheminfo.org platform offers students an online alternative. It provides a set of exercises and tools to help solve them. Only a small number of exercises are currently available, but contributors are invited to submit new ones and suggest new types of problems. Copyright © 2018 John Wiley & Sons, Ltd.
Adaptivity and smart algorithms for fluid-structure interaction
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley
1990-01-01
This paper reviews new approaches in CFD which have the potential for significantly increasing current capabilities of modeling complex flow phenomena and of treating difficult problems in fluid-structure interaction. These approaches are based on the notions of adaptive methods and smart algorithms, which use instantaneous measures of the quality and other features of the numerical flowfields as a basis for making changes in the structure of the computational grid and of algorithms designed to function on the grid. The application of these new techniques to several problem classes are addressed, including problems with moving boundaries, fluid-structure interaction in high-speed turbine flows, flow in domains with receding boundaries, and related problems.
Optimisation by hierarchical search
NASA Astrophysics Data System (ADS)
Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias
2015-03-01
Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
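The grouped-variable idea in the abstract above can be illustrated on a toy cost function: optimise one block of variables exhaustively while the rest stay frozen, then move to the next block. The cost function and block scheme below are illustrative assumptions, not the authors' algorithm:

```python
# Block-wise (hierarchical) search on a toy cost function: optimise each
# group of binary variables exhaustively while the other groups are frozen.
from itertools import product

def cost(x):
    # Toy rugged cost: count of disagreements between neighbours on a ring.
    n = len(x)
    return sum(x[i] != x[(i + 1) % n] for i in range(n))

def optimise_block(x, block):
    best = list(x)
    for assignment in product([0, 1], repeat=len(block)):
        trial = list(x)
        for idx, val in zip(block, assignment):
            trial[idx] = val
        if cost(trial) < cost(best):
            best = trial
    return best

x = [0, 1, 0, 1, 0, 1]                # poor initial assignment: cost 6
for block in ([0, 1, 2], [3, 4, 5]):  # two hierarchical variable groups
    x = optimise_block(x, block)
print(x, cost(x))  # reaches the uniform assignment with cost 0
```

Each block is small enough to search exhaustively even though the full 2^6 space is not (at this toy scale it would be, but the point is that per-block cost grows as 2^k for block size k, not 2^n).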
[Questions concerning humanitarian action].
Simonnot, C
2002-01-01
Although the development of humanitarian action is rooted in historical events, the dynamics behind today's international relief organizations can only be understood within the context of the modern world. Relief organizations are currently confronted with major challenges and paradoxes. The challenges include the need to enhance professionalization and standardization of assistance operations and exposure to greater risks. The paradoxes involve the need to implement complex, highly publicized programs in a simplistic manner and the problems involved in managing the complex relationship between relief workers and victims, a relationship coloured by the perceived omnipotence of the relief actors.
NASA Astrophysics Data System (ADS)
Szczesniak, Dominik
Recently, monolayer transition metal dichalcogenides have attracted much attention due to their potential use in both nano- and opto-electronics. In such applications, the electronic and transport properties of group-VIB transition metal dichalcogenides (MX2, where M=Mo, W; X=S, Se, Te) are particularly important. Herein, new insight into these properties is presented by studying the complex band structures (CBS's) of MX2 monolayers while accounting for spin-orbit coupling effects. By using the symmetry-based tight-binding model a nonlinear generalized eigenvalue problem for CBS's is obtained. An efficient method for solving this class of problems is presented and gives a complete set of physically relevant solutions. Next, these solutions are characterized and classified into propagating and evanescent states, where the latter states present not only monotonic but also oscillatory decay character. It is observed that some of the oscillatory evanescent states create characteristic complex loops at the direct band gaps, which describe the tunneling currents in the MX2 materials. The importance of CBS's and tunneling currents is demonstrated by the analysis of the quantum transport across MX2 monolayers within phase field matching theory. Present work has been prepared within the Qatar Energy and Environment Research Institute (QEERI) grand challenge ATHLOC project (Project No. QEERI-GC-3008).
Hollaus, K; Magele, C; Merwa, R; Scharfetter, H
2004-02-01
Magnetic induction tomography of biological tissue is used to reconstruct the changes in the complex conductivity distribution by measuring the perturbation of an alternating primary magnetic field. To facilitate the sensitivity analysis and the solution of the inverse problem a fast calculation of the sensitivity matrix, i.e. the Jacobian matrix, which maps the changes of the conductivity distribution onto the changes of the voltage induced in a receiver coil, is needed. The use of finite differences to determine the entries of the sensitivity matrix does not represent a feasible solution because of the high computational costs of the basic eddy current problem. Therefore, the reciprocity theorem was exploited. The basic eddy current problem was simulated by the finite element method using symmetric tetrahedral edge elements of second order. To test the method various simulations were carried out and discussed.
Practical ethical theory for nurses responding to complexity in care.
Fairchild, Roseanne Moody
2010-05-01
In the context of health care system complexity, nurses need responsive leadership and organizational support to maintain intrinsic motivation, moral sensitivity and a caring stance in the delivery of patient care. The current complexity of nurses' work environment promotes decreases in work motivation and moral satisfaction, thus creating motivational and ethical dissonance in practice. These and other work-related factors increase emotional stress and burnout for nurses, prompting both new and seasoned nurse professionals to leave their current position, or even the profession. This article presents a theoretical conceptual model for professional nurses to review and make sense of the ethical reasoning skills needed to maintain a caring stance in relation to the competing values that must coexist among nurses, health care administrators, patients and families in the context of the complex health care work environments in which nurses are expected to practice. A model, Nurses' Ethical Reasoning Skills, is presented as a framework for nurses to think through and solve ethical issues in clinical practice in the context of complexity in health care.
Effect of Case-Based Video Support on Cyberbullying Awareness
ERIC Educational Resources Information Center
Akbulut, Yavuz
2014-01-01
When it comes to safe and ethical information technology use, cyberbullying stands out. Indeed, it is seen to be a prevalent and complex problem. Prevention suggestions tend to rely on implications of descriptive and correlational studies rather than true experimental works. In this regard, the current study investigated the effect of case-based…
Measurement of Cruelty in Children: The Cruelty to Animals Inventory
ERIC Educational Resources Information Center
Dadds, Mark R.; Whiting, Clare; Bunn, Paul; Fraser, Jennifer A.; Charlson, Juliana H.; Pirola-Merlo, Andrew
2004-01-01
Cruelty to animals may be a particularly pernicious aspect of problematic child development. Progress in understanding the development of the problem is limited due to the complex nature of cruelty as a construct, and limitations with current assessment measures. The Children and Animals Inventory (CAI) was developed as a brief self- and…
ERIC Educational Resources Information Center
Ogborn, Jon
2004-01-01
"Soft matter" is a lively current field of research, looking at fundamental theoretical questions about the structure and behaviour of complex forms of matter, and at very practical problems of, for example, improving the performance of glues or the texture of ice cream. Foodstuffs provide an excellent way in to this modern topic, which lies on…
CC-LR: Providing Interactive, Challenging and Attractive Collaborative Complex Learning Resources
ERIC Educational Resources Information Center
Caballé, S.; Mora, N.; Feidakis, M.; Gañán, D.; Conesa, J.; Daradoumis, T.; Prieto, J.
2014-01-01
Many researchers argue that students must be meaningfully engaged in the learning resources for effective learning to occur. However, current online learners still report a problematic lack of attractive and challenging learning resources that engage them in the learning process. This endemic problem is even more evident in online collaborative…
Towards Sustainable National Development through Well Managed Early Childhood Education
ERIC Educational Resources Information Center
Abraham, Nath M.
2012-01-01
This paper discusses issues relating to sustainable development and effective management of early childhood education. The child is the "owner" of the future. The problems that confront the current generation are so complex and serious that they cannot be addressed in the same way they were created. But they can be addressed. The concept of…
Psychological Characteristics of Adolescents Orphans with Different Experience of Living in a Family
ERIC Educational Resources Information Center
Shulga, Tatyana I.; Savchenko, Daria D.; Filinkova, Evgeniya B.
2016-01-01
The complexity of settling adolescent orphans in foster families and the significant number of break-downs in these families are the problems which determine the relevance of the current research. Many adolescent orphans end up in social institutions repeatedly, because their psychological features lead to difficulties that their foster parents are unable…
The Design, Pedagogy and Practice of an Integrated Public Affairs Leadership Course
ERIC Educational Resources Information Center
Sandfort, Jodi; Gerdes, Kevin
2017-01-01
Current world events demand public affairs leadership training that generates among professionals a sense of capability, agency, and responsibility to engage in complex public problems. In this paper, we describe a unique course operated in the US focused on achieving these learning outcomes. It uses an unconventional schedule and course design…
Learning To Love the Swamp: Reshaping Education for Public Service.
ERIC Educational Resources Information Center
Schall, Ellen
1996-01-01
The world of public service is compared to a swamp in which important, complex, and messy problems are addressed, and it is argued that graduate and professional education must be reshaped to produce leaders who can make sense of current challenges. Education that is more experiential, behavioral, interactive, and collectively oriented is…
Refocusing the Vision: The Future of Instructional Technology
ERIC Educational Resources Information Center
Pence, Harry E.; McIntosh, Steven
2011-01-01
Two decades ago, many campuses mobilized a major effort to deal with a clear problem: faculty and students needed access to desktop computing technologies. Now the situation is much more complex. Responding to the current challenges, like mobile computing and social networking, will be more difficult but equally important. There is a clear need for…
Current Practice in Psychopharmacology for Children and Adolescents with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Floyd, Elizabeth Freeman; McIntosh, David E.
2009-01-01
Autism spectrum disorders (ASDs) are a complex group of neurodevelopmental conditions that develop in early childhood and involve a range of impairments in core areas of social interaction, communication, and restricted behavior and interests. Associated behavioral problems such as tantrums, aggression, and self-injury frequently compound the core…
Presentation and management of chronic pain.
Rajapakse, Dilini; Liossi, Christina; Howard, Richard F
2014-05-01
Chronic pain is an important clinical problem affecting significant numbers of children and their families. The severity and impact of chronic pain on everyday function is shaped by the complex interaction of biological, psychological and social factors that determine the experience of pain for each individual, rather than being a straightforward reflection of the severity of disease or extent of tissue damage. In this article we present the research findings that strongly support a biopsychosocial concept of chronic pain, describe the current best evidence for management strategies, and suggest a common general pathway for all types of chronic pain. The principles of management of some of the most important or frequently encountered chronic pain problems in paediatric practice are also described: neuropathic pain, complex regional pain syndrome (CRPS), musculoskeletal pain, abdominal pain and headache.
Vidor, Emmanuel; Soubeyrand, Benoit
2016-12-01
The manufacture of DTP-backboned combination vaccines is complex, and vaccine quality is evaluated by both batch composition and conformance of manufacturing history. Since their first availability, both the manufacturing regulations for DTP combination vaccines and the demand for them have evolved significantly. This has resulted in a constant need to modify manufacturing and quality control processes. Areas covered: Regulations that govern the manufacture of complex vaccines can be inconsistent between countries and need to be aligned with the regulatory requirements that apply in all countries of distribution. Changes in product mix and quantities can lead to uncertainty in vaccine supply maintenance. These problems are discussed in the context of the importance of these products as essential public health tools. Expert commentary: Increasing global demand for complex vaccines has led to problems in supply due to intrinsically complex manufacturing and regulatory procedures. Vaccine manufacturers are fully engaged in the resolution of these challenges, but changes in demand currently need to be anticipated approximately three years in advance due to long production cycle times.
Invariant Manifolds, the Spatial Three-Body Problem and Space Mission Design
NASA Technical Reports Server (NTRS)
Gomez, G.; Koon, W. S.; Lo, Martin W.; Marsden, J. E.; Masdemont, J.; Ross, S. D.
2001-01-01
The invariant manifold structures of the collinear libration points for the spatial restricted three-body problem provide the framework for understanding complex dynamical phenomena from a geometric point of view. In particular, the stable and unstable invariant manifold 'tubes' associated with libration point orbits are the phase space structures that provide a conduit for orbits between primary bodies for separate three-body systems. These invariant manifold tubes can be used to construct new spacecraft trajectories, such as a 'Petit Grand Tour' of the moons of Jupiter. Previous work focused on the planar circular restricted three-body problem. The current work extends the results to the spatial case.
Cirrus microphysics and radiative transfer: Cloud field study on October 28, 1986
NASA Technical Reports Server (NTRS)
Kinne, Stefan; Ackerman, Thomas P.; Heymsfield, Andrew J.; Valero, Francisco P. J.; Sassen, Kenneth; Spinhirne, James D.
1990-01-01
The radiative properties of cirrus clouds present one of the unresolved problems in weather and climate research. Uncertainties in ice particle amount and size, together with the general inability to model the single-scattering properties of their usually complex particle shapes, prevent accurate model predictions. For an improved understanding of cirrus radiative effects, field experiments, such as those of the Cirrus IFO of FIRE, are necessary. Simultaneous measurements of radiative fluxes and cirrus microphysics at multiple cirrus cloud altitudes allow calculated vertical flux profiles to be pitted against measured ones, with the potential to judge current cirrus cloud modeling. Most of the problems in this study are linked to the inhomogeneity of the cloud field. Thus, only studies of more homogeneous cirrus cloud cases promise a way to improve current cirrus parameterizations. Still, the current inability to detect small ice particles will remain a considerable handicap.
Home Health Nurse Collaboration in the Medical Neighborhood of Children with Medical Complexity.
Nageswaran, Savithri; Golden, Shannon L
2016-10-01
The objectives of this study were to describe how home healthcare nurses collaborate with other clinicians caring for children with medical complexity, and identify barriers to collaboration within the medical neighborhood. Using qualitative data obtained from 20 semistructured interviews (15 English, 5 Spanish) with primary caregivers of children with medical complexity and 18 home healthcare nurses, researchers inquired about experiences with home healthcare nursing services for these children. During an iterative analysis process, recurrent themes were identified by their prevalence and salience in the data. Home healthcare nurses collaborate with many providers within the medical neighborhood of children with medical complexity and perform many different collaborative tasks. This collaboration is valued by caregivers and nurses, but is inconsistent. Home healthcare nurses' communication with other clinicians is important to the delivery of good-quality care to children with medical complexity at home, but is not always present. Home healthcare nurses reported inability to share clinical information with other clinicians, not receiving child-specific information, and lack of support for clinical problem-solving as concerns. Barriers for optimal collaboration included lack of preparedness of parents, availability of physicians for clinical support, reimbursement for collaborative tasks, variability in home healthcare nurses' tasks, and problems at nursing agency level. Home healthcare nurses' collaboration with other clinicians is important, but problems exist in the current system of care. Optimizing collaboration between home healthcare nurses and other clinicians will likely have a positive impact on these children and their families.
NASA Astrophysics Data System (ADS)
Plattner, A.; Maurer, H. R.; Vorloeper, J.; Dahmen, W.
2010-08-01
Despite the ever-increasing power of modern computers, realistic modelling of complex 3-D earth models is still a challenging task and requires substantial computing resources. The overwhelming majority of current geophysical modelling approaches includes either finite difference or non-adaptive finite element algorithms and variants thereof. These numerical methods usually require the subsurface to be discretized with a fine mesh to accurately capture the behaviour of the physical fields. However, this may result in excessive memory consumption and computing times. A common feature of most of these algorithms is that the modelled data discretizations are independent of the model complexity, which may be wasteful when there are only minor to moderate spatial variations in the subsurface parameters. Recent developments in the theory of adaptive numerical solvers have the potential to overcome this problem. Here, we consider an adaptive wavelet-based approach that is applicable to a large range of problems, also including nonlinear problems. In comparison with earlier applications of adaptive solvers to geophysical problems we employ here a new adaptive scheme whose core ingredients arose from a rigorous analysis of the overall asymptotically optimal computational complexity, including in particular, an optimal work/accuracy rate. Our adaptive wavelet algorithm offers several attractive features: (i) for a given subsurface model, it allows the forward modelling domain to be discretized with a quasi minimal number of degrees of freedom, (ii) sparsity of the associated system matrices is guaranteed, which makes the algorithm memory efficient and (iii) the modelling accuracy scales linearly with computing time. We have implemented the adaptive wavelet algorithm for solving 3-D geoelectric problems. To test its performance, numerical experiments were conducted with a series of conductivity models exhibiting varying degrees of structural complexity. 
Results were compared with those of a non-adaptive finite element algorithm, which incorporates an unstructured mesh to best fit subsurface boundaries. Such algorithms represent the current state of the art in geoelectric modelling. An analysis of the numerical accuracy as a function of the number of degrees of freedom revealed that the adaptive wavelet algorithm outperforms the finite element solver for simple and moderately complex models, whereas the results become comparable for models with high spatial variability of electrical conductivities. The linear dependence between the modelling error and the computing time proved to be model-independent. This feature will allow very efficient computations on large-scale models as soon as our experimental code is optimized in terms of its implementation.
The current state of drug discovery and a potential role for NMR metabolomics.
Powers, Robert
2014-07-24
The pharmaceutical industry has significantly contributed to improving human health. Drugs have been credited with both increasing life expectancy and decreasing health care costs. Unfortunately, there has been a recent decline in the creativity and productivity of the pharmaceutical industry. This is a complex issue with many contributing factors resulting from the numerous mergers, the increase in out-sourcing, and the heavy dependency on high-throughput screening (HTS). While a simple solution to such a complex problem is unrealistic and highly unlikely, the inclusion of metabolomics as a routine component of the drug discovery process may provide some solutions to these problems. Specifically, as the binding affinity of a chemical lead is evolved during the iterative structure-based drug design process, metabolomics can provide feedback on the selectivity and the in vivo mechanism of action. Similarly, metabolomics can be used to evaluate and validate HTS leads. In effect, metabolomics can be used to eliminate compounds with potential efficacy and side-effect problems while prioritizing well-behaved leads with druglike characteristics.
Current trends in geomathematics
Griffiths, J.C.
1970-01-01
Geoscience has extended its role and improved its applications through the development of geophysics since the nineteen-thirties and geochemistry since the nineteen-fifties; now, in the late nineteen-sixties, a new synergism leads to geomathematics. Again the greatest pressure for change arises from areas of application of geoscience, and, as the problems to which geoscience is applied increase in complexity, the analytical tools become more sophisticated, a development accelerated by growth in the use of computers in geological problem-solving. In the next decade the problems with greatest public impact appear to be the ones which will receive greatest emphasis and support. This will require that the geosciences comprehend exceedingly complex probabilistic systems, and these, in turn, demand the use of operations research, cybernetics and systems analysis. Such a development may well lead to a change in the paradigms underlying geoscience; they will certainly include more realistic models of "real-world" systems, and the tool of simulation with cybernetic models may well become the basis for a rejuvenation of experimentation in the geosciences. © 1970.
The emergence of complex behaviours in molecular magnetic materials.
Goss, Karin; Gatteschi, Dante; Bogani, Lapo
2014-09-14
Molecular magnetism is considered an area where magnetic phenomena that are usually difficult to demonstrate can emerge with particular clarity. Over the years, however, less well-understood systems have appeared in the literature of molecular magnetic materials, in some cases showing features that hint at the spontaneous emergence of global structures out of local interactions. This ingredient is typical of a wider class of problems, called complex behaviours, for which the theory of complexity is currently being developed. In this perspective we wish to focus attention on these systems and the underlying problems that they raise. We particularly highlight the emergence of the signatures of complexity in several molecular magnetic systems, which may provide unexplored opportunities for physical and chemical investigations.
NASA Astrophysics Data System (ADS)
Ghosh, Sukanya; Roy, Souvanic; Sanyal, Manas Kumar
2016-09-01
With the help of a case study, this article explores current practices in the implementation of a governmental affordable housing programme for the urban poor in a slum in India. This work shows that the issues associated with the programme have to be addressed with a suitable methodology, as the complexities involve not only quantitative data but qualitative data as well. Hard System Methodologies (HSM), which are conventionally applied to such issues, deal with real and known problems that can be solved directly. Since most of the issues of the affordable housing programme found in the case study are subjective and complex in nature, Soft System Methodology (SSM) has been tried for a better representation of subjective points of view. The article explores the drawing of a Rich Picture as an SSM approach to better understand and analyse the complex issues and constraints of the affordable housing programme, so that further exploration of the issues is possible.
Recent experience in simultaneous control-structure optimization
NASA Technical Reports Server (NTRS)
Salama, M.; Ramaker, R.; Milman, M.
1989-01-01
To show the feasibility of simultaneous optimization as a design procedure, low-order problems were used in conjunction with simple control formulations. The numerical results indicate that simultaneous optimization is not only feasible, but also advantageous. Such advantages come at the expense of introducing complexities beyond those encountered in structure optimization alone, or control optimization alone: the design parameter space is larger, the optimization may combine continuous and combinatoric variables, and the combined objective function may be nonconvex. Future extensions to large-order problems, more complex objective functions and constraints, and more sophisticated control formulations will require further research to ensure that the additional complexities do not outweigh the advantages of simultaneous optimization. Some areas requiring more efficient tools than are currently available include multiobjective criteria and nonconvex optimization. Efficient techniques also need to be developed to deal with optimization over combinatoric and continuous variables, and with truncation issues for structure and control parameters in both the model space and the design space.
Magnetic localization and orientation of the capsule endoscope based on a random complex algorithm.
He, Xiaoqi; Zheng, Zizhao; Hu, Chao
2015-01-01
The development of the capsule endoscope has made possible the examination of the whole gastrointestinal tract without much pain. However, some important problems remain to be solved, among which one is the localization of the capsule. Currently, magnetic positioning technology is a suitable method for capsule localization, and it depends on a reliable system and algorithm. In this paper, based on the magnetic dipole model and a magnetic sensor array, we propose a nonlinear optimization approach using a random complex algorithm, applied to the nonlinear objective function of the dipole model, to determine the three-dimensional position parameters and two-dimensional orientation parameters. The stability and noise rejection of the algorithm are compared with those of the Levenberg-Marquardt algorithm. The simulation and experiment results show that, across error levels in the initial guess of the magnet location, the random complex algorithm is more accurate, more stable, and has a higher "denoise" capacity, with a larger range of usable initial guess values.
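The abstract does not reproduce the algorithm itself, but its core idea, a population ("complex") of candidate points whose worst member is repeatedly reflected through the centroid of the rest, in the spirit of Box's complex method, can be sketched on a toy version of the localization problem. Everything below is invented for illustration: a z-oriented dipole whose in-plane field magnitude falls off as K/r^3, four hypothetical sensors, and arbitrary constants and bounds.

```python
import math
import random

# Hypothetical toy setup: a z-oriented dipole at an unknown in-plane position
# produces a field magnitude K / r^3 at distance r (illustrative model only).
K = 1.0
SENSORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
TRUE_POS = (3.0, 4.0)

def field(pos, sensor):
    return K / math.dist(pos, sensor) ** 3

MEASURED = [field(TRUE_POS, s) for s in SENSORS]

def residual(pos):
    """Sum of squared differences between modelled and 'measured' fields."""
    return sum((field(pos, s) - m) ** 2 for s, m in zip(SENSORS, MEASURED))

def complex_search(bounds, n_points=8, iters=600, alpha=1.3, seed=1):
    """Random-complex search: reflect the worst point through the centroid
    of the others, contracting toward the centroid if that does not help."""
    rng = random.Random(seed)
    dim = len(bounds)
    pts = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_points)]
    for _ in range(iters):
        pts.sort(key=residual)
        worst = pts[-1]
        centroid = [sum(p[d] for p in pts[:-1]) / (n_points - 1)
                    for d in range(dim)]
        trial = [min(max(c + alpha * (c - w), lo), hi)   # clip to search box
                 for c, w, (lo, hi) in zip(centroid, worst, bounds)]
        while residual(trial) >= residual(worst):        # contraction step
            trial = [(t + c) / 2 for t, c in zip(trial, centroid)]
            if max(abs(t - c) for t, c in zip(trial, centroid)) < 1e-12:
                break
        pts[-1] = trial
    return min(pts, key=residual)

est = complex_search([(1.0, 9.0), (1.0, 9.0)])
```

A real implementation would fit five parameters (3-D position plus 2-D orientation) against vector field readings from the sensor array and benchmark against Levenberg-Marquardt, as the paper does; this sketch only shows the search mechanics in two dimensions.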
Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew
2015-03-03
In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). 
Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.
Hauser, Tobias U; Rütsche, Bruno; Wurmitzer, Karoline; Brem, Silvia; Ruff, Christian C; Grabner, Roland H
A small but increasing number of studies suggest that non-invasive brain stimulation by means of transcranial direct current stimulation (tDCS) can modulate arithmetic processes that are essential for higher-order mathematical skills and that are impaired in dyscalculic individuals. However, little is known about the neural mechanisms underlying such stimulation effects, and whether they are specific to cognitive processes involved in different arithmetic tasks. We addressed these questions by applying tDCS during simultaneous functional magnetic resonance imaging (fMRI) while participants were solving two types of complex subtraction problems: repeated problems, relying on arithmetic fact learning and problem-solving by fact retrieval, and novel problems, requiring calculation procedures. Twenty participants receiving left parietal anodal plus right frontal cathodal stimulation were compared with 20 participants in a sham condition. We found a strong cognitive and neural dissociation between repeated and novel problems. Repeated problems were solved more accurately and elicited increased activity in the bilateral angular gyri and medial plus lateral prefrontal cortices. Solving novel problems, in contrast, was accompanied by stronger activation in the bilateral intraparietal sulci and the dorsomedial prefrontal cortex. Most importantly, tDCS decreased the activation of the right inferior frontal cortex while solving novel (compared to repeated) problems, suggesting that the cathodal stimulation rendered this region unable to respond to the task-specific cognitive demand. The present study revealed that tDCS during arithmetic problem-solving can modulate the neural activity in proximity to the electrodes specifically when the current demands lead to an engagement of this area. Copyright © 2016 Elsevier Inc. All rights reserved.
The design of nonlinear observers for wind turbine dynamic state and parameter estimation
NASA Astrophysics Data System (ADS)
Ritter, B.; Schild, A.; Feldt, M.; Konigorski, U.
2016-09-01
This contribution addresses the dynamic state and parameter estimation problem which arises with more advanced wind turbine controllers. These control devices need precise information about the system's current state to outperform conventional industrial controllers effectively. First, the necessity of a profound scientific treatment on nonlinear observers for wind turbine application is highlighted. Secondly, the full estimation problem is introduced and the variety of nonlinear filters is discussed. Finally, a tailored observer architecture is proposed and estimation results of an illustrative application example from a complex simulation set-up are presented.
Visual control of prey-capture flight in dragonflies.
Olberg, Robert M
2012-04-01
Interacting with a moving object poses a computational problem for an animal's nervous system. This problem has been elegantly solved by the dragonfly, a formidable visual predator on flying insects. The dragonfly computes an interception flight trajectory and steers to maintain it during its prey-pursuit flight. This review summarizes current knowledge about pursuit behavior and neurons thought to control interception in the dragonfly. When understood, this system has the potential for explaining how a small group of neurons can control complex interactions with moving objects. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Tian, Jianhui; Porter, Adam; Zelkowitz, Marvin V.
1992-01-01
Identification of high-cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their share of problems. A decision tree model was used to identify such modules. In the current paper, a previously developed axiomatic model of program complexity is merged with the decision tree process for an improvement in the ability to identify such modules. This improvement was tested using data from the NASA Software Engineering Laboratory.
Case Study on Optimal Routing in Logistics Network by Priority-based Genetic Algorithm
NASA Astrophysics Data System (ADS)
Wang, Xiaoguang; Lin, Lin; Gen, Mitsuo; Shiota, Mitsushige
Recently, research on logistics has attracted more and more attention. One of the important issues in logistics systems is to find optimal delivery routes that deliver products at the least cost. Numerous models have been developed for that reason. However, due to the diversity and complexity of practical problems, the existing models are usually not able to find solutions efficiently and conveniently. In this paper, we treat a real-world logistics case with a company named ABC Co. ltd., in Kitakyusyu, Japan. Firstly, based on the nature of this conveyance routing problem, as an extension of the transportation problem (TP) and the fixed charge transportation problem (fcTP), we formulate the problem as a minimum cost flow (MCF) model. Due to the complexity of the fcTP, we propose a priority-based genetic algorithm (pGA) approach to find the most acceptable solution to this problem. In this pGA approach, a two-stage path decoding method is adopted to develop delivery paths from a chromosome. We apply the pGA approach to this problem, compare our results with the current logistics network situation, and calculate the improvement in logistics cost to help management make decisions. Finally, in order to check the effectiveness of the proposed method, the results acquired are compared with those obtained from two solvers, LINDO and CPLEX.
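The paper's two-stage path decoding for the MCF model is not given in the abstract, but the general flavor of priority-based encoding (each node gets a priority; decoding repeatedly lets the highest-priority open node pick its cheapest open partner) can be sketched on a plain transportation toy. The instance data, node numbering, and priority vector below are all invented for illustration.

```python
# Toy instance (invented data): unit shipping costs from 3 plants to 3 customers.
COST = [[4, 8, 6],
        [5, 3, 9],
        [7, 6, 2]]
SUPPLY = [30, 25, 45]   # balanced: total supply == total demand == 100
DEMAND = [20, 50, 30]

def decode(priority):
    """Priority-based decoding in the spirit of Gen & Cheng: one priority
    per node (plants 0-2, customers 3-5); the highest-priority node still
    open picks its cheapest open partner, and exhausted nodes drop out."""
    supply, demand, prio = SUPPLY[:], DEMAND[:], list(priority)
    plan = [[0] * 3 for _ in range(3)]
    while sum(demand) > 0:
        k = max(range(6), key=lambda n: prio[n])
        if k < 3:   # a plant ships to its cheapest customer with open demand
            i = k
            j = min((j for j in range(3) if demand[j] > 0),
                    key=lambda j: COST[i][j])
        else:       # a customer pulls from its cheapest plant with open supply
            j = k - 3
            i = min((i for i in range(3) if supply[i] > 0),
                    key=lambda i: COST[i][j])
        q = min(supply[i], demand[j])
        plan[i][j] += q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            prio[i] = -1        # plant exhausted
        if demand[j] == 0:
            prio[j + 3] = -1    # customer satisfied
    return plan

def cost(plan):
    return sum(COST[i][j] * plan[i][j] for i in range(3) for j in range(3))

plan = decode([1, 2, 3, 4, 5, 6])   # customers outrank plants here
```

A GA would then evolve the priority vector (e.g. permutations of 1..6) under crossover and mutation, using `cost(decode(v))` as the fitness; the decoder above is only the chromosome-to-plan mapping, not the paper's full pGA.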
Not Solving Problems, Managing Messes: Competent Systems in Early Childhood Education and Care
ERIC Educational Resources Information Center
Urban, Mathias
2014-01-01
EU 2020, the current strategic framework of the European Union (European Commission, 2010) sets ambitious policy goals based on a rather bleak analysis of a complex crisis scenario the Union finds itself in. A key role is given to early childhood education and care to achieve these goals, and "'highest benefits" are predicted for…
NASA Astrophysics Data System (ADS)
Ogborn, Jon
2004-01-01
'Soft matter' is a lively current field of research, looking at fundamental theoretical questions about the structure and behaviour of complex forms of matter, and at very practical problems of, for example, improving the performance of glues or the texture of ice cream. Foodstuffs provide an excellent way into this modern topic, which lies on the boundary between physics and chemistry.
Time Well Spent: Making Choices and Setting Priorities in Adult Numeracy Instruction
ERIC Educational Resources Information Center
Braaten, Melissa
2017-01-01
In her Forum piece, "What's an Adult Numeracy Teacher to Teach? Negotiating the Complexity of Adult Numeracy Instruction," Lynda Ginsburg set the stage of the current problem (poor numeracy levels in American adults) and the bevy of standards, legislation, and new exams that have recently been developed to address it. Ginsburg also…
Etiology, Treatment, and Prevention of Obesity in Childhood and Adolescence: A Decade in Review
ERIC Educational Resources Information Center
Spruijt-Metz, Donna
2011-01-01
Childhood obesity has become an epidemic on a worldwide scale. This article gives an overview of the progress made in childhood and adolescent obesity research in the last decade, with a particular emphasis on the transdisciplinary and complex nature of the problem. The following topics are addressed: (1) current definitions of childhood and…
Currents in Environmental Education: Mapping a Complex and Evolving Pedagogical Field
ERIC Educational Resources Information Center
Sauve, Lucie
2005-01-01
The purpose of this article is to bring to light and celebrate the richness of the environmental education field, thereby paying homage to the pedagogical creativity of its architects over the course of the last thirty years, as well as to their contribution in reflecting on the meaning, problems and possibilities of our relationship to the…
Is Relational Reasoning Dependent on Language? A Voxel-Based Lesion Symptom Mapping Study
ERIC Educational Resources Information Center
Baldo, Juliana V.; Bunge, Silvia A.; Wilson, Stephen M.; Dronkers, Nina F.
2010-01-01
Previous studies with brain-injured patients have suggested that language abilities are necessary for complex problem-solving, even when tasks are non-verbal. In the current study, we tested this notion by analyzing behavioral and neuroimaging data from a large group of left-hemisphere stroke patients (n = 107) suffering from a range of language…
ERIC Educational Resources Information Center
Wang, Shinmin; Gathercole, Susan E.
2013-01-01
The current study investigated the cause of the reported problems in working memory in children with reading difficulties. Verbal and visuospatial simple and complex span tasks, and digit span and reaction times tasks performed singly and in combination, were administered to 46 children with single word reading difficulties and 45 typically…
Embracing Social Sustainability in Design Education: A Reflection on a Case Study in Haiti
ERIC Educational Resources Information Center
Kjøllesdal, Anders; Asheim, Jonas; Boks, Casper
2014-01-01
Sustainable design issues are complex and multi-faceted and need integration in the education of young designers. Current research recommends a holistic view based on problem-solving and inter-disciplinary work, yet few design educators have brought these ideas to their full consequence. Sustainability education for designers is still often rooted…
ERIC Educational Resources Information Center
Brathwaite, Frank
Despite the current need for strong leadership skills to facilitate task achievement, individual development, and social action in an increasingly complex society, women are failing to make significant headway in educational administration. Lack of leadership opportunities for women limits both individual and organizational potential. The problem…
Social science findings in the United States
Sarah McCaffrey; Eric Toman; Melanie Stidham; Bruce Shindler
2015-01-01
The rising number of acres burned annually and growing number of people living in or adjacent to fire-prone areas in the United States make wildfire management an increasingly complex and challenging problem. Given the prominence of social issues in shaping the current challenges and determining paths forward, it will be important to have an accurate understanding of...
A novel heuristic algorithm for capacitated vehicle routing problem
NASA Astrophysics Data System (ADS)
Kır, Sena; Yazgan, Harun Reşit; Tüncel, Emre
2017-09-01
The vehicle routing problem with capacity constraints was considered in this paper. It is quite difficult to achieve an optimal solution with traditional optimization methods because of the high computational complexity of large-scale problems. Consequently, new heuristic and metaheuristic approaches have been developed to solve this problem. In this paper, we construct a new heuristic algorithm based on tabu search and adaptive large neighborhood search (ALNS), with several specifically designed operators and features, to solve the capacitated vehicle routing problem (CVRP). The effectiveness of the proposed algorithm is illustrated on benchmark problems. The algorithm provides better performance on large-scale instances and gains an advantage in terms of CPU time. In addition, we solved a real-life CVRP using the proposed algorithm and found encouraging results in comparison with the company's current situation.
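The abstract names tabu search plus ALNS with custom operators but gives no pseudocode, so the following is only a minimal, hypothetical sketch of the large-neighborhood idea behind such heuristics: repeatedly destroy part of the solution (remove a few customers) and greedily repair it, keeping improvements. The depot, customers, demands, and capacity are invented toy data; tabu lists and adaptive operator weights are deliberately omitted.

```python
import math
import random

# Toy CVRP (invented data): depot at origin, customers as (x, y, demand),
# identical vehicles of capacity 10.
DEPOT = (0.0, 0.0)
CUSTOMERS = {1: (2, 3, 4), 2: (5, 1, 3), 3: (6, 6, 5),
             4: (1, 7, 4), 5: (8, 3, 2), 6: (3, 8, 6)}
CAPACITY = 10

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def pos(c):
    return CUSTOMERS[c][:2]

def route_len(route):
    stops = [DEPOT] + [pos(c) for c in route] + [DEPOT]
    return sum(dist(stops[i], stops[i + 1]) for i in range(len(stops) - 1))

def total(routes):
    return sum(route_len(r) for r in routes)

def load(route):
    return sum(CUSTOMERS[c][2] for c in route)

def greedy_insert(routes, c):
    """Repair operator: put customer c at the cheapest feasible position."""
    best = None
    for r in routes:
        if load(r) + CUSTOMERS[c][2] > CAPACITY:
            continue
        for k in range(len(r) + 1):
            delta = route_len(r[:k] + [c] + r[k:]) - route_len(r)
            if best is None or delta < best[0]:
                best = (delta, r, k)
    if best is None:                 # no feasible slot: open a new route
        routes.append([c])
    else:
        _, r, k = best
        r.insert(k, c)

def lns(iters=300, remove=2, seed=0):
    rng = random.Random(seed)
    routes = []
    for c in CUSTOMERS:              # trivial greedy initial solution
        greedy_insert(routes, c)
    best = [r[:] for r in routes]
    for _ in range(iters):
        cand = [r[:] for r in routes]
        removed = rng.sample([c for r in cand for c in r], remove)
        for r in cand:               # destroy: drop the removed customers
            r[:] = [c for c in r if c not in removed]
        cand = [r for r in cand if r]
        for c in removed:            # repair: cheapest reinsertion
            greedy_insert(cand, c)
        if total(cand) <= total(routes):
            routes = cand
            if total(routes) < total(best):
                best = [r[:] for r in routes]
    return best
```

In the paper's setting the destroy/repair step would be one of several ALNS operators chosen adaptively, with a tabu mechanism guarding against cycling; this sketch keeps only the skeleton that those features hang on.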
Franz, Annabel O; McKinney, Cliff
2018-03-26
Previous literature has not examined the processes underlying the relations among parent-child relationship quality, parental psychopathology, and child psychopathology in the context of gender. Further, research examining these variables in emerging adulthood is lacking. The current study examined whether parent-child relationship quality would mediate the relation between parental and child psychopathology, and whether gender moderated these associations. Participants were emerging adults (N = 665) who reported on perceptions of their parents' and their own psychological problems as well as their parent-child relationship quality. Results indicated that the relation between parental internalizing problems and parent-child relationship quality was positive for males, and that mother-child relationship quality was related positively to psychological problems in males. This suggests that sons may grow closer to their parents (particularly their mother) who are exhibiting internalizing problems; in turn, this enmeshed relationship may facilitate transmission of psychopathology. Mediational paths were conditional upon gender, suggesting moderated mediation. Overall, the current study emphasizes that the complexities of parenting must be understood in the context of gender. Further, the mother-son dyad may particularly warrant further attention.
Food technology problems related to space feeding.
Hollender, H A; Klicka, M V; Smith, M C
1970-01-01
The development of foods suitable for extraterrestrial consumption posed unique problems. Limitations on the weight, volume and stability of space food, together with the lack of refrigeration, favored the use of dehydrated foods on Gemini and Apollo menus. Environmental constraints, cabin pressures of 1/3 atmosphere with exposure of the food assembly to the vacuum of space in conjunction with extravehicular activities, and zero gravity required special packaging and adaptation of foods considered suitable for space flight use. Requirements for acceptable, familiar, crumb-free, low-residue, non-gas-producing, stable foods added to the complexity of the developmental effort. Four basic approaches have been utilized in the feeding systems developed for Projects Mercury, Gemini and Apollo: semisolid foods in metal tubes; dehydrated bite-size foods to be eaten dry; dehydrated foods to be reconstituted before eating; and flexibly packaged thermostabilized wet meat products. The development of each type posed many interesting technologic problems. Data from current Apollo flights have pointed to certain deficiencies which still remain to be corrected. Work is progressing to eliminate current problems and to provide feeding systems suitable for both short-term and long-term space flights.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diduck, A.
1999-10-01
Changing from current patterns of resource use to a sustainable and equitable economy is a complex and intractable problem. This paper suggests that critical education may form part of the solution. Critical environmental assessment (EA) education, the model explored in this paper, offers a tool for resource and environmental managers to use in managing public involvement processes. This model challenges current patterns of resource use and addresses criticisms of public involvement processes. Critical EA education, involving both cognitive development and personal empowerment, focuses on critical intelligence, problem solving and social action. The concept is offered as a means to facilitate and improve public involvement and, thereby, empower local communities to take greater control of resource use decisions affecting their lives. Positive implications of critical EA education for change, complexity, uncertainty and conflict, which are four enduring themes in resource and environmental management, are discussed in the paper. The implications include: cognitive development and personal empowerment at the level of local resource communities; simplification of the often complex discourse encountered in resource management; reduction in feelings of powerlessness often experienced by members of the public in environmental assessment scenarios; a reduction of ignorance and indeterminacy regarding resource management issues; conflict resolution at the cognitive level; and clarification of the opposing values, interests or actions at the heart of a conflict.
Direct EIT reconstructions of complex admittivities on a chest-shaped domain in 2-D.
Hamilton, Sarah J; Mueller, Jennifer L
2013-04-01
Electrical impedance tomography (EIT) is a medical imaging technique in which current is applied on electrodes on the surface of the body, the resulting voltage is measured, and an inverse problem is solved to recover the conductivity and/or permittivity in the interior. Images are then formed from the reconstructed conductivity and permittivity distributions. In the 2-D geometry, EIT is clinically useful for chest imaging. In this work, an implementation of a D-bar method for complex admittivities on a general 2-D domain is presented. In particular, reconstructions are computed on a chest-shaped domain for several realistic phantoms including a simulated pneumothorax, hyperinflation, and pleural effusion. The method demonstrates robustness in the presence of noise. Reconstructions from trigonometric and pairwise current injection patterns are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clore, G. Marius; Venditti, Vincenzo
2013-10-01
The bacterial phosphotransferase system (PTS) couples phosphoryl transfer, via a series of bimolecular protein–protein interactions, to sugar transport across the membrane. The multitude of complexes in the PTS provides a paradigm for studying protein interactions, and for understanding how the same binding surface can specifically recognize a diverse array of targets. Fifteen years of work aimed at solving the solution structures of all soluble protein–protein complexes of the PTS has served as a test bed for developing NMR and integrated hybrid approaches to study larger complexes in solution and to probe transient, spectroscopically invisible states, including encounter complexes. We review these approaches, highlighting the problems that can be tackled with these methods, and summarize the current findings on protein interactions.
Steinhäuser, C; Kressin, K; Kuprijanova, E; Weber, M; Seifert, G
1994-10-01
In the present study, we were interested in a quantitative analysis of voltage-activated channels in a subpopulation of hippocampal glial cells, termed "complex" cells. The patch-clamp technique in the whole-cell mode was applied to identified cells in situ and to glial cells acutely isolated from tissue slices. The outward current was composed of two components: a sustained and a transient current. The transient K+ channel had electrophysiological and pharmacological properties resembling those of the channel through which the A-currents pass. In addition, this glial A-type current possessed a significant Ca2+ dependence. The current parameters determined in situ or in isolated cells corresponded well. Due to space clamp problems in situ, properties of voltage-dependent Na+ currents were only analysed in suspended glial cells. The tetrodotoxin (TTX) sensitivity and the stationary and kinetic characteristics of this current were similar to corresponding properties of hippocampal neurons. These quantitative data demonstrate that at an early postnatal stage of central nervous system maturation, glial cells in situ express a complex pattern of voltage-gated ion channels. The results are compared to findings in other preparations and the possible consequences of transmitter-mediated channel modulation in glial cells are discussed.
Bipolar disorder-methodological problems and future perspectives
Angst, Jules
2008-01-01
Since its “rebirth” in 1966, bipolar disorder (BPD) has rapidly come to occupy a central position in the research and treatment of mood disorders. Compared with major depressive disorder (MDD), BPD is a more serious condition, characterized by much more frequent recurrence, more complex comorbidity, and higher mortality. One major problem is the lack of valid definitions in adult and in child psychiatry; the current definitions are unsatisfactory, and heavily favor an overdiagnosis of MDD. Biological research is partially based on those definitions, which have a short half-life. An additional, dimensional approach, quantifying hypomania, depression, and anxiety by self-assessment and symptom checklists, is recommended. A further, related problem is the early recognition of the onset of BPD, especially in adolescence, and the identification of correlates in childhood. Early and timely diagnosis of BPD is necessary to enable prompt intervention and secondary prevention of the disorder. The paper describes the current status and future directions of developing clinical concepts of bipolarity. PMID:18689284
Designing the future of healthcare.
Fidsa, Gianfranco Zaccai
2009-01-01
This paper describes the application of a holistic design process to a variety of problems plaguing current healthcare systems. A design process for addressing complex, multifaceted problems is contrasted with the piecemeal application of technological solutions to specific medical or administrative problems. The goal of this design process is the ideal customer experience, specifically the ideal experience for patients, healthcare providers, and caregivers within a healthcare system. Holistic design is shown to be less expensive and wasteful in the long run because it avoids solving one problem within a complex system at the cost of creating other problems within that system. The article applies this approach to the maintenance of good health throughout life; to the creation of an ideal experience when a person does need medical care; to the maintenance of personal independence as one ages; and to the enjoyment of a comfortable and dignified death. Virginia Mason Medical Center is discussed as an example of a healthcare institution attempting to create ideal patient and caregiver experiences, in this case by applying the principles of the Toyota Production System ("lean manufacturing") to healthcare. The article concludes that healthcare is inherently dedicated to an ideal, that science and technology have brought it closer to that ideal, and that design can bring it closer still.
Discharge summary for medically complex infants transitioning to primary care.
Peacock, Jennifer J
2014-01-01
Improvements in the care of the premature infant and advancements in technology are increasing life expectancy of infants with medical conditions once considered lethal; these infants are at risk of becoming a medically complex infant. Complex infants have a significant existing problem list, are on several medications, and receive medical care by several specialists. Deficits in communication and information transfer at the time of discharge remain problematic for this population. A questionnaire was developed for primary care providers (PCPs) to explore the effectiveness of the current discharge summary because it is related to effective communication when assuming the care of a new patient with medical complexity. PCPs assuming the care of these infants agree that an evidence-based tool, in the form of a specialized summary for this population, would be of value.
A Multifaceted Mathematical Approach for Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexander, F.; Anitescu, M.; Bell, J.
2012-03-07
Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.
Effects of historical and predictive information on ability of transport pilot to predict an alert
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.
1994-01-01
In the aviation community, the early detection of the development of a possible subsystem problem during a flight is potentially useful for increasing the safety of the flight. Commercial airlines are currently using twin-engine aircraft for extended transport operations over water, and the early detection of a possible problem might increase the flight crew's options for safely landing the aircraft. One method for decreasing the severity of a developing problem is to predict the behavior of the problem so that appropriate corrective actions can be taken. To investigate the pilots' ability to predict long-term events, a computer workstation experiment was conducted in which 18 airline pilots predicted the alert time (the time to an alert) using 3 different dial displays and 3 different parameter behavior complexity levels. The three dial displays were as follows: standard (resembling current aircraft round dial presentations); history (indicating the current value plus the value of the parameter 5 sec in the past); and predictive (indicating the current value plus the value of the parameter 5 sec into the future). The time profiles describing the behavior of the parameter consisted of constant rate-of-change profiles, decelerating profiles, and accelerating-then-decelerating profiles. Although the pilots indicated that they preferred the near term predictive dial, the objective data did not support its use. The objective data did show that the time profiles had the most significant effect on performance in estimating the time to an alert.
ERIC Educational Resources Information Center
Schwartz, Judah L.
Educational assessment tools are used for accountability, for selection and licensure, and to measure the effects of instruction for student diagnosis and treatment. Psychometric instruments currently in use are flawed in two ways: they attempt to rank people on fundamentally multidimensional traits, and the problem of the validity of these…
ERIC Educational Resources Information Center
Canadian Teachers' Federation, Ottawa (Ontario).
This report contains the proceedings of a 2-day interprovincial conference attended by 38 teachers, registrars, university professors, and trustees to discuss means of reducing the complexity of teacher certification regulations and of enabling teachers to move more freely among the various provinces. Included are four presentations on the…
Multiprofessional education to stimulate collaboration: a circular argument and its consequences.
Roodbol, Petrie F
2010-01-01
The current developments in healthcare are unprecedented. The organization of health care is complex. Collaboration is essential to meet all the healthcare needs of patients and to achieve coordinated and unambiguous information. Multiprofessional education (MPE) or multidisciplinary training (MDT) seems a logical step to stimulate teamwork. However, collaboration and MPE are wrestling with the same problems: social identity and acceptance.
Website on Protein Interaction and Protein Structure Related Work
NASA Technical Reports Server (NTRS)
Samanta, Manoj; Liang, Shoudan; Biegel, Bryan (Technical Monitor)
2003-01-01
In today's world, three seemingly diverse fields - computer information technology, nanotechnology and biotechnology - are joining forces to enlarge our scientific knowledge and solve complex technological problems. Our group is dedicated to conducting theoretical research exploring the challenges in this area. The major areas of research include: 1) Yeast Protein Interactions; 2) Protein Structures; and 3) Current Transport through Small Molecules.
Oversight of human participants research: identifying problems to evaluate reform proposals.
Emanuel, Ezekiel J; Wood, Anne; Fleischman, Alan; Bowen, Angela; Getz, Kenneth A; Grady, Christine; Levine, Carol; Hammerschmidt, Dale E; Faden, Ruth; Eckenwiler, Lisa; Muse, Carianne Tucker; Sugarman, Jeremy
2004-08-17
The oversight of research involving human participants is widely believed to be inadequate. The U.S. Congress, national commissions, the Department of Health and Human Services, the Institute of Medicine, numerous professional societies, and others are proposing remedies based on the assumption that the main problems are researchers' conflict of interest, lack of institutional review board (IRB) resources, and the volume and complexity of clinical research. Developing appropriate reform proposals requires carefully delineating the problems of the current system to know what reforms are needed. To stimulate a more informed and meaningful debate, we delineate 15 current problems into 3 broad categories. First, structural problems encompass 8 specific problems related to the way the research oversight system is organized. Second, procedural problems constitute 5 specific problems related to the operations of IRB review. Finally, performance assessment problems include 2 problems related to absence of systematic assessment of the outcomes of the oversight system. We critically assess proposed reforms, such as accreditation and central IRBs, according to how well they address these 15 problems. None of the reforms addresses all 15 problems. Indeed, most focus on the procedural problems, failing to address either the structure or the performance assessment problems. Finally, on the basis of the delineation of problems, we outline components of a more effective reform proposal, including bringing all research under federal oversight, a permanent advisory committee to address recurrent ethical issues in clinical research, mandatory single-time review for multicenter research protocols, additional financial support for IRB functions, and a standardized system for collecting and disseminating data on both adverse events and the performance assessment of IRBs.
Complex Problem Solving: What It Is and What It Is Not
Dörner, Dietrich; Funke, Joachim
2017-01-01
Computer-simulated scenarios have been part of psychological research on problem solving for more than 40 years. The shift in emphasis from simple toy problems to complex, more real-life oriented problems has been accompanied by discussions about the best ways to assess the process of solving complex problems. Psychometric issues such as reliable assessments and addressing correlations with other instruments have been in the foreground of these discussions and have left the content validity of complex problem solving in the background. In this paper, we return the focus to content issues and address the important features that define complex problems. PMID:28744242
Clarification process: Resolution of decision-problem conditions
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A model of a general process which occurs in both decision-making and problem-solving tasks is presented. It is called the clarification model and is highly dependent on information flow. The model addresses the possible constraints of individual differences and experience in achieving success in resolving decision-problem conditions. As indicated, the application of the clarification process model is only necessary for certain classes of the basic decision-problem condition. With less complex decision-problem conditions, certain phases of the model may be omitted. The model may be applied across a wide range of decision-problem conditions. The model consists of two major components: (1) the five-phase prescriptive sequence (based on previous approaches to both concepts) and (2) the information manipulation function (which draws upon current ideas in the areas of information processing, computer programming, memory, and thinking). The two components are linked together to provide a structure that assists in understanding the process of resolving problems and making decisions.
Crashworthiness simulations with DYNA3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schauer, D.A.; Hoover, C.G.; Kay, G.J.
1996-04-01
Current progress in parallel algorithm research and applications in vehicle crash simulation is described for the explicit, finite element algorithms in DYNA3D. Problem partitioning methods and parallel algorithms for contact at material interfaces are the two challenging algorithm research problems that are addressed. Two prototype parallel contact algorithms have been developed for treating the cases of local and arbitrary contact. Demonstration problems for local contact are crashworthiness simulations with 222 locally defined contact surfaces and a vehicle/barrier collision modeled with arbitrary contact. A simulation of crash tests conducted for a vehicle impacting a U-channel small sign post embedded in soil has been run on both the serial and parallel versions of DYNA3D. A significant reduction in computational time has been observed when running these problems on the parallel version. However, to achieve maximum efficiency, complex problems must be appropriately partitioned, especially when contact dominates the computation.
Modelling DC responses of 3D complex fracture networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beskardes, Gungor Didem; Weiss, Chester Joseph
The determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretization of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model the electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.
Reflections and meditations upon complex chromosomal exchanges.
Savage, John R K
2002-12-01
The application of FISH chromosome painting techniques, especially the recent mFISH (and its equivalents) where all 23 human chromosome pairs can be distinguished, has demonstrated that many chromosome-type structural exchanges are much more complicated (involving more "break-rejoins" and arms) than has hitherto been assumed. It is clear that we have been greatly under-estimating the damage produced in chromatin by such agents as ionising radiation. This article gives a brief historical summary of observations leading up to this conclusion, and after outlining some of the problems surrounding the formation of complex chromosome exchanges, speculates about possible solutions currently being proposed.
Dry mouth: Xerostomia and salivary gland hypofunction.
Frydrych, Agnieszka M
2016-07-01
Mouth dryness may present as salivary gland hypofunction (SGH), xerostomia or both. It is considered one of the most underappreciated, underdiagnosed and undermanaged oral health conditions. Despite its common presentation and adverse impact on life quality, it is also generally poorly understood. Increased awareness of the condition is important in addressing these problems. This article discusses SGH and xerostomia, and the associated intra-oral and extra-oral implications. It also summarises currently available management approaches and the evidence behind them. SGH and xerostomia are complex problems. None of the currently available management approaches are entirely satisfactory. Addressing the causative or contributing factors is therefore paramount. While oral health complaints are generally left up to the dental professional to manage, the nature of mouth dryness necessitates increased dialogue between the dental and medical professions to ensure optimal patient care.
The Third NASA Goddard Conference on Mass Storage Systems and Technologies
NASA Technical Reports Server (NTRS)
Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)
1993-01-01
This report contains copies of nearly all of the technical papers and viewgraphs presented at the Goddard Conference on Mass Storage Systems and Technologies held in October 1993. The conference served as an informational exchange forum for topics primarily relating to the ingestion and management of massive amounts of data and the attendant problems involved. Discussion topics included the necessary use of computers in the solution of today's infinitely complex problems; the need for greatly increased storage densities in both optical and magnetic recording media; currently popular storage media and magnetic media storage risk factors; and data archiving standards, including a talk on the current status of the IEEE Storage Systems Reference Model (RM). Additional topics addressed system performance, data storage system concepts, communications technologies, data distribution systems, data compression, and error detection and correction.
Optical Estimation of Depth and Current in an Ebb Tidal Delta Environment
NASA Astrophysics Data System (ADS)
Holman, R. A.; Stanley, J.
2012-12-01
A key limitation to our ability to make nearshore environmental predictions is the difficulty of obtaining up-to-date bathymetry measurements at a reasonable cost and frequency. Due to the high cost and complex logistics of in-situ methods, research into remote sensing approaches has been steady and has finally yielded fairly robust methods like the cBathy algorithm for optical Argus data that show good performance on simple barred beach profiles and near immunity to noise and signal problems. In May, 2012, data were collected in a more complex ebb tidal delta environment during the RIVET field experiment at New River Inlet, NC. The presence of strong reversing tidal currents led to significant errors in cBathy depths that were phase-locked to the tide. In this paper we will test methods for the robust estimation of both depths and vector currents in a tidal delta domain. In contrast to previous Fourier methods, wavenumber estimation in cBathy can be done on small enough scales to resolve interesting nearshore features.
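The depth-inversion step that cBathy-style estimators ultimately rely on can be illustrated with the linear dispersion relation w^2 = g*k*tanh(k*h): given a wave frequency and wavenumber estimated from imagery, depth follows in closed form. This is a minimal sketch of that single relation, not the cBathy algorithm itself, and the numbers in the round-trip check are invented.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def depth_from_dispersion(f_hz, k):
    """Invert w^2 = G*k*tanh(k*h) for depth h (metres).

    Returns None when w^2 >= G*k: the sample is deep-water limited,
    so this frequency/wavenumber pair carries no depth information.
    """
    omega_sq = (2.0 * math.pi * f_hz) ** 2
    ratio = omega_sq / (G * k)
    if ratio >= 1.0:
        return None
    return math.atanh(ratio) / k

# Round-trip check: synthesize (f, k) for a known 5 m depth, then invert.
h_true = 5.0
k = 0.1  # rad/m
f = math.sqrt(G * k * math.tanh(k * h_true)) / (2.0 * math.pi)
h_est = depth_from_dispersion(f, k)
```

In practice the wavenumber must itself be estimated from pixel time series and many (f, k) pairs are combined in a weighted fit; a strong ambient current Doppler-shifts the observed frequency, which is exactly the tide-locked error source described above and motivates estimating depth and current jointly.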
Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics
NASA Astrophysics Data System (ADS)
Chen, Yu-Zhong; Lai, Ying-Cheng
2018-03-01
Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulates its dynamical behavior with high precision.
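The compressive-sensing step in this kind of reconstruction amounts to solving a sparsity-regularized regression for each node's incoming links. As a hedged illustration (a generic lasso solved by iterative soft-thresholding, not the authors' SDBM machinery; all problem sizes are assumptions), one can recover a sparse coupling vector from noisy linear measurements:

```python
import numpy as np

def ista(X, y, lam=0.05, n_iter=500):
    """Minimise 0.5*||X @ w - y||^2 + lam*||w||_1 by soft-thresholding."""
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        z = w - grad / L                                        # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # shrink
    return w

# Synthetic check: 3 true links out of 50 candidates, recovered from
# 200 noisy measurements (sizes and noise level are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
w_true = np.zeros(50)
w_true[[3, 17, 42]] = [1.0, -0.8, 0.5]
y = X @ w_true + 0.01 * rng.normal(size=200)
w_hat = ista(X, y)
```

For binary dynamics the "measurements" would come from transition statistics of the observed node states rather than a linear model, which is where the Boltzmann-machine formulation enters.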
Superconducting racetrack booster for the ion complex of MEIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Filatov, Yu; Kondratenko, A. M.; Kondratenko, M. A.
2016-02-01
The current design of the Medium-energy Electron-Ion Collider (MEIC) project at Jefferson Lab features a single 8 GeV/c figure-8 booster based on super-ferric magnets. Reducing the circumference of the booster by switching to a racetrack design may improve its performance by limiting the space charge effect and lower its cost. We consider problems of preserving proton and deuteron polarizations in a superconducting racetrack booster. We show that using magnets based on hollow high-current NbTi composite superconducting cable, similar to those designed at JINR for the Nuclotron, guarantees preservation of the ion polarization in a racetrack booster up to 8 GeV/c. The booster operation cycle would be a few seconds, which would improve the operating efficiency of the MEIC ion complex.
Formation of current singularity in a topologically constrained plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Yao; Huang, Yi-Min; Qin, Hong
2016-02-01
Recently a variational integrator for ideal magnetohydrodynamics in Lagrangian labeling has been developed. Its built-in frozen-in equation makes it optimal for studying current sheet formation. We use this scheme to study the Hahm-Kulsrud-Taylor problem, which considers the response of a 2D plasma magnetized by a sheared field under sinusoidal boundary forcing. We obtain an equilibrium solution that preserves the magnetic topology of the initial field exactly, with a fluid mapping that is non-differentiable. Unlike previous studies that examine the current density output, we identify a singular current sheet from the fluid mapping. These results are benchmarked with a constrained Grad-Shafranov solver. The same signature of current singularity can be found in other cases with more complex magnetic topologies.
The Current State of Drug Discovery and a Potential Role for NMR Metabolomics
2015-01-01
The pharmaceutical industry has significantly contributed to improving human health. Drugs have been attributed to both increasing life expectancy and decreasing health care costs. Unfortunately, there has been a recent decline in the creativity and productivity of the pharmaceutical industry. This is a complex issue with many contributing factors resulting from the numerous mergers, increase in out-sourcing, and the heavy dependency on high-throughput screening (HTS). While a simple solution to such a complex problem is unrealistic and highly unlikely, the inclusion of metabolomics as a routine component of the drug discovery process may provide some solutions to these problems. Specifically, as the binding affinity of a chemical lead is evolved during the iterative structure-based drug design process, metabolomics can provide feedback on the selectivity and the in vivo mechanism of action. Similarly, metabolomics can be used to evaluate and validate HTS leads. In effect, metabolomics can be used to eliminate compounds with potential efficacy and side effect problems while prioritizing well-behaved leads with druglike characteristics. PMID:24588729
A Social-Medical Approach to Violence in Colombia
Franco, Saul
2003-01-01
Violence is the main public health problem in Colombia. Many theoretical and methodological approaches to solving this problem have been attempted from different disciplines. My past work has focused on homicide violence from the perspective of social medicine. In this article I present the main conceptual and methodological aspects and the chief findings of my research over the past 15 years. Findings include a quantitative description of the current situation and the introduction of the category of explanatory contexts as a contribution to the study of Colombian violence. The complexity and severity of this problem demand greater theoretical discussion, more plans for action and a faster transition between the two. Social medicine may make a growing contribution to this field. PMID:14652328
NASA Astrophysics Data System (ADS)
Chen, Xiaoguang; Liang, Lin; Liu, Fei; Xu, Guanghua; Luo, Ailing; Zhang, Sicong
2012-05-01
Nowadays, Motor Current Signature Analysis (MCSA) is widely used in the fault diagnosis and condition monitoring of machine tools. However, because the current signal has a low SNR (signal-to-noise ratio) and the feature frequencies are often dense and overlapping, it is difficult to identify the feature frequencies of machine tools in the complex current spectrum using traditional signal processing methods such as the FFT. In studying MCSA, we found that entropy, which characterizes the probability distribution of a random variable, is important for frequency identification and therefore plays an important role in signal processing. To solve the problem of feature frequencies that are difficult to identify, this paper presents an entropy optimization technique based on the motor current signal for extracting the typical feature frequencies of machine tools while effectively suppressing disturbances. Simulated current signals were generated in MATLAB, and an experimental current signal was obtained from a complex gearbox at an iron works in Luxembourg. In diagnosis, MCSA is combined with entropy optimization. Both simulated and experimental results show that this technique is efficient, accurate and reliable enough to extract the feature frequencies of the current signal, providing a new strategy for the fault diagnosis and condition monitoring of machine tools.
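The abstract does not give the exact entropy-optimization procedure, but the underlying idea can be sketched: a frequency band dominated by a genuine feature tone has much lower spectral (Shannon) entropy than a band of broadband noise, so low-entropy bands flag candidate feature frequencies. The signal model, band edges and band width below are illustrative assumptions, not the paper's:

```python
import numpy as np

def band_entropy(power):
    """Shannon entropy of a band's normalized power distribution (bits)."""
    p = power / power.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
fs = 1000.0                         # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
# simulated motor current: 50 Hz supply line + weak 170 Hz fault tone + noise
x = (np.sin(2 * np.pi * 50 * t)
     + 0.2 * np.sin(2 * np.pi * 170 * t)
     + 0.05 * rng.standard_normal(t.size))

freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(x)) ** 2

def entropy_in(lo, hi):
    """Spectral entropy of the band [lo, hi) Hz."""
    m = (freqs >= lo) & (freqs < hi)
    return band_entropy(power[m])

# scan 20 Hz bands: tone-bearing bands score far lower than noise-only bands
candidates = [(lo, entropy_in(lo, lo + 20)) for lo in range(0, 480, 20)]
```

Ranking `candidates` by entropy surfaces the 40-60 Hz and 160-180 Hz bands first; the actual technique would of course operate on measured gearbox currents rather than this toy signal.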
Leon, Juan S; Winskell, Kate; McFarland, Deborah A; del Rio, Carlos
2015-03-01
Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013-2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health-Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies: small teams of students propose solutions to case problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned.
Water insoluble and soluble lipids for gene delivery.
Mahato, Ram I
2005-04-05
Among various synthetic gene carriers currently in use, liposomes composed of cationic lipids and co-lipids remain the most efficient transfection reagents. Physicochemical properties of lipid/plasmid complexes, such as cationic lipid structure, cationic lipid to co-lipid ratio, charge ratio, particle size and zeta potential have significant influence on gene expression and biodistribution. However, most cationic lipids are toxic and cationic liposomes/plasmid complexes do not disperse well inside the target tissues because of their large particle size. To overcome the problems associated with cationic lipids, we designed water soluble lipopolymers for gene delivery to various cells and tissues. This review provides a critical discussion on how the components of water insoluble and soluble lipids affect their transfection efficiency and biodistribution of lipid/plasmid complexes.
Mühlbacher, Axel C; Kaczynski, Anika
2016-02-01
Healthcare decision making is usually characterized by a low degree of transparency. The demand for transparent decision processes can be fulfilled only when assessment, appraisal and decisions about health technologies are performed under a systematic construct of benefit assessment. The benefit of an intervention is often multidimensional and, thus, must be represented by several decision criteria. Complex decision problems require an assessment and appraisal of various criteria; therefore, a decision process that systematically identifies the best available alternative and enables an optimal and transparent decision is needed. For that reason, decision criteria must be weighted and goal achievement must be scored for all alternatives. Methods of multi-criteria decision analysis (MCDA) are available to analyse and appraise multiple clinical endpoints and structure complex decision problems in healthcare decision making. By means of MCDA, value judgments, priorities and preferences of patients, insurees and experts can be integrated systematically and transparently into the decision-making process. This article describes the MCDA framework and identifies potential areas where MCDA can be of use (e.g. approval, guidelines and reimbursement/pricing of health technologies). A literature search was performed to identify current research in healthcare. The results showed that healthcare decision making is addressing the problem of multiple decision criteria and is focusing on the future development and use of techniques to weight and score different decision criteria. This article emphasizes the use and future benefit of MCDA.
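As a minimal illustration of the weighting-and-scoring step that MCDA formalizes, a weighted-sum appraisal over a few criteria might look like the sketch below. All criteria, weights, alternatives and scores are invented for illustration, not taken from the article:

```python
# Deliberately minimal weighted-sum MCDA sketch with hypothetical inputs.
weights = {"efficacy": 0.5, "safety": 0.3, "cost": 0.2}   # criterion weights, sum to 1
alternatives = {
    "therapy_A": {"efficacy": 8, "safety": 6, "cost": 4},  # goal-achievement scores
    "therapy_B": {"efficacy": 6, "safety": 9, "cost": 7},
}

def score(name):
    """Weighted goal achievement of one alternative across all criteria."""
    return sum(weights[c] * alternatives[name][c] for c in weights)

best = max(alternatives, key=score)   # transparent, reproducible choice
```

Real MCDA frameworks add elicitation of the weights (e.g. from patients or experts) and sensitivity analysis, but the aggregation step reduces to this kind of explicit, auditable computation, which is the transparency argument the article makes.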
ERIC Educational Resources Information Center
Mayrath, Michael C., Ed.; Clarke-Midura, Jody, Ed.; Robinson, Daniel H., Ed.; Schraw, Gregory, Ed.
2012-01-01
Creative problem solving, collaboration, and technology fluency are core skills requisite of any nation's workforce that strives to be competitive in the 21st Century. Teaching these types of skills is an economic imperative, and assessment is a fundamental component of any pedagogical program. Yet, measurement of these skills is complex due to…
How Do We Take Care of Our Own? Principal Support and Development in Rocky Top Public Schools
ERIC Educational Resources Information Center
Griffin, Jennifer Shaw
2017-01-01
Principals are isolated in their work and suffer from low morale. The role of the principal has become increasingly complex and demanding especially within the current accountability model with the public nature of school report cards. This is a problem in Rocky Top Public Schools and in school districts across the country. The purpose of this…
ERIC Educational Resources Information Center
Shaw, Angela
2014-01-01
This paper examines current part-time mature learners' views on the potential impact upon future students as full fees are introduced from 2012. It investigates the problems which part-time mature learners may face with the advent of student loans and subsequent debt, given that they are usually combining complex lives with their studies, with…
Multiprofessional education to stimulate collaboration: a circular argument and its consequences
Roodbol, Petrie F.
2010-01-01
The current developments in healthcare are unprecedented. The organization of health care is complex. Collaboration is essential to meet all the healthcare needs of patients and to achieve coordinated and unambiguous information. Multiprofessional education (MPE) or multidisciplinary training (MDT) seems a logical step to stimulate teamwork. However, collaboration and MPE are wrestling with the same problems: social identity and acceptance. PMID:21818197
Visions of Automation and Realities of Certification
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Holloway, Michael C.
2005-01-01
Many people envision automation as the solution to many of the problems in aviation and air transportation today, across all sectors: commercial, private, and military. This paper explains why some recent experiences with complex, highly-integrated, automated systems suggest that this vision will not be realized unless significant progress is made over the current state-of-the-practice in software system development and certification.
[Algorithms for treatment of complex hand injuries].
Pillukat, T; Prommersberger, K-J
2011-07-01
The primary treatment strongly influences the course and prognosis of hand injuries. Complex injuries which compromise functional recovery are especially challenging. Despite an apparently unlimited number of injury patterns, it is possible to develop strategies which facilitate a standardized approach to operative treatment. In this situation algorithms can be important guidelines for a rational approach. The following algorithms have been proven in the treatment of complex injuries of the hand by our own experience. They were modified according to the current literature and refer to prehospital care, emergency room management, basic strategy in general and, in detail, reconstruction of bone and joints, vessels, nerves, tendons and soft tissue coverage. Algorithms facilitate the treatment of severe hand injuries: by applying simple yes/no decisions, complex injury patterns are split into distinct subproblems which can be managed step by step.
McGovern, Eimear; Kelleher, Eoin; Snow, Aisling; Walsh, Kevin; Gadallah, Bassem; Kutty, Shelby; Redmond, John M; McMahon, Colin J
2017-09-01
In recent years, three-dimensional printing has demonstrated reliable reproducibility of several organs including hearts with complex congenital cardiac anomalies. This represents the next step in advanced image processing and can be used to plan surgical repair. In this study, we describe three children with complex univentricular hearts and abnormal systemic or pulmonary venous drainage, in whom three-dimensional printed models based on CT data assisted with preoperative planning. For two children, after group discussion and examination of the models, a decision was made not to proceed with surgery. We extend the current clinical experience with three-dimensional printed modelling and discuss the benefits of such models in the setting of managing complex surgical problems in children with univentricular circulation and abnormal systemic or pulmonary venous drainage.
Dölitzsch, Claudia; Kölch, Michael; Fegert, Jörg M; Schmeck, Klaus; Schmid, Marc
2016-11-15
The current analyses examined whether the dysregulation profile (DP) 1) could be used to identify children and adolescents at high risk for complex and serious psychopathology and 2) was correlated to other emotional and behavioral problems (such as delinquent behavior or suicide ideation). DP was assessed using both the Child Behavior Checklist (CBCL) and the Youth Self Report (YSR) in a residential care sample. Children and adolescents (N=374) aged 10-18 years living in residential care in Switzerland completed the YSR, and their professional caregivers completed the CBCL. Participants meeting criteria for DP (T-score ≥67 on the anxious/depressed, attention problems, and aggressive behavior scales of the YSR/CBCL) were compared against those who did not for the presence of complex psychopathology (defined as the presence of both emotional and behavioral disorders), and also for the prevalence of several psychiatric diagnoses, suicidal ideation, traumatic experiences, delinquent behaviors, and problems related to quality of life. The diagnostic criteria for CBCL-DP and YSR-DP were met by just 44 (11.8%) and 25 (6.7%) of participants. Only eight participants (2.1%) met the criteria on both instruments. Further analyses were conducted separately for the CBCL-DP and YSR-DP groups. DP was associated with complex psychopathology in only 34.4% of cases according to CBCL and in 60% of cases according to YSR. YSR-DP was somewhat more likely to be associated with psychiatric disorders and associated problems than was the CBCL-DP. Because of the relatively small overlap between the CBCL-DP and YSR-DP, analyses were conducted largely with different samples, likely contributing to the different results. Despite a high rate of psychopathology in the population studied, both the YSR-DP and the CBCL-DP were able to detect only a small proportion of those with complex psychiatric disorders. 
This result questions the validity of YSR-DP and the CBCL-DP in detecting subjects with complex and serious psychopathology. It is possible that different screening instruments may be more effective.
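The DP criterion used in the study, a T-score of at least 67 on each of three syndrome scales, reduces to a simple conjunctive rule. A sketch (scale names abbreviated here; the T-scores below are hypothetical):

```python
# The study's DP rule is a conjunction over three CBCL/YSR syndrome scales.
DP_SCALES = ("anxious_depressed", "attention_problems", "aggressive_behavior")

def meets_dp(t_scores, threshold=67):
    """True if every DP scale reaches the T-score threshold."""
    return all(t_scores[s] >= threshold for s in DP_SCALES)

# hypothetical participant: all three scales at or above 67 -> flagged
flagged = meets_dp({"anxious_depressed": 70,
                    "attention_problems": 68,
                    "aggressive_behavior": 67})
```

Because the rule requires elevation on all three scales simultaneously, informant disagreement on any single scale is enough to make the CBCL-based and YSR-based flags diverge, which is consistent with the small overlap the authors report.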
Adaptive Wavelet Modeling of Geophysical Data
NASA Astrophysics Data System (ADS)
Plattner, A.; Maurer, H.; Dahmen, W.; Vorloeper, J.
2009-12-01
Despite the ever-increasing power of modern computers, realistic modeling of complex three-dimensional Earth models is still a challenging task and requires substantial computing resources. The overwhelming majority of current geophysical modeling approaches includes either finite difference or non-adaptive finite element algorithms, and variants thereof. These numerical methods usually require the subsurface to be discretized with a fine mesh to accurately capture the behavior of the physical fields. However, this may result in excessive memory consumption and computing times. A common feature of most of these algorithms is that the modeled data discretizations are independent of the model complexity, which may be wasteful when there are only minor to moderate spatial variations in the subsurface parameters. Recent developments in the theory of adaptive numerical solvers have the potential to overcome this problem. Here, we consider an adaptive wavelet based approach that is applicable to a large scope of problems, also including nonlinear problems. To the best of our knowledge such algorithms have not yet been applied in geophysics. Adaptive wavelet algorithms offer several attractive features: (i) for a given subsurface model, they allow the forward modeling domain to be discretized with a quasi minimal number of degrees of freedom, (ii) sparsity of the associated system matrices is guaranteed, which makes the algorithm memory efficient, and (iii) the modeling accuracy scales linearly with computing time. We have implemented the adaptive wavelet algorithm for solving three-dimensional geoelectric problems. To test its performance, numerical experiments were conducted with a series of conductivity models exhibiting varying degrees of structural complexity. Results were compared with a non-adaptive finite element algorithm, which incorporates an unstructured mesh to best fit subsurface boundaries. 
Such algorithms represent the current state-of-the-art in geoelectrical modeling. An analysis of the numerical accuracy as a function of the number of degrees of freedom revealed that the adaptive wavelet algorithm outperforms the finite element solver for simple and moderately complex models, whereas the results become comparable for models with spatially highly variable electrical conductivities. The linear dependency of the modeling error and the computing time proved to be model-independent. This feature will allow very efficient computations using large-scale models as soon as our experimental code is optimized in terms of its implementation.
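The adaptive idea, spending degrees of freedom only where the model actually varies, can be illustrated in one dimension with a Haar wavelet and coefficient thresholding. This is a toy stand-in for the paper's three-dimensional geoelectric solver; the test signal and tolerance are arbitrary:

```python
import numpy as np

def haar_decompose(x):
    """Full orthonormal Haar decomposition of a length-2^k signal."""
    coeffs, a = [], np.asarray(x, dtype=float)
    while a.size > 1:
        s = (a[0::2] + a[1::2]) / np.sqrt(2)   # smooth (scaling) part
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail (wavelet) part
        coeffs.append(d)
        a = s
    coeffs.append(a)                            # coarsest scaling coefficient
    return coeffs

def haar_reconstruct(coeffs):
    """Exact inverse of haar_decompose."""
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * a.size)
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def threshold(coeffs, tol):
    """Adaptivity in a nutshell: keep only detail coefficients above tol."""
    return [np.where(np.abs(c) >= tol, c, 0.0) for c in coeffs[:-1]] + [coeffs[-1]]

# a 'model' that is smooth except for one sharp interface at index 100
x = np.where(np.arange(256) < 100, 1.0, 3.0) + 1e-3 * np.sin(np.arange(256))
c = haar_decompose(x)
ct = threshold(c, tol=0.05)
kept = sum(int(np.count_nonzero(ci)) for ci in ct)      # active degrees of freedom
err = float(np.max(np.abs(haar_reconstruct(ct) - x)))   # truncation error
```

Only a handful of coefficients near the interface survive the thresholding, yet the reconstruction error stays small; this is the quasi-minimal-degrees-of-freedom property that makes the adaptive approach memory-efficient in three dimensions.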
Lectures on Selected Topics in Mathematical Physics: Elliptic Functions and Elliptic Integrals
NASA Astrophysics Data System (ADS)
Schwalm, William A.
2015-12-01
This volume is a basic introduction to certain aspects of elliptic functions and elliptic integrals. Primarily, the elliptic functions stand out as closed solutions to a class of physical and geometrical problems giving rise to nonlinear differential equations. While these nonlinear equations may not be the types of greatest interest currently, the fact that they are solvable exactly in terms of functions about which much is known makes up for this. The elliptic functions of Jacobi, or equivalently the Weierstrass elliptic functions, inhabit the literature on current problems in condensed matter and statistical physics, on solitons and conformal representations, and all sorts of famous problems in classical mechanics. The lectures on elliptic functions have evolved as part of the first semester of a course on theoretical and mathematical methods given to first- and second-year graduate students in physics and chemistry at the University of North Dakota. They are for graduate students or for researchers who want an elementary introduction to the subject that nevertheless leaves them with enough of the details to address real problems. The style is supposed to be informal. The intention is to introduce the subject as a moderate extension of ordinary trigonometry in which the reference circle is replaced by an ellipse. This entrée depends upon fewer tools and has seemed less intimidating than other typical introductions to the subject that depend on some knowledge of complex variables. The first three lectures assume only calculus, including the chain rule and elementary knowledge of differential equations. In the later lectures, the complex analytic properties are introduced naturally so that a more complete study becomes possible.
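A standard example of the closed solutions mentioned above (textbook material, not specific to these lectures): the Jacobi function sn generalizes the sine, and the finite-amplitude pendulum is solved by it exactly.

```latex
% sn(u,k) satisfies a nonlinear first-order ODE; k = 0 reduces it to sin u:
\left( \frac{d\,\mathrm{sn}}{du} \right)^{2}
  = \bigl(1 - \mathrm{sn}^{2}\bigr)\bigl(1 - k^{2}\,\mathrm{sn}^{2}\bigr)

% Pendulum of small-amplitude frequency \omega_0 and amplitude \theta_0,
% with modulus k = \sin(\theta_0/2):
\sin\frac{\theta(t)}{2} = k \,\mathrm{sn}(\omega_0 t,\, k)
```

As k goes to 0 the pendulum solution collapses to the familiar harmonic one, which is exactly the "trigonometry on an ellipse" viewpoint the lectures advertise.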
Fitting methods to paradigms: are ergonomics methods fit for systems thinking?
Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A
2017-02-01
The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.
NASA Astrophysics Data System (ADS)
Swiderski, Waldemar
2016-10-01
Eddy current thermography is a new NDT technique for the detection of cracks in electrically conductive materials. It combines the well-established inspection techniques of eddy current testing and thermography. The technique uses induced eddy currents to heat the sample being tested, and defect detection is based on changes in the induced eddy-current flow revealed by thermal visualization captured with an infrared camera. The advantage of this method is that it exploits the high performance of eddy current testing while eliminating the well-known edge-effect problem. Especially for components of complex geometry, this is an important factor that may justify the increased expense of the inspection set-up. The paper presents the possibility of applying the eddy current thermography method to detecting defects in ballistic covers made of carbon-fiber-reinforced composites used in the construction of military vehicles.
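The induced heating is concentrated within the electromagnetic skin depth, which sets the depth sensitivity of the technique. This is a standard result, stated here for orientation rather than taken from the paper:

```latex
% Skin depth at angular excitation frequency \omega in a material of
% conductivity \sigma and permeability \mu; induced heating is confined
% to a layer of thickness ~ \delta:
\delta = \sqrt{\frac{2}{\omega \mu \sigma}}
```

Lower excitation frequencies therefore probe deeper, a trade-off any eddy current thermography set-up has to make against lateral resolution and heating efficiency.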
Acoustic streaming: an arbitrary Lagrangian-Eulerian perspective.
Nama, Nitesh; Huang, Tony Jun; Costanzo, Francesco
2017-08-25
We analyse acoustic streaming flows using an arbitrary Lagrangian Eulerian (ALE) perspective. The formulation stems from an explicit separation of time scales resulting in two subproblems: a first-order problem, formulated in terms of the fluid displacement at the fast scale, and a second-order problem, formulated in terms of the Lagrangian flow velocity at the slow time scale. Following a rigorous time-averaging procedure, the second-order problem is shown to be intrinsically steady, and with exact boundary conditions at the oscillating walls. Also, as the second-order problem is solved directly for the Lagrangian velocity, the formulation does not need to employ the notion of Stokes drift, or any associated post-processing, thus facilitating a direct comparison with experiments. Because the first-order problem is formulated in terms of the displacement field, our formulation is directly applicable to more complex fluid-structure interaction problems in microacoustofluidic devices. After the formulation's exposition, we present numerical results that illustrate the advantages of the formulation with respect to current approaches.
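In outline, the separation of time scales follows the usual perturbation pattern (notation mine, not the paper's): fields are expanded in the small oscillation amplitude ε, the O(ε) problem is harmonic at the driving frequency, and time-averaging the O(ε²) momentum balance over one period leaves a steady streaming problem forced by averaged products of first-order fields.

```latex
\mathbf{v} = \varepsilon \mathbf{v}_1 + \varepsilon^{2} \mathbf{v}_2 + \cdots,
\qquad
p = p_0 + \varepsilon p_1 + \varepsilon^{2} p_2 + \cdots

% Time-averaged O(\varepsilon^2) balance: a steady Stokes problem with a
% Reynolds-stress-like body force built from the first-order fields
\mu \nabla^{2} \langle \mathbf{v}_2 \rangle - \nabla \langle p_2 \rangle
  = \bigl\langle \rho_0 (\mathbf{v}_1 \cdot \nabla)\mathbf{v}_1
  + \rho_1 \,\partial_t \mathbf{v}_1 \bigr\rangle
```

The paper's contribution sits on top of this structure: posing the first-order problem in terms of displacement and the second-order problem directly for the Lagrangian velocity, so that Stokes-drift post-processing is unnecessary.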
NASA Astrophysics Data System (ADS)
Verkhoglyadova, Olga P.; Zank, Gary P.; Li, Gang
2015-02-01
Understanding the physics of Solar Energetic Particle (SEP) events is of importance to the general question of particle energization throughout the cosmos as well as playing a role in the technologically critical impact of space weather on society. The largest, and often most damaging, events are the so-called gradual SEP events, generally associated with shock waves driven by coronal mass ejections (CMEs). We review the current state of knowledge about particle acceleration at evolving interplanetary shocks with application to SEP events that occur in the inner heliosphere. Starting with a brief outline of recent theoretical progress in the field, we focus on current observational evidence that challenges conventional models of SEP events, including complex particle energy spectra, the blurring of the distinction between gradual and impulsive events, and the difference inherent in particle acceleration at quasi-parallel and quasi-perpendicular shocks. We also review the important problem of the seed particle population and its injection into particle acceleration at a shock. We begin by discussing the properties and characteristics of non-relativistic interplanetary shocks, from their formation close to the Sun to subsequent evolution through the inner heliosphere. The association of gradual SEP events with shocks is discussed. Several approaches to the energization of particles have been proposed, including shock drift acceleration, diffusive shock acceleration (DSA), acceleration by large-scale compression regions, acceleration by random velocity fluctuations (sometimes known as the "pump mechanism"), and others. We review these various mechanisms briefly and focus on the DSA mechanism. Much of our emphasis will be on our current understanding of the parallel and perpendicular diffusion coefficients for energetic particles and models of plasma turbulence in the vicinity of the shock. 
Because of its importance both to the DSA mechanism itself and to the particle composition of SEP events, we address in some detail the injection problem. Although steady-state models can improve our understanding of the diffusive shock acceleration mechanism, SEP events are inherently time-dependent. We therefore review the time-dependent theory of DSA in some detail, including estimating possible maximum particle energies and particle escape from the shock complex. We also discuss generalizations of the diffusive transport approach to modeling particle acceleration by considering a more general description based on the focused transport equation. The escape of accelerated particles from the shock requires that their subsequent transport in the interplanetary medium be modeled and the consequence of interplanetary transport can lead to the complex spectra and compositional profiles that are observed frequently. The different approaches to particle transport in the inner heliosphere are reviewed. The various numerical models that have been developed to solve the gradual SEP problem are reviewed. Explicit comparisons of modeling results with observations of large SEP events are discussed. A summary of current progress and the outlook on the SEP problem and remaining open questions conclude the review.
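For reference, steady-state DSA at a planar shock yields the classic power-law distribution that the time-dependent theory generalizes (a standard result the review builds on):

```latex
% Diffusive shock acceleration: for a shock with compression ratio
% r = u_{\mathrm{upstream}} / u_{\mathrm{downstream}}, the accelerated
% particle distribution downstream is a power law in momentum,
f(p) \propto p^{-q}, \qquad q = \frac{3r}{r - 1}
% e.g. a strong shock (r = 4) gives the canonical q = 4 spectrum.
```

Deviations from this prediction, such as the broken and rolled-over spectra observed in large gradual SEP events, are part of the observational evidence the review discusses as challenges for the steady-state picture.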
Cserpán, Dorottya; Meszéna, Domokos; Wittner, Lucia; Tóth, Kinga; Ulbert, István; Somogyvári, Zoltán
2017-01-01
Revealing the current source distribution along the neuronal membrane is a key step on the way to understanding neural computations; however, the experimental and theoretical tools to achieve sufficient spatiotemporal resolution for the estimation remain to be established. Here, we address this problem using extracellularly recorded potentials with arbitrarily distributed electrodes for a neuron of known morphology. We use simulations of models with varying complexity to validate the proposed method and to give recommendations for experimental applications. The method is applied to in vitro data from rat hippocampus. PMID:29148974
Transport mechanisms in Schottky diodes realized on GaN
NASA Astrophysics Data System (ADS)
Amor, Sarrah; Ahaitouf, Ali; Ahaitouf, Abdelaziz; Salvestrini, Jean Paul; Ougazzaden, Abdellah
2017-03-01
This work focuses on the transport mechanisms involved in devices based on gallium nitride (GaN) and its alloys. By considering all current conduction mechanisms, it is possible to understand these transport phenomena. With this methodology, the current-voltage characteristics of structures with unusual behaviour can be better understood and explained. Determining the Schottky barrier height (SBH) is a complex problem, since it depends on several parameters such as the quality of the metal-semiconductor interface. This study is particularly interesting because solar cells are made of this material, and their qualification is closely linked to their transport properties.
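The baseline among these conduction mechanisms is thermionic emission over the barrier; deviations of the fitted ideality factor from unity flag the additional mechanisms (tunneling, generation-recombination, leakage). The relations below are the standard diode equations, not this paper's data:

```latex
% Thermionic emission over a Schottky barrier \Phi_B:
% A = diode area, A^{**} = effective Richardson constant,
% n = ideality factor (n > 1 indicates additional transport mechanisms)
I = A\, A^{**}\, T^{2} \exp\!\left(-\frac{q\Phi_B}{kT}\right)
    \left[\exp\!\left(\frac{qV}{nkT}\right) - 1\right]
```

Fitting measured I-V curves to this form at several temperatures is the usual way the barrier height and ideality factor, and hence the dominant mechanism, are extracted.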
NASA Technical Reports Server (NTRS)
Fear, J. S.
1983-01-01
An assessment is made of the results of Phase 1 screening testing of current and advanced combustion system concepts using several broadened-properties fuels. The severity of each of several fuels-properties effects on combustor performance or liner life is discussed, as well as design techniques with the potential to offset these adverse effects. The selection of concepts to be pursued in Phase 2 refinement testing is described. This selection takes into account the relative costs and complexities of the concepts, the current outlook on pollutant emissions control, and practical operational problems.
Data based identification and prediction of nonlinear and complex dynamical systems
NASA Astrophysics Data System (ADS)
Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso
2016-07-01
The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The "inverse" problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. 
The combination of graph, information, and nonlinear dynamical systems theories with tools from statistical physics, optimization, engineering control, applied mathematics, and scientific computing enables the development of a number of paradigms to address the problem of nonlinear and complex systems reconstruction. In this Review, we describe the recent advances in this forefront and rapidly evolving field, with a focus on compressive sensing based methods. In particular, compressive sensing is a paradigm developed in recent years in applied mathematics, electrical engineering, and nonlinear physics to reconstruct sparse signals using only limited data. It has broad applications ranging from image compression/reconstruction to the analysis of large-scale sensor networks, and it has become a powerful technique to obtain high-fidelity signals for applications where sufficient observations are not available. We will describe in detail how compressive sensing can be exploited to address a diverse array of problems in data based reconstruction of nonlinear and complex networked systems. The problems include identification of chaotic systems and prediction of catastrophic bifurcations, forecasting future attractors of time-varying nonlinear systems, reconstruction of complex networks with oscillatory and evolutionary game dynamics, detection of hidden nodes, identification of chaotic elements in neuronal networks, reconstruction of complex geospatial networks and nodal positioning, and reconstruction of complex spreading networks with binary data. A number of alternative methods, such as those based on system response to external driving, synchronization, and noise-induced dynamical correlation, will also be discussed. Due to the high relevance of network reconstruction to biological sciences, a special section is devoted to a brief survey of the current methods to infer biological networks.
Finally, a number of open problems including control and controllability of complex nonlinear dynamical networks are discussed. The methods outlined in this Review are principled on various concepts in complexity science and engineering such as phase transitions, bifurcations, stabilities, and robustness. The methodologies have the potential to significantly improve our ability to understand a variety of complex dynamical systems ranging from gene regulatory systems to social networks toward the ultimate goal of controlling such systems.
Dynamics of inductors for heating of the metal under deformation
NASA Astrophysics Data System (ADS)
Zimin, L. S.; Yeghiazaryan, A. S.; Protsenko, A. N.
2018-01-01
Current issues in creating powerful induction-heating systems for hot sheet rolling in mechanical engineering and metallurgy are discussed. Electrodynamic and vibroacoustic problems occurring during the induction heating of objects with complex shapes, particularly the heating of slabs prior to rolling, are analysed. A numerical mathematical model using the method of related contours and the principle of virtual displacements is recommended for the electrodynamic calculations. For the numerical solution of the vibrational problem, it is reasonable to use the finite element method (FEM). In general, the Biot-Savart-Laplace law was used to calculate the force distribution, providing the current density in the skin layer of the slab. An optimal inductor design based on maximum stiffness was synthesized by studying the vibrodynamic model of the "inductor-metal" system, which kept the sound level within all established sanitary standards.
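A quantity central to such induction-heating calculations is the skin depth, which sets how deeply the induced currents penetrate the slab. A small sketch of the standard formula (the material values are illustrative assumptions, not data from the paper):

```python
import numpy as np

def skin_depth(resistivity, frequency, mu_r=1.0):
    """Skin depth delta = sqrt(2*rho / (omega * mu)) for a conductor."""
    mu0 = 4e-7 * np.pi
    omega = 2 * np.pi * frequency
    return np.sqrt(2 * resistivity / (omega * mu_r * mu0))

# Copper at 50 Hz mains frequency: delta is on the order of 9 mm
delta_cu = skin_depth(resistivity=1.68e-8, frequency=50.0)
print(f"skin depth: {delta_cu * 1e3:.1f} mm")
```

Raising the inductor frequency shrinks the skin depth, which concentrates heating near the slab surface; this trade-off is what makes frequency selection part of the design problem.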
Douglas, P; Hayes, E T; Williams, W B; Tyrrel, S F; Kinnersley, R P; Walsh, K; O'Driscoll, M; Longhurst, P J; Pollard, S J T; Drew, G H
2017-12-01
With the increase in composting as a sustainable waste management option, biological air pollution (bioaerosols) from composting facilities has become a cause of increasing concern due to its potential health impacts. Estimating community exposure to bioaerosols is problematic due to limitations in current monitoring methods. Atmospheric dispersion modelling can be used to estimate exposure concentrations; however, several issues arise from the lack of appropriate bioaerosol data to use as inputs into models, and from the complexity of the emission sources at composting facilities. This paper analyses current progress in using dispersion models for bioaerosols, examines the remaining problems and provides recommendations for future prospects in this area. A key finding is the urgent need for guidance for model users to ensure consistent bioaerosol modelling practices. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Research and simulation of the decoupling transformation in AC motor vector control
NASA Astrophysics Data System (ADS)
He, Jiaojiao; Zhao, Zhongjie; Liu, Ken; Zhang, Yongping; Yao, Tuozhong
2018-04-01
The permanent magnet synchronous motor (PMSM) is a nonlinear, strongly coupled, multivariable complex object, and the decoupling transformation can solve its coupling problem. This paper gives a mathematical model of the PMSM and introduces the coordinate transformations used in PMSM vector control. By diagonalizing the inductance matrix with a modal matrix, the coupled quantities are made independent, so that the torque-producing and excitation (flux-producing) components of the motor current can be controlled separately. The corresponding coordinate transformation matrices are derived, providing an approach to the coupling problem of AC motors. Finally, in the Matlab/Simulink environment, a simulation model of PMSM vector control is built by combining the PMSM body with coordinate conversion modules; each part of the model is introduced and the simulation results are analysed.
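The decoupling described here can be illustrated with the standard Park (abc to dq) transformation. The sketch below uses the generic textbook amplitude-invariant form, not the paper's own derivation: a balanced three-phase current maps to a constant dq vector, so the torque- and flux-producing components can be regulated independently.

```python
import numpy as np

def abc_to_dq(ia, ib, ic, theta):
    """Amplitude-invariant Park transform: stationary abc frame -> rotating dq frame."""
    d = (2/3) * (ia * np.cos(theta) + ib * np.cos(theta - 2*np.pi/3)
                 + ic * np.cos(theta + 2*np.pi/3))
    q = -(2/3) * (ia * np.sin(theta) + ib * np.sin(theta - 2*np.pi/3)
                  + ic * np.sin(theta + 2*np.pi/3))
    return d, q

# Balanced 10 A three-phase currents at 50 Hz over two electrical periods
t = np.linspace(0.0, 0.04, 400)
theta = 2 * np.pi * 50 * t
ia = 10 * np.cos(theta)
ib = 10 * np.cos(theta - 2*np.pi/3)
ic = 10 * np.cos(theta + 2*np.pi/3)
d, q = abc_to_dq(ia, ib, ic, theta)
print(d.mean(), np.abs(q).max())  # d stays constant at 10 A, q stays at zero
```

In a vector controller the d and q currents are then regulated by separate PI loops, which is exactly the decoupled control of excitation and torque the abstract refers to.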
The problem of motivating teaching staff in a complex amalgamation.
Kenrick, M A
1993-09-01
This paper addresses some of the problems brought about by the merger of a number of schools of nursing into a new complex amalgamation. A very real concern in the new colleges of nursing and midwifery in the United Kingdom is the effect of amalgamation on management systems and staff morale. The main focus of this paper is the motivation of staff during this time of change. There is currently a lack of security amongst staff and in many instances the personal job satisfaction of nurse teachers and managers of nurse education has been reduced, which has made the task of motivating staff difficult. Hence, two major theories of motivation and the implications of these theories for managers of nurse education are discussed. The criteria used for the selection of managers within the new colleges, leadership styles and organizational structures are reviewed. The amalgamations have brought about affiliation with higher-education institutions. Some problems associated with these mergers and the effects on the motivation of staff both within the higher-education institutions and the nursing colleges are outlined. Strategies for overcoming some of the problems are proposed including job enlargement, job enrichment, potential achievement rewards and the use of individual performance reviews which may be useful for assessing the ability of all staff, including managers, in the new amalgamations.
NASA Technical Reports Server (NTRS)
Cahan, Boris D.
1991-01-01
The Iterative Boundary Integral Equation Method (I-BIEM) has been applied to the problem of frequency dispersion at a disk electrode in a finite geometry. The I-BIEM permits the direct evaluation of the AC potential (a complex variable) using complex boundary conditions. The point spacing was made highly nonuniform, to give extremely high resolution in those regions where the variables change most rapidly, i.e., in the vicinity of the edge of the disk. Results are analyzed with respect to IR correction, equipotential surfaces, and reference electrode placement. The current distribution is also examined for a ring-disk configuration, with the ring and the disk at the same AC potential. It is shown that the apparent impedance of the disk is inductive at higher frequencies. The results are compared to analytic calculations from the literature, and usually agree to better than 0.001 percent.
a Unified Matrix Polynomial Approach to Modal Identification
NASA Astrophysics Data System (ADS)
Allemang, R. J.; Brown, D. L.
1998-04-01
One important current focus of modal identification is a reformulation of modal parameter estimation algorithms into a single, consistent mathematical formulation with a corresponding set of definitions and unifying concepts. In particular, a matrix polynomial approach is used to unify the presentation with respect to current algorithms such as the least-squares complex exponential (LSCE), the polyreference time domain (PTD), Ibrahim time domain (ITD), eigensystem realization algorithm (ERA), rational fraction polynomial (RFP), polyreference frequency domain (PFD) and the complex mode indication function (CMIF) methods. Using this unified matrix polynomial approach (UMPA) allows a discussion of the similarities and differences of the commonly used methods. The use of least squares (LS), total least squares (TLS), double least squares (DLS) and singular value decomposition (SVD) methods is discussed in order to take advantage of redundant measurement data. Eigenvalue and SVD transformation methods are utilized to reduce the effective size of the resulting eigenvalue-eigenvector problem as well.
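The matrix-polynomial idea behind time-domain algorithms such as LSCE can be sketched in a few lines: fit an autoregressive polynomial to free-decay data, then take its roots as the discrete-time poles. This is a single-mode toy example, not the UMPA formulation itself:

```python
import numpy as np

# Synthesize a single-mode free decay: damping sigma, natural frequency omega
dt, sigma, omega = 0.01, 2.0, 2 * np.pi * 5.0
n = np.arange(200)
h = np.exp(-sigma * n * dt) * np.cos(omega * n * dt)

# LSCE-style fit: h[k+2] = a1*h[k+1] + a0*h[k]  (order-2 matrix polynomial)
A = np.column_stack([h[1:-1], h[:-2]])
b = h[2:]
a1, a0 = np.linalg.lstsq(A, b, rcond=None)[0]

# Roots of z^2 - a1*z - a0 are the discrete-time poles exp((-sigma ± i*omega)*dt)
z = np.roots([1.0, -a1, -a0])
s = np.log(z[np.imag(z) > 0]) / dt   # continuous-time pole: -sigma + i*omega
print(-s.real, s.imag)
```

With multiple references and higher polynomial orders, the scalar coefficients become matrix coefficients, which is the generalization that unifies the methods listed above.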
Algorithms and architectures for robot vision
NASA Technical Reports Server (NTRS)
Schenker, Paul S.
1990-01-01
The scope of the current work is to develop practical sensing implementations for robots operating in complex, partially unstructured environments. A focus in this work is to develop object models and estimation techniques which are specific to requirements of robot locomotion, approach and avoidance, and grasp and manipulation. Such problems have to date received limited attention in either computer or human vision - in essence, asking not only how perception is in general modeled, but also what the functional purpose of its underlying representations is. As in the past, researchers are drawing on ideas from both the psychological and machine vision literature. Of particular interest is the development of 3-D shape and motion estimates for complex objects when given only partial and uncertain information and when such information is incrementally accrued over time. Current studies consider the use of surface motion, contour, and texture information, with the longer range goal of developing a fused sensing strategy based on these sources and others.
NASA Technical Reports Server (NTRS)
Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.
1993-01-01
The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.
NASA Astrophysics Data System (ADS)
Mueller, J. A.; Runci, P. J.
2009-12-01
The recent passage of the American Climate and Energy Security Act by the U.S. House of Representatives in June of this year was a landmark in U.S. efforts to move climate change legislation through Congress. Although an historic achievement, the bill (and surrounding debate) highlights many concerns about the processes by which lawmakers and the public inform themselves about scientifically relevant problems and, subsequently, by which policy responses are crafted in a context of complexity, uncertainty, and competition for resources and attention. In light of the ever-increasing specialization of expertise in the sciences and other technical fields, and the inherent complexity of scientifically relevant problems such as climate change, society faces significant hurdles in its efforts to integrate knowledge and develop sufficient understanding of these problems to which it must respond with legislation or other effective collective or individual action. The emergence of a new class of experts who act as science-policy brokers may not be sufficient to cross these hurdles. Herein, we explore how society and the scientific community in particular can work toward closing the ever-growing gap between technical knowledge and society’s ability to comprehend and use it. Both authors are currently legislative fellows working on energy and climate change issues in the U.S. Senate.
NASA Technical Reports Server (NTRS)
Smith, Phillip J.; Billings, Charles; McCoy, C. Elaine; Orasanu, Judith
1999-01-01
The air traffic management system in the United States is an example of a distributed problem solving system. It has elements of both cooperative and competitive problem-solving. This system includes complex organizations such as Airline Operations Centers (AOCs), the FAA Air Traffic Control Systems Command Center (ATCSCC), and traffic management units (TMUs) at enroute centers and TRACONs, all of which have a major focus on strategic decision-making. It also includes individuals concerned more with tactical decisions (such as air traffic controllers and pilots). The architecture for this system has evolved over time to rely heavily on the distribution of tasks and control authority in order to keep cognitive complexity manageable for any one individual operator, and to provide redundancy (both human and technological) to serve as a safety net to catch the slips or mistakes that any one person or entity might make. Currently, major changes are being considered for this architecture, especially with respect to the locus of control, in an effort to improve efficiency and safety. This paper uses a series of case studies to help evaluate some of these changes from the perspective of system complexity, and to point out possible alternative approaches that might be taken to improve system performance. The paper illustrates the need to maintain a clear understanding of what is required to assure a high level of performance when alternative system architectures and decompositions are developed.
Current state and problems of integrated development of mineral resources base in Russia
NASA Astrophysics Data System (ADS)
Filimonova, I. V.; Eder, L. V.; Mishenin, M. V.; Mamakhatov, T. M.
2017-09-01
The article deals with the issues of integrated development of subsoil resources, taking into account the actual problems facing the Russian oil and gas complex. The key factors determining the need for integrated development of subsoil resources have been systematized and investigated. These factors are the changing quality of the hydrocarbon resource base, the growing depletion of the basic (unique and major) oil fields, the increase in the number of small and very small oil fields discovered and introduced into development, the increased capital intensity and riskiness of geological exploration, and the territorial location of new subsoil-use facilities.
The design of multiplayer online video game systems
NASA Astrophysics Data System (ADS)
Hsu, Chia-chun A.; Ling, Jim; Li, Qing; Kuo, C.-C. J.
2003-11-01
The distributed Multiplayer Online Game (MOG) system is complex since it involves technologies in computer graphics, multimedia, artificial intelligence, computer networking, embedded systems, etc. Due to the large scope of this problem, the design of MOG systems has not yet been widely addressed in the literature. In this paper, we review, analyze and evaluate the current MOG system architecture. Furthermore, we propose a clustered-server architecture to provide a scalable solution together with a region-oriented allocation strategy. Two key issues, i.e. interest management and synchronization, are discussed in depth. Some preliminary ideas to deal with the identified problems are described.
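Region-oriented interest management of the kind discussed can be sketched with a simple uniform grid; the cell size and the small API below are illustrative assumptions, not the paper's design:

```python
import math
from collections import defaultdict

# Partition the world into square cells and only exchange state between
# players in the same or adjacent cells, bounding per-player update traffic.
CELL = 50.0  # cell size in world units (an assumed tuning parameter)

def cell_of(x, y):
    return (int(math.floor(x / CELL)), int(math.floor(y / CELL)))

def interest_sets(players):
    """players: name -> (x, y). Returns name -> set of names to sync with."""
    grid = defaultdict(set)
    for name, (x, y) in players.items():
        grid[cell_of(x, y)].add(name)
    result = {}
    for name, (x, y) in players.items():
        cx, cy = cell_of(x, y)
        near = set()
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                near |= grid[(cx + dx, cy + dy)]
        near.discard(name)
        result[name] = near
    return result

players = {"alice": (10, 10), "bob": (60, 20), "carol": (400, 400)}
print(interest_sets(players))  # alice and bob are neighbours; carol is alone
```

A clustered-server design can assign each grid region to a server, so the same partition drives both interest management and load allocation.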
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
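A probabilistic treatment of technology-readiness delays of the kind described can be sketched with a small Monte Carlo model. The technology names and triangular distributions below are invented for illustration, not the paper's data:

```python
import numpy as np

# Each enabling technology has an uncertain maturation delay (months);
# DDT&E cannot start until the slowest technology is ready.
rng = np.random.default_rng(42)
techs = {
    "propulsion":   (0, 3, 12),   # (min, mode, max) months of delay
    "avionics":     (0, 1, 6),
    "life_support": (0, 4, 18),
}
n_trials = 100_000
delays = np.column_stack([
    rng.triangular(lo, mode, hi, n_trials) for lo, mode, hi in techs.values()
])
start_slip = delays.max(axis=1)   # critical path = worst technology
print(f"mean slip: {start_slip.mean():.1f} months, "
      f"P(slip > 6 months) = {np.mean(start_slip > 6):.2f}")
```

Decision-makers can then compare architectures by how the slip distribution, not just its mean, changes when a risky technology is swapped for a mature one.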
A design thinking framework for healthcare management and innovation.
Roberts, Jess P; Fisher, Thomas R; Trowbridge, Matthew J; Bent, Christine
2016-03-01
The business community has learned the value of design thinking as a way to innovate in addressing people's needs--and health systems could benefit enormously from doing the same. This paper lays out how design thinking applies to healthcare challenges and how systems might utilize this proven and accessible problem-solving process. We show how design thinking can foster new approaches to complex and persistent healthcare problems through human-centered research, collective and diverse teamwork and rapid prototyping. We introduce the core elements of design thinking for a healthcare audience and show how it can supplement current healthcare management, innovation and practice. Copyright © 2015 Elsevier Inc. All rights reserved.
Perspective: Machine learning potentials for atomistic simulations
NASA Astrophysics Data System (ADS)
Behler, Jörg
2016-11-01
Nowadays, computer simulations have become a standard tool in essentially all fields of chemistry, condensed matter physics, and materials science. In order to keep up with state-of-the-art experiments and the ever growing complexity of the investigated problems, there is a constantly increasing need for simulations of more realistic, i.e., larger, model systems with improved accuracy. In many cases, the availability of sufficiently efficient interatomic potentials providing reliable energies and forces has become a serious bottleneck for performing these simulations. To address this problem, currently a paradigm change is taking place in the development of interatomic potentials. Since the early days of computer simulations simplified potentials have been derived using physical approximations whenever the direct application of electronic structure methods has been too demanding. Recent advances in machine learning (ML) now offer an alternative approach for the representation of potential-energy surfaces by fitting large data sets from electronic structure calculations. In this perspective, the central ideas underlying these ML potentials, solved problems and remaining challenges are reviewed along with a discussion of their current applicability and limitations.
NASA Astrophysics Data System (ADS)
Beckstein, Pascal; Galindo, Vladimir; Vukčević, Vuko
2017-09-01
Eddy-current problems occur in a wide range of industrial and metallurgical applications where conducting material is processed inductively. Motivated by realising coupled multi-physics simulations, we present a new method for the solution of such problems in the finite volume framework of foam-extend, an extended version of the very popular OpenFOAM software. The numerical procedure involves a semi-coupled multi-mesh approach to solve Maxwell's equations for non-magnetic materials by means of the Coulomb gauged magnetic vector potential A and the electric scalar potential ϕ. The concept is further extended on the basis of the impressed and reduced magnetic vector potential and its usage in accordance with Biot-Savart's law to achieve a very efficient overall modelling even for complex three-dimensional geometries. Moreover, we present a special discretisation scheme to account for possible discontinuities in the electrical conductivity. To complement our numerical method, an extensive validation is completing the paper, which provides insight into the behaviour and the potential of our approach.
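The reduced-potential approach leans on Biot-Savart's law. As a minimal numerical illustration (not the foam-extend implementation), the field of a circular current loop can be integrated directly and checked against the closed-form on-axis result B_z = mu0*I*R^2 / (2*(R^2 + z^2)^(3/2)):

```python
import numpy as np

mu0 = 4e-7 * np.pi
I, R, z = 100.0, 0.05, 0.03   # current [A], loop radius [m], axial offset [m]

# Discretise the loop into segments and sum dB = mu0*I/(4*pi) * dl x r / |r|^3
phi = np.linspace(0.0, 2 * np.pi, 2000, endpoint=False)
dphi = phi[1] - phi[0]
src = np.column_stack([R * np.cos(phi), R * np.sin(phi), np.zeros_like(phi)])
dl = np.column_stack([-R * np.sin(phi), R * np.cos(phi), np.zeros_like(phi)]) * dphi
r = np.array([0.0, 0.0, z]) - src   # vectors from source elements to field point
B = mu0 * I / (4 * np.pi) * np.sum(
    np.cross(dl, r) / np.linalg.norm(r, axis=1)[:, None]**3, axis=0)

B_exact = mu0 * I * R**2 / (2 * (R**2 + z**2)**1.5)
print(B[2], B_exact)
```

In the coupled solver, this kind of quadrature supplies the impressed field of the inductor coils, so only the reduced potential needs to be resolved on the finite-volume mesh.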
Rodrigues, Charlene M C; Pinto, Marta V; Sadarangani, Manish; Plotkin, Stanley A
2017-06-01
Currently used vaccines have had major effects on eliminating common infections, largely by duplicating the immune responses induced by natural infections. Now vaccinology faces more complex problems, such as waning antibody, immunosenescence, evasion of immunity by the pathogen, deviation of immunity by the microbiome, induction of inhibitory responses, and complexity of the antigens required for protection. Fortunately, vaccine development is now incorporating knowledge from immunology, structural biology, systems biology and synthetic chemistry to meet these challenges. In addition, international organisations are developing new funding and licensing pathways for vaccines aimed at pathogens with epidemic potential that emerge from tropical areas. © 2017 The British Infection Association. Published by Elsevier Ltd. All rights reserved.
Applications of Metal Additive Manufacturing in Veterinary Orthopedic Surgery
NASA Astrophysics Data System (ADS)
Harrysson, Ola L. A.; Marcellin-Little, Denis J.; Horn, Timothy J.
2015-03-01
Veterinary medicine has undergone a rapid increase in specialization over the last three decades. Veterinarians now routinely perform joint replacement, neurosurgery, limb-sparing surgery, interventional radiology, radiation therapy, and other complex medical procedures. Many procedures involve advanced imaging and surgical planning. Evidence-based medicine has also become part of the modus operandi of veterinary clinicians. Modeling and additive manufacturing can provide individualized or customized therapeutic solutions to support the management of companion animals with complex medical problems. The use of metal additive manufacturing is increasing in veterinary orthopedic surgery. This review describes and discusses current and potential applications of metal additive manufacturing in veterinary orthopedic surgery.
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Design and manufacturing methods for the integral field unit of the nirspec instrument on JWST
NASA Astrophysics Data System (ADS)
Lobb, Dan; Robertson, David
2017-11-01
An integral field unit, to be used with the near-IR spectrometer instrument of the James Webb Space Telescope (JWST), is currently under development by SSTL and CfAI. Special problems in design and manufacture of the optical system are outlined, and manufacturing methods for critical optical elements are discussed. The optical system is complex, requiring a total of 95 mirrors to produce 30 output channels. Emphasis is placed on the advantages of free-form machining in aluminium. These include: resistance to launch stress, insensitivity to temperature variations from ambient to cryogenic, and the possibility of relatively complex mirror surface shapes.
2006 Y-12 National Security Complex Annual Illness and Injury Surveillance Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs
2008-04-17
The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of illness and injury surveillance activities that provide an early warning system to detect health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.
2009 Y-12 National Security Complex Annual Illness and Injury Surveillance Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs
2010-07-09
The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.
2008 Y-12 National Security Complex Annual Illness and Injury Surveillance Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs
2009-12-11
The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.
2010 Y-12 National Security Complex Annual Illness and Injury Surveillance Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs
2011-08-31
The U.S. Department of Energy's (DOE) commitment to assuring the health and safety of its workers includes the conduct of illness and injury surveillance activities that provide an early warning system to detect health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.
2007 Y-12 National Security Complex Annual Illness and Injury Surveillance Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs
2009-07-01
The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.
European Science Notes Information Bulletin Reports on Current European/ Middle Eastern Science
1991-06-01
particularly those that involve shock wave/boundary layer cell-centered, finite-volume, explicit, Runge-Kutta interactions, still provide considerable...aircraft configuration attributed to using an interactive visual grid generation was provided by A. Bocci and A. Baxendale, the Aircraft system developed...the surface pressure the complex problem of wing/body/pylon/store distributions with and without the mass flow through the interaction. Reasonable
Rights of Privacy and Research Needs: A Problem Whose Time Has Arrived.
ERIC Educational Resources Information Center
Hayman, John L. Jr.
There is no more fundamental right in our system than the right of privacy--the right to be let alone. Current trends lead to a major assault on this right, and one of the great tests of the viability of our system is its ability to preserve this right in the face of increasing complexity and increasing needs for control. As part of the scientific…
Toward the modelling of safety violations in healthcare systems.
Catchpole, Ken
2013-09-01
When frontline staff do not adhere to policies, protocols, or checklists, managers often regard these violations as indicating poor practice or even negligence. More often than not, however, these policy and protocol violations reflect the efforts of well intentioned professionals to carry out their work efficiently in the face of systems poorly designed to meet the diverse demands of patient care. Thus, non-compliance with institutional policies and protocols often signals a systems problem, rather than a people problem, and can be influenced among other things by training, competing goals, context, process, location, case complexity, individual beliefs, the direct or indirect influence of others, job pressure, flexibility, rule definition, and clinician-centred design. Three candidates are considered for developing a model of safety behaviour and decision making. The dynamic safety model helps to understand the relationship between systems designs and human performance. The theory of planned behaviour suggests that intention is a function of attitudes, social norms and perceived behavioural control. The naturalistic decision making paradigm posits that decisions are based on a wider view of multiple patients, expertise, systems complexity, behavioural intention, individual beliefs and current understanding of the system. Understanding and predicting behavioural safety decisions could help us to encourage compliance to current processes and to design better interventions.
[Complexity of social and healthcare coordination in addictions and the role of the nurse].
Molina Fernández, Antonio Jesús; González Riera, Javier; Montero Bancalero, Francisco José; Gómez-Salgado, Juan
2016-01-01
The present article discusses the psychosocial impact of basic and advanced concepts, such as social support and prevention, and aims to establish a link between theoretical models related to the social sphere on the one hand and health aspects on the other. This work is based on the context of the influence on health shared by community psychology and social psychology. Starting from the historical background of current approaches, a review is presented of the first actions focused on the care plan, which were framed in a reaction model to the drug problem and progressed to the current healthcare network model through the creation of the Spanish National Action Plan on Drugs. The complexity of the problem is then broken down into the following key elements: the Multifactorial Model of Drugs and Addictions, the importance of prevention, and social support. Subsequently, a description is presented of the different levels of the healthcare network, with their different resources, illustrated using a coordination protocol. Finally, the article presents the nursing approach to drugs, with its contributions, particularly as regards the coordination of resources, and the aspects that must be developed for improvement in this area. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Measuring flows in the solar interior: current developments, results, and outstanding problems
NASA Astrophysics Data System (ADS)
Schad, Ariane
2016-10-01
I will present an overview of current developments in determining flows in the solar interior and recent results from helioseismology. I will place special focus on the inference of the deep structure of the meridional flow, which is one of the most challenging problems in helioseismology. In recent times, promising approaches have been developed for solving this problem. Time-distance analysis made large improvements in this area after a systematic effect in the analysis, whose origin is not yet clear, was recognized and compensated for. In addition, a different approach is now available, which directly exploits the distortion of mode eigenfunctions by the meridional flow as well as rotation. These methods have revealed partly surprising, complex meridional flow patterns, which, however, do not yet provide a consistent picture of the flow. Resolving this puzzle is part of current research, since it has important consequences for our understanding of the solar dynamo. Another interesting discrepancy was found in recent studies between the amplitudes of the large- and small-scale dynamics in the convection zone estimated from helioseismology and those predicted from theoretical models. This raises fundamental questions about how the Sun, and in general a star, maintains its heat transport and redistributes its angular momentum, which lead, e.g., to the observed differential rotation.
NASA Astrophysics Data System (ADS)
Loepp, Susan; Wootters, William K.
2006-09-01
For many everyday transmissions, it is essential to protect digital information from noise or eavesdropping. This undergraduate introduction to error correction and cryptography is unique in devoting several chapters to quantum cryptography and quantum computing, thus providing a context in which ideas from mathematics and physics meet. By covering such topics as Shor's quantum factoring algorithm, this text informs the reader about current thinking in quantum information theory and encourages an appreciation of the connections between mathematics and science. Of particular interest are the potential impacts of quantum physics: (i) a quantum computer, if built, could crack our currently used public-key cryptosystems; and (ii) quantum cryptography promises to provide an alternative to these cryptosystems, basing its security on the laws of nature rather than on computational complexity. No prior knowledge of quantum mechanics is assumed, but students should have a basic knowledge of complex numbers, vectors, and matrices. The book is accessible to readers familiar with matrix algebra, vector spaces and complex numbers; it is the first undergraduate text to cover cryptography, error correction, and quantum computation together; and it features exercises designed to enhance understanding, including a number of computational problems, available from www.cambridge.org/9780521534765
Multiphysics analysis of liquid metal annular linear induction pumps: A project overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maidana, Carlos Omar; Nieminen, Juha E.
2016-03-14
Liquid metal-cooled fission reactors are both moderated and cooled by a liquid metal solution. These reactors are typically very compact, and they can be used in regular electric power production, for naval and space propulsion systems, or in fission surface power systems for planetary exploration. The coupling between the electromagnetic and thermo-fluid mechanical phenomena observed in liquid metal thermo-magnetic systems for nuclear and space applications gives rise to complex engineering magnetohydrodynamics and numerical problems. It is known that electromagnetic pumps have a number of advantages over rotating mechanisms: absence of moving parts, low noise and vibration level, simplicity of flow rate regulation, easy maintenance, and so on. However, while developing annular linear induction pumps, we are faced with a significant problem of magnetohydrodynamic instability arising in the device. The complex flow behavior in this type of device includes a time-varying Lorentz force and pressure pulsation due to the time-varying electromagnetic fields and the induced convective currents that originate from the liquid metal flow, leading to instability problems along the device geometry. The determination of the geometry and electrical configuration of liquid metal thermo-magnetic devices gives rise to a complex inverse magnetohydrodynamic field problem where techniques for global optimization should be used, magnetohydrodynamic instabilities understood (or quantified), and multiphysics models developed and analyzed. Lastly, we present a project overview as well as a few computational models developed to study liquid metal annular linear induction pumps using first principles, and a few results of our multiphysics analysis.
Genome-wide detection of intervals of genetic heterogeneity associated with complex traits
Llinares-López, Felipe; Grimm, Dominik G.; Bodenham, Dean A.; Gieraths, Udo; Sugiyama, Mahito; Rowan, Beth; Borgwardt, Karsten
2015-01-01
Motivation: Genetic heterogeneity, the fact that several sequence variants give rise to the same phenotype, is a phenomenon that is of the utmost interest in the analysis of complex phenotypes. Current approaches for finding regions in the genome that exhibit genetic heterogeneity suffer from at least one of two shortcomings: (i) they require the definition of an exact interval in the genome that is to be tested for genetic heterogeneity, potentially missing intervals of high relevance, or (ii) they suffer from an enormous multiple hypothesis testing problem due to the large number of potential candidate intervals being tested, which results in either many false positives or a lack of power to detect true intervals. Results: Here, we present an approach that overcomes both problems: it allows one to automatically find all contiguous sequences of single nucleotide polymorphisms in the genome that are jointly associated with the phenotype. It also solves both the inherent computational efficiency problem and the statistical problem of multiple hypothesis testing, which are both caused by the huge number of candidate intervals. We demonstrate on Arabidopsis thaliana genome-wide association study data that our approach can discover regions that exhibit genetic heterogeneity and would be missed by single-locus mapping. Conclusions: Our novel approach can contribute to the genome-wide discovery of intervals that are involved in the genetic heterogeneity underlying complex phenotypes. Availability and implementation: The code can be obtained at: http://www.bsse.ethz.ch/mlcb/research/bioinformatics-and-computational-biology/sis.html. Contact: felipe.llinares@bsse.ethz.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072488
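The interval-collapsing idea at the heart of this approach can be sketched in a few lines: pool the minor alleles of all SNPs in a candidate interval into one "carrier" indicator, score that indicator's association with the phenotype, and search over all contiguous intervals. Everything below (the toy genotypes, the plain chi-square score, the brute-force search) is an illustrative reduction, not the authors' significant-interval-search implementation, which prunes this search and corrects for multiple testing.

```python
def interval_association(genotypes, phenotype, start, end):
    """Collapse SNPs in [start, end) into one meta-variant: a sample is a
    'carrier' if it carries a minor allele at ANY SNP in the interval.
    This OR-encoding is what captures genetic heterogeneity."""
    carrier = [any(row[start:end]) for row in genotypes]
    a = sum(1 for c, p in zip(carrier, phenotype) if c and p)          # case carriers
    b = sum(1 for c, p in zip(carrier, phenotype) if c and not p)      # control carriers
    c_ = sum(1 for c, p in zip(carrier, phenotype) if not c and p)
    d = sum(1 for c, p in zip(carrier, phenotype) if not c and not p)
    n = a + b + c_ + d
    den = (a + b) * (c_ + d) * (a + c_) * (b + d)
    return n * (a * d - b * c_) ** 2 / den if den else 0.0             # chi-square

def best_interval(genotypes, phenotype):
    """Score every contiguous SNP interval by brute force. The paper's
    contribution is doing this search *efficiently*; this loop only shows
    what is being searched."""
    m = len(genotypes[0])
    scored = ((interval_association(genotypes, phenotype, i, j), (i, j))
              for i in range(m) for j in range(i + 1, m + 1))
    return max(scored)

# toy data: heterogeneity at SNPs 0-1 (either variant implies case status)
genotypes = [[1, 0, 0], [0, 1, 0], [1, 0, 0], [0, 1, 0],
             [0, 0, 1], [0, 0, 0], [0, 0, 1], [0, 0, 0]]
phenotype = [1, 1, 1, 1, 0, 0, 0, 0]
score, (i, j) = best_interval(genotypes, phenotype)
```

On this toy data, single-locus tests of SNP 0 or SNP 1 alone each score lower than the joint interval [0, 2), which is exactly the effect the abstract describes.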
NASA Astrophysics Data System (ADS)
Ibrahim, Ireen Munira; Liong, Choong-Yeun; Bakar, Sakhinah Abu; Ahmad, Norazura; Najmuddin, Ahmad Farid
2015-12-01
The Emergency Department (ED) is a very complex system with limited resources to support increases in demand. ED services are considered to be of good quality if they can meet patients' expectations. Long waiting times and length of stay are the main problems faced by the management. The management of the ED should place greater emphasis on its capacity of resources in order to increase the quality of services, which conforms to patient satisfaction. This paper is a review of work in progress of a study being conducted in a government hospital in Selangor, Malaysia. This paper proposes a simulation optimization model framework which is used to study ED operations and problems as well as to find an optimal solution to the problems. The integration of simulation and optimization, it is hoped, can assist management in the decision-making process regarding resource capacity planning in order to improve current and future ED operations.
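Before a full simulation optimization model, the capacity question can be framed analytically: treat the ED as an M/M/c queue and find the fewest servers whose expected waiting time meets a target. This is only a back-of-envelope sketch of the resource-capacity idea; the arrival rate, service rate, and wait target below are invented illustration values, and the study itself uses discrete-event simulation rather than this closed form.

```python
from math import factorial

def erlang_c_wait(lam, mu, c):
    """Mean queueing delay Wq for an M/M/c system (Erlang C formula)."""
    rho = lam / (c * mu)
    if rho >= 1:
        return float("inf")          # unstable: the queue grows without bound
    a = lam / mu                     # offered load in Erlangs
    wait_term = a**c / (factorial(c) * (1 - rho))
    p0_inv = sum(a**k / factorial(k) for k in range(c)) + wait_term
    prob_wait = wait_term / p0_inv   # probability an arrival must queue
    return prob_wait / (c * mu - lam)

def min_servers(lam, mu, target_wait, c_max=50):
    """Smallest staffing level whose mean queueing delay meets the target."""
    for c in range(1, c_max + 1):
        if erlang_c_wait(lam, mu, c) <= target_wait:
            return c
    raise ValueError("no feasible staffing level below c_max")

# illustration: 12 arrivals/hour, each server treats 4/hour, 10-minute wait target
c_star = min_servers(lam=12, mu=4, target_wait=10 / 60)
```

A simulation optimization framework generalizes this: the analytic wait is replaced by a simulated ED model, and the search over `c` by an optimizer over several resource types.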
Winskell, Kate; McFarland, Deborah A.; del Rio, Carlos
2015-01-01
Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013–2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health–Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned. PMID:25706029
From problem solving to problem definition: scrutinizing the complex nature of clinical practice.
Cristancho, Sayra; Lingard, Lorelei; Regehr, Glenn
2017-02-01
In medical education, we have tended to present problems as being singular, stable, and solvable. Problem solving has, therefore, drawn much of medical education researchers' attention. This focus has been important but it is limited in terms of preparing clinicians to deal with the complexity of the 21st century healthcare system in which they will provide team-based care for patients with complex medical illness. In this paper, we use the Soft Systems Engineering principles to introduce the idea that in complex, team-based situations, problems usually involve divergent views and evolve with multiple solution iterations. As such we need to shift the conversation from (1) problem solving to problem definition, and (2) from a problem definition derived exclusively at the level of the individual to a definition derived at the level of the situation in which the problem is manifested. Embracing such a focus on problem definition will enable us to advocate for novel educational practices that will equip trainees to effectively manage the problems they will encounter in complex, team-based healthcare.
Developing Students' Understanding of Complex Systems in the Geosciences (Invited)
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Mogk, D. W.; Bice, D. M.; Pyle, E.; Slotta, J.
2010-12-01
Developing a systems perspective is a commonly cited goal for geosciences courses and programs. This perspective is a powerful tool for critical thinking, problem solving and integrative thinking across and beyond the sciences. In April 2010, a NSF funded ‘On the Cutting Edge’ workshop brought together 45 geoscience faculty, education and cognitive science researchers, and faculty from other STEM and social science disciplines that make use of a complex systems approach. The workshop participants focused on understanding the challenges inherent in developing an understanding of complex systems and the teaching strategies currently in use across the disciplines. These include using models and visualizations to allow students to experiment with complex systems, using projects and problems to give students experience with data and observations derived from a complex system, and using illustrated lectures and discussions and analogies to illuminate the salient aspects of complex systems. The workshop website contains a collection of teaching activities, instructional resources and courses that demonstrate these approaches. The workshop participants concluded that research leading to a clear articulation of what constitutes understanding complex system behavior is needed, as are instruments and performance measures that could be used to assess this understanding. Developing the ability to recognize complex systems and understand their behavior is a significant learning task that cannot be achieved in a single course. Rather it is a type of literacy that should be taught in a progression extending from elementary school to college and across the disciplines. Research defining this progression and its endpoints is needed. Full information about the workshop, its discussions, and resulting collections of courses, activities, references and ideas are available on the workshop website.
Teaching children with autism to explain how: A case for problem solving?
Frampton, Sarah E; Alice Shillingsburg, M
2018-04-01
Few studies have applied Skinner's (1953) conceptualization of problem solving to teach socially significant behaviors to individuals with developmental disabilities. The current study used a multiple probe design across behavior (sets) to evaluate the effects of problem-solving strategy training (PSST) on the target behavior of explaining how to complete familiar activities. During baseline, none of the three participants with autism spectrum disorder (ASD) could respond to the problems presented to them (i.e., explain how to do the activities). Tact training of the actions in each activity alone was ineffective; however, all participants demonstrated independent explaining-how following PSST. Further, following PSST with Set 1, tact training alone was sufficient for at least one scenario in sets 2 and 3 for all 3 participants. Results have implications for generative responding for individuals with ASD and further the discussion regarding the role of problem solving in complex verbal behavior. © 2018 Society for the Experimental Analysis of Behavior.
Lattice Boltzmann computation of creeping fluid flow in roll-coating applications
NASA Astrophysics Data System (ADS)
Rajan, Isac; Kesana, Balashanker; Perumal, D. Arumuga
2018-04-01
The Lattice Boltzmann Method (LBM) has advanced as a class of Computational Fluid Dynamics (CFD) methods used to solve complex fluid systems and heat transfer problems. It has increasingly attracted the interest of researchers in computational physics to solve challenging problems of industrial and academic importance. In the current study, LBM is applied to simulate the creeping fluid flow phenomena commonly encountered in manufacturing technologies. In particular, we apply this method to simulate the fluid flow phenomena associated with the "meniscus roll coating" application. This prevalent industrial problem, encountered in polymer processing and thin-film coating applications, is modelled as a standard lid-driven cavity problem to which creeping flow analysis is applied. This incompressible viscous flow problem is studied at various speed ratios (the ratio of upper to lower lid speed) in two different configurations of lid movement: parallel and anti-parallel wall motion. The flow exhibits interesting patterns that will help in the design of roll coaters.
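The collide-and-stream core of any LBM solver looks roughly as follows. This D2Q9 sketch uses periodic boundaries only, so the bounce-back walls and moving lids that define the actual roll-coating cavity problem are left out; the grid size, relaxation time, and initial shear perturbation are arbitrary illustration values, not parameters from the study.

```python
import numpy as np

# D2Q9 lattice: discrete velocities c_i and quadrature weights w_i
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4 / 9] + [1 / 9] * 4 + [1 / 36] * 4)

def equilibrium(rho, ux, uy):
    """Second-order expansion of the Maxwell-Boltzmann equilibrium f_i^eq."""
    cu = C[:, 0, None, None] * ux + C[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return W[:, None, None] * rho * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

def collide_and_stream(f, tau):
    """One BGK step: relax toward equilibrium, then hop populations along c_i."""
    rho = f.sum(axis=0)
    ux = (f * C[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * C[:, 1, None, None]).sum(axis=0) / rho
    f = f + (equilibrium(rho, ux, uy) - f) / tau      # collision (mass-conserving)
    for i in range(9):                                # periodic streaming via roll
        f[i] = np.roll(np.roll(f[i], C[i, 0], axis=0), C[i, 1], axis=1)
    return f

nx = ny = 32
rho0 = np.ones((nx, ny))
ux0 = 0.05 * np.sin(2 * np.pi * np.arange(nx) / nx)[:, None] * np.ones((1, ny))
uy0 = np.zeros((nx, ny))
f = equilibrium(rho0, ux0, uy0)                       # start from equilibrium
mass0 = f.sum()
for _ in range(100):
    f = collide_and_stream(f, tau=0.8)
```

A cavity solver for the roll-coating problem would replace the periodic wrap at the four walls with bounce-back, adding a momentum correction on the two lid rows to impose the upper and lower lid speeds that set the speed ratio.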
Current and Future Critical Issues in Rocket Propulsion Systems
NASA Technical Reports Server (NTRS)
Navaz, Homayun K.; Dix, Jeff C.
1998-01-01
The objective of this research was to tackle several problems that are currently of great importance to NASA. In a liquid rocket engine, several complex processes take place that are not thoroughly understood. Droplet evaporation, turbulence, finite-rate chemistry, instability, and injection/atomization phenomena are some of the critical issues encountered in a liquid rocket engine environment. Pulse Detonation Engine (PDE) performance, combustion chamber instability analysis, the 60K motor flowfield pattern from hydrocarbon fuel combustion, and 3D flowfield analysis for the Combined Cycle engine were of special interest to NASA. During the summer of 1997, we made an attempt to generate computational results for all of the above problems and shed some light on understanding some of the complex physical phenomena. For this purpose, the Liquid Thrust Chamber Performance (LTCP) code, mainly designed for liquid rocket engine applications, was utilized. The following test cases were considered: (1) characterization of a detonation wave in a Pulse Detonation Tube; (2) 60K motor wall temperature studies; (3) propagation of a pressure pulse in a combustion chamber (under single- and two-phase flow conditions); (4) transonic region flowfield analysis affected by viscous effects; (5) exploring the viscous differences between a smooth and a corrugated wall; and (6) 3D thrust chamber flowfield analysis of the Combined Cycle engine. It was shown that the LTCP-2D and LTCP-3D codes are capable of solving complex and stiff conservation equations for gaseous and droplet phases in a very robust and efficient manner. These codes can be run on workstations and personal computers (PCs).
A heuristic for efficient data distribution management in distributed simulation
NASA Astrophysics Data System (ADS)
Gupta, Pankaj; Guha, Ratan K.
2005-05-01
In this paper, we propose an algorithm for reducing the complexity of region matching and for efficient multicasting in the data distribution management component of the High Level Architecture (HLA) Run Time Infrastructure (RTI). Current data distribution management (DDM) techniques rely on computing the intersection between subscription and update regions. When a subscription region and an update region of different federates overlap, the RTI establishes communication between the publisher and the subscriber and subsequently routes updates from the publisher to the subscriber. The proposed algorithm computes the update/subscription region matching for dynamic allocation of multicast groups. It provides new multicast routines that exploit the connectivity of the federation by communicating updates regarding interactions and routing information only to those federates that require them. The region-matching problem in DDM reduces to the clique-covering problem under a connection-graph abstraction in which the federates represent the vertices and the update/subscribe relations represent the edges. We develop an abstract model based on the connection graph for data distribution management. Using this abstract model, we propose a heuristic for solving the region-matching problem of DDM. We also provide a complexity analysis of the proposed heuristic.
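The region-matching step that the heuristic accelerates can be written down directly. This sketch performs the naive all-pairs overlap test over axis-aligned routing-space extents, with invented federate and region names, to make the baseline cost concrete: every update region is checked against every subscription region, which is what the connection-graph heuristic tries to avoid.

```python
def overlaps(update, subscription):
    """Two axis-aligned extents overlap iff they overlap on every axis
    (half-open intervals, so merely touching regions do not match)."""
    return all(u_lo < s_hi and s_lo < u_hi
               for (u_lo, u_hi), (s_lo, s_hi) in zip(update, subscription))

def match_regions(updates, subscriptions):
    """Brute-force DDM matching: for each update region, list every
    overlapping subscription region. Cost is O(U * S) overlap tests."""
    return {u_id: [s_id for s_id, s in subscriptions.items() if overlaps(u, s)]
            for u_id, u in updates.items()}

# hypothetical federates publishing/subscribing 2-D extents (x-range, y-range)
updates = {"tankA": [(0, 10), (0, 10)], "jetB": [(50, 60), (50, 60)]}
subscriptions = {"radar1": [(5, 20), (5, 20)], "radar2": [(0, 100), (0, 100)]}
groups = match_regions(updates, subscriptions)
```

Each resulting list corresponds to one candidate multicast group: updates from `tankA` need only reach the federates whose subscription regions it overlaps, rather than being broadcast federation-wide.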
Gentle Masking of Low-Complexity Sequences Improves Homology Search
Frith, Martin C.
2011-01-01
Detection of sequences that are homologous, i.e. descended from a common ancestor, is a fundamental task in computational biology. This task is confounded by low-complexity tracts (such as atatatatatat), which arise frequently and independently, causing strong similarities that are not homologies. There has been much research on identifying low-complexity tracts, but little research on how to treat them during homology search. We propose to find homologies by aligning sequences with “gentle” masking of low-complexity tracts. Gentle masking means that the match score involving a masked letter is min(0, s), where s is the unmasked score. Gentle masking slightly but noticeably improves the sensitivity of homology search (compared to “harsh” masking), without harming specificity. We show examples in three useful homology search problems: detection of NUMTs (nuclear copies of mitochondrial DNA), recruitment of metagenomic DNA reads to reference genomes, and pseudogene detection. Gentle masking is currently the best way to treat low-complexity tracts during homology search. PMID:22205972
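Under a toy +1/-1 match/mismatch scheme, the three treatments of masked letters compare as follows. Real aligners use substitution matrices and gapped alignment, and the "harsh" mode here (masked letters always score as mismatches) is one simplified reading of harsh masking, so this is only a sketch of the scoring rule, not the tool's implementation.

```python
def pair_score(a, b, mode="gentle"):
    """Score one aligned letter pair. Lowercase marks a masked
    (low-complexity) letter, as in common repeat-masker output."""
    s = 1 if a.upper() == b.upper() else -1     # toy match/mismatch scores
    masked = a.islower() or b.islower()
    if not masked or mode == "none":
        return s
    if mode == "harsh":
        return -1                # masked letters can never reward an alignment
    return min(0, s)             # gentle: matches give 0, mismatches still penalize

def alignment_score(x, y, mode="gentle"):
    """Score an ungapped alignment of two equal-length strings."""
    return sum(pair_score(a, b, mode) for a, b in zip(x, y))

# a true homology whose alignment runs through a masked AT-repeat (lowercase)
x = "GCGTA" + "atatat" + "CCGTA"
y = "GCGTA" + "atatat" + "CCGTA"
scores = {m: alignment_score(x, y, m) for m in ("none", "harsh", "gentle")}
```

The point of the comparison: with no masking the repeat inflates the score (risking false homologies), harsh masking actively penalizes a genuine alignment through the repeat, while gentle masking lets the alignment pass through at zero gain, preserving sensitivity without rewarding the low-complexity similarity.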
Martin, Aaron; Kistler, Charles Andrew; Wrobel, Piotr; Yang, Juliana F.; Siddiqui, Ali A.
2016-01-01
The management of pancreaticobiliary disease in patients with surgically altered anatomy is a growing problem for gastroenterologists today. Over the years, endoscopic ultrasound (EUS) has emerged as an important diagnostic and therapeutic modality in the treatment of pancreaticobiliary disease. Patient anatomy has become increasingly complex due to advances in surgical resection of pancreaticobiliary disease, and EUS has emerged as the therapy of choice when cannulation during endoscopic retrograde cholangiopancreatography has failed or when the papilla is inaccessible, such as in gastric or duodenal obstruction. The current article gives a comprehensive review of the current literature on EUS-guided intervention of the pancreaticobiliary tract in patients with altered surgical anatomy. PMID:27386471
Toward Modeling the Intrinsic Complexity of Test Problems
ERIC Educational Resources Information Center
Shoufan, Abdulhadi
2017-01-01
The concept of intrinsic complexity explains why different problems of the same type, tackled by the same problem solver, can require different times to solve and yield solutions of different quality. This paper proposes a general four-step approach that can be used to establish a model for the intrinsic complexity of a problem class in terms of…
ERIC Educational Resources Information Center
Tang, Hui; Kirk, John; Pienta, Norbert J.
2014-01-01
This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…
Building information modelling review with potential applications in tunnel engineering of China.
Zhou, Weihong; Qin, Haiyang; Qiu, Junling; Fan, Haobo; Lai, Jinxing; Wang, Ke; Wang, Lixin
2017-08-01
Building information modelling (BIM) can be applied to tunnel engineering to address a number of problems, including complex structure, extensive design, long construction cycles and increased security risks. To promote the development of tunnel engineering in China, this paper combines actual cases, including the Xingu Mountain tunnel and the Shigu Mountain tunnel, to systematically analyse BIM applications in tunnel engineering in China. The results indicate that BIM technology in tunnel engineering is currently applied mainly during the design stage rather than during the construction and operation stages. The application of BIM technology in tunnel engineering still faces many problems, such as a lack of standards, incompatibility of different software, disorganized management, complex integration with GIS (Geographic Information System), low utilization rates and poor awareness. In this study, through a summary of related research results and engineering cases, suggestions are introduced and an outlook for BIM application in tunnel engineering in China is presented, which provides guidance for design optimization, construction standards and later operation and maintenance.
Understanding and simulating the material behavior during multi-particle irradiations
Mir, Anamul H.; Toulemonde, M.; Jegou, C.; Miro, S.; Serruys, Y.; Bouffard, S.; Peuget, S.
2016-01-01
A number of studies have suggested that the irradiation behavior and damage processes occurring during sequential and simultaneous particle irradiations can differ significantly. Currently, there is no definite answer as to why and when such differences are seen. Additionally, conventional multi-particle irradiation facilities cannot correctly reproduce the complex irradiation scenarios experienced in a number of environments, such as space and nuclear reactors. Therefore, a better understanding of multi-particle irradiation problems and possible alternatives are needed. This study shows ionization-induced thermal spikes and defect recovery during sequential and simultaneous ion irradiation of amorphous silica. The simultaneous irradiation scenario is shown to be equivalent to multiple small sequential irradiation scenarios containing latent damage formation and recovery mechanisms. The results highlight the absence of any new damage mechanism and of any time-space correlation between the various damage events during simultaneous irradiation of amorphous silica. This offers a new and convenient way to simulate and understand complex multi-particle irradiation problems. PMID:27466040
Social adversities in first-time and repeat prisoners.
Kjelsberg, Ellen; Friestad, Christine
2008-11-01
To explore possible systematic differences between prison inmates serving their first sentence and inmates who have experienced previous incarcerations. It is hoped that a better knowledge of these issues will make us better equipped to meet the rehabilitation needs of our prisoners and decrease their risk of reoffending and reincarceration. In this cross-sectional study, a randomly selected and nationally representative sample of 260 Norwegian prisoners, 100 serving their first sentence and 160 recidivists, was interviewed with special focus on childhood circumstances, education, work experience, and present social and economic situation. In addition, their criminal records were collected from the National Crime Registry. In males, multivariate analyses identified a number of variables independently and significantly associated with being a repeat offender. The odds for reincarceration increased significantly if the person fulfilled any one of the following criteria: having experienced the incarceration of a family member during childhood (OR = 3.6); having experienced childcare interventions during childhood (OR = 3.2); current drug abuse (OR = 2.6); current housing problems (OR = 2.3). In females, only one strong correlation emerged: if the person had current drug problems, the odds for being a recidivist increased substantially (OR = 10.9). While criminal reoffending and reincarceration seemed to be primarily associated with drug abuse in females, the childhood problems of male repeat offenders, compared with males serving their first sentence, indicate that these individuals' current multiple social and economic disadvantages were complex in origin and of long standing. Interventions aimed at preventing reoffending must take into account the gender differences demonstrated. When aiming at primary prevention, the negative effects associated with parental incarceration are crucial: how can one prevent the perpetuation of these problems from one generation to the next?
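The odds ratios quoted above come from multivariate (logistic-regression) models; the basic quantity behind an OR is the cross-product ratio of a 2x2 exposure-by-outcome table, sketched here with invented counts that are not the study's data.

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Cross-product odds ratio from a 2x2 table: (a/b) / (c/d).
    OR > 1 means the exposure is associated with higher odds of the outcome."""
    return ((exposed_cases / exposed_noncases)
            / (unexposed_cases / unexposed_noncases))

# hypothetical counts: current drug problems (exposure) vs. recidivism (outcome)
or_drugs = odds_ratio(30, 10, 10, 30)
```

In a fitted logistic regression, each reported OR is the exponentiated coefficient of that predictor, adjusted for the others, which is why the abstract can quote one OR per risk factor.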
Army Science & Technology: Problems and Challenges
2012-03-01
Boundary conditions. Who: small units in COIN/stability operations. What: provide affordable real-time translations; the demand on Soldiers, leaders, and units in complex tactical operations exceeds the Army's current home-station capability. Challenge: formulate an S&T program to capture, process, and electronically disseminate near-real-time medical information on Soldiers and advance trauma management.
Multigrid Methods for Aerodynamic Problems in Complex Geometries
NASA Technical Reports Server (NTRS)
Caughey, David A.
1995-01-01
Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.
2015-07-14
AFRL-OSR-VA-TR-2015-0202. Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions (grant FA9550-12-1...). The project studies how teams function as they solve complex problems, and proposes means to improve the performance of teams under changing or adversarial conditions.
[Sexually transmitted diseases: the impact of stigma and taboo on current medical care].
Badura-Lotter, G
2014-04-01
Sexually transmitted diseases (STD) are probably the most tabooed diseases we know. The many taboos and the related stigmata shape patients' lives and significantly influence health care policies, medical research, and current problems in medical ethics. To better understand these complex influences, the still powerful taboos and related metaphors associated with illness and disease are analyzed within their cultural and historical background and concerning the actual impact on patient care and research. It becomes obvious that research and health care policies cannot be satisfyingly successful in helping people affected by STDs as long as these "nonscientific" factors are not taken into account.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKone, Thomas E.; Deshpande, Ashok W.
2004-06-14
In modeling complex environmental problems, we often fail to make precise statements about inputs and outcomes. In such cases the fuzzy logic method native to the human mind provides a useful way to get at these problems. Fuzzy logic represents a significant change in both the approach to and outcome of environmental evaluations. Risk assessment is currently based on the implicit premise that probability theory provides the necessary and sufficient tools for dealing with uncertainty and variability. The key advantage of fuzzy methods is the way they reflect the human mind in its remarkable ability to store and process information which is consistently imprecise, uncertain, and resistant to classification. Our case study illustrates the ability of fuzzy logic to integrate statistical measurements with imprecise health goals. But we submit that fuzzy logic and probability theory are complementary and not competitive. In the world of soft computing, fuzzy logic has been widely used and has often been the "smart" behind smart machines. But it will require more effort and case studies to establish its niche in risk assessment or other types of impact assessment. Although we often hear complaints about "bright lines," could we adapt to a system that relaxes these lines to fuzzy gradations? Would decision makers and the public accept expressions of water or air quality goals in linguistic terms with computed degrees of certainty? Resistance is likely. In many regions, such as the US and the European Union, it is likely that both decision makers and members of the public are more comfortable with our current system, in which government agencies avoid confronting uncertainties by setting guidelines that are crisp and often fail to communicate uncertainty. But some day a more comprehensive approach that includes exposure surveys, toxicological data, and epidemiological studies coupled with fuzzy modeling may go a long way toward resolving some of the conflict, divisiveness, and controversy in the current regulatory paradigm.
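As a minimal sketch of what "linguistic terms with computed degrees of certainty" could look like in practice, the following replaces a crisp guideline with trapezoidal fuzzy membership functions. All class names and boundaries are invented for illustration, not taken from any regulation:

```python
# A crisp 10-ppb "bright line" becomes a graded boundary: a sample can belong
# partly to "acceptable" and partly to "marginal" at the same time.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, rises to 1 on [b, c], falls to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def classify(concentration_ppb):
    """Degrees of membership in linguistic water-quality classes (hypothetical)."""
    return {
        "acceptable": trapezoid(concentration_ppb, -1.0, 0.0, 5.0, 10.0),
        "marginal":   trapezoid(concentration_ppb, 5.0, 10.0, 15.0, 20.0),
        "poor":       trapezoid(concentration_ppb, 15.0, 20.0, 100.0, 101.0),
    }

# A sample at 8 ppb is 40% "acceptable" and 60% "marginal" rather than falling
# cleanly on one side of a crisp line.
print(classify(8.0))
```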
Parallel stochastic simulation of macroscopic calcium currents.
González-Vélez, Virginia; González-Vélez, Horacio
2007-06-01
This work introduces MACACO, a macroscopic calcium currents simulator. It provides a parameter-sweep framework which computes macroscopic Ca(2+) currents from the individual aggregation of unitary currents, using a stochastic model for L-type Ca(2+) channels. MACACO uses a simplified 3-state Markov model to simulate the response of each Ca(2+) channel to different voltage inputs to the cell. In order to provide an accurate systematic view for the stochastic nature of the calcium channels, MACACO is composed of an experiment generator, a central simulation engine and a post-processing script component. Due to the computational complexity of the problem and the dimensions of the parameter space, the MACACO simulation engine employs a grid-enabled task farm. Having been designed as a computational biology tool, MACACO heavily borrows from the way cell physiologists conduct and report their experimental work.
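A minimal sketch of the aggregation scheme the abstract describes: each channel is an independent 3-state Markov chain, and the macroscopic current is the sum of unitary currents over all open channels. The transition probabilities, states, and unitary current below are illustrative stand-ins, not MACACO's actual model or values:

```python
import random

STATES = ("closed", "open", "inactivated")

def step(state, p_open=0.3, p_inact=0.1, p_close=0.2):
    """Advance one channel by one time step with fixed transition probabilities."""
    r = random.random()
    if state == "closed":
        return "open" if r < p_open else "closed"
    if state == "open":
        if r < p_inact:
            return "inactivated"
        if r < p_inact + p_close:
            return "closed"
        return "open"
    return "inactivated"   # absorbing in this simplified sketch

def macroscopic_current(n_channels=1000, n_steps=50, i_unitary=-0.3, seed=1):
    """Sum the unitary currents (pA) of all open channels at each time step."""
    random.seed(seed)
    channels = ["closed"] * n_channels
    trace = []
    for _ in range(n_steps):
        channels = [step(s) for s in channels]
        trace.append(i_unitary * sum(s == "open" for s in channels))
    return trace

trace = macroscopic_current()
print(min(trace), max(trace))   # inward (negative) current that waxes then wanes
```

In a parameter-sweep framework like the one described, each (voltage, rate-constant) combination would be one such independent simulation, which is why the problem farms out naturally to a grid.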
Scene analysis in the natural environment
Lewicki, Michael S.; Olshausen, Bruno A.; Surlykke, Annemarie; Moss, Cynthia F.
2014-01-01
The problem of scene analysis has been studied in a number of different fields over the past decades. These studies have led to important insights into problems of scene analysis, but not all of these insights are widely appreciated, and there remain critical shortcomings in current approaches that hinder further progress. Here we take the view that scene analysis is a universal problem solved by all animals, and that we can gain new insight by studying the problems that animals face in complex natural environments. In particular, the jumping spider, songbird, echolocating bat, and electric fish, all exhibit behaviors that require robust solutions to scene analysis problems encountered in the natural environment. By examining the behaviors of these seemingly disparate animals, we emerge with a framework for studying scene analysis comprising four essential properties: (1) the ability to solve ill-posed problems, (2) the ability to integrate and store information across time and modality, (3) efficient recovery and representation of 3D scene structure, and (4) the use of optimal motor actions for acquiring information to progress toward behavioral goals. PMID:24744740
A quantum annealing approach for fault detection and diagnosis of graph-based systems
NASA Astrophysics Data System (ADS)
Perdomo-Ortiz, A.; Fluegemann, J.; Narasimhan, S.; Biswas, R.; Smelyanskiy, V. N.
2015-02-01
Diagnosing the minimal set of faults capable of explaining a set of given observations, e.g., from sensor readouts, is a hard combinatorial optimization problem usually tackled with artificial intelligence techniques. We present the mapping of this combinatorial problem to quadratic unconstrained binary optimization (QUBO), and the experimental results of instances embedded onto a quantum annealing device with 509 quantum bits. Besides being the first time a quantum approach has been proposed for problems in the advanced diagnostics community, to the best of our knowledge this work is also the first research utilizing the route Problem → QUBO → Direct embedding into quantum hardware, where we are able to implement and tackle problem instances with sizes that go beyond previously reported toy-model proof-of-principle quantum annealing implementations; this is a significant leap in the solution of problems via direct-embedding adiabatic quantum optimization. We discuss some of the programmability challenges in the current generation of the quantum device as well as a few possible ways to extend this work to more complex arbitrary network graphs.
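The Problem → QUBO route can be illustrated on a toy instance: binary variable x_i = 1 marks component i as faulty, each observed symptom must be explained by at least one of its candidate faults, and the objective favors the fewest faults. The penalty construction below is a common textbook mapping, not the authors' exact encoding, and it stays quadratic only for symptoms with at most two candidate faults:

```python
import itertools

def qubo_energy(x, symptoms, penalty=10.0):
    """Energy = number of faults + penalty per unexplained symptom.
    For a symptom with candidates (a, b), the term (1-x_a)(1-x_b) expands to
    1 - x_a - x_b + x_a*x_b, a valid quadratic (QUBO) contribution."""
    energy = float(sum(x))
    for candidates in symptoms:
        unexplained = 1.0
        for i in candidates:
            unexplained *= (1 - x[i])   # 1 iff no candidate fault is active
        energy += penalty * unexplained
    return energy

def brute_force(n, symptoms):
    """Exhaustive minimum over all 2^n assignments (stand-in for the annealer)."""
    return min(itertools.product((0, 1), repeat=n),
               key=lambda x: qubo_energy(x, symptoms))

# Symptom s0 can be caused by fault 0 or 1, s1 only by fault 1, s2 by 1 or 2.
symptoms = [(0, 1), (1,), (1, 2)]
print(brute_force(3, symptoms))   # → (0, 1, 0): fault 1 alone explains everything
```

On hardware, the same energy function would be handed to the annealer instead of enumerated; the embedding step maps its variables onto physical qubits.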
Salvatore, Jessica E; Aliev, Fazil; Edwards, Alexis C; Evans, David M; Macleod, John; Hickman, Matthew; Lewis, Glyn; Kendler, Kenneth S; Loukola, Anu; Korhonen, Tellervo; Latvala, Antti; Rose, Richard J; Kaprio, Jaakko; Dick, Danielle M
2014-04-10
Alcohol problems represent a classic example of a complex behavioral outcome that is likely influenced by many genes of small effect. A polygenic approach, which examines aggregate measured genetic effects, can have predictive power in cases where individual genes or genetic variants do not. In the current study, we first tested whether polygenic risk for alcohol problems, derived from genome-wide association estimates of an alcohol problems factor score from the age 18 assessment of the Avon Longitudinal Study of Parents and Children (ALSPAC; n = 4304 individuals of European descent; 57% female), predicted alcohol problems earlier in development (age 14) in an independent sample (FinnTwin12; n = 1162; 53% female). We then tested whether environmental factors (parental knowledge and peer deviance) moderated polygenic risk to predict alcohol problems in the FinnTwin12 sample. We found evidence both for polygenic association and for additive polygene-environment interaction. Higher polygenic scores predicted a greater number of alcohol problems (range of Pearson partial correlations 0.07-0.08, all p-values ≤ 0.01). Moreover, genetic influences were significantly more pronounced under conditions of low parental knowledge or high peer deviance (unstandardized regression coefficients (b), p-values (p), and percent of variance (R2) accounted for by interaction terms: b = 1.54, p = 0.02, R2 = 0.33%; b = 0.94, p = 0.04, R2 = 0.30%, respectively). Supplementary set-based analyses indicated that the individual top single nucleotide polymorphisms (SNPs) contributing to the polygenic scores were not individually enriched for gene-environment interaction. Although the magnitudes of the observed effects are small, this study illustrates the usefulness of polygenic approaches for understanding the pathways by which measured genetic predispositions come together with environmental factors to predict complex behavioral outcomes.
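The two steps the abstract describes can be sketched on synthetic data: (1) a polygenic score computed as a weighted sum of risk-allele counts, with weights taken from a discovery GWAS, and (2) a regression including a score-by-environment interaction term. All genotypes, weights, and effect sizes below are invented, and the plain least-squares fit stands in for the study's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_snps = 400, 50

genotypes = rng.integers(0, 3, size=(n_people, n_snps))   # 0/1/2 risk-allele counts
gwas_weights = rng.normal(0.0, 0.1, size=n_snps)          # discovery-sample betas

pgs = genotypes @ gwas_weights             # polygenic score, one value per person
env = rng.normal(0.0, 1.0, size=n_people)  # e.g. standardized peer deviance

# Outcome simulated with a true interaction: the genetic effect is stronger
# when the environmental exposure is high.
y = 0.5*pgs + 0.3*env + 0.4*pgs*env + rng.normal(0.0, 0.5, size=n_people)

X = np.column_stack([np.ones(n_people), pgs, env, pgs*env])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # estimates for intercept, PGS, env, and the PGS-by-env interaction
```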
Fem Formulation for Heat and Mass Transfer in Porous Medium
NASA Astrophysics Data System (ADS)
Azeem; Soudagar, Manzoor Elahi M.; Salman Ahmed, N. J.; Anjum Badruddin, Irfan
2017-08-01
Heat and mass transfer in porous medium can be modelled using three partial differential equations, namely the momentum equation, the energy equation, and the mass diffusion equation. These three equations are coupled to each other by common terms that turn the whole phenomenon into a complex problem with interdependent variables. The current article describes the finite element formulation of heat and mass transfer in porous medium with respect to Cartesian coordinates. The problem under study is formulated into an algebraic system of equations using Galerkin's method with three-node linear triangular elements. The domain is meshed with smaller elements near the wall region and larger elements away from the walls.
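One ingredient of such a Galerkin formulation can be sketched concretely: the element matrix for the diffusion (Laplacian) term on a three-node linear triangle, whose shape-function gradients are constant over the element. This is the standard linear-triangle construction, not the paper's full coupled system:

```python
import numpy as np

def triangle_stiffness(xy):
    """xy: (3, 2) array of node coordinates. Returns the 3x3 element matrix
    for the Laplacian term, K_ij = area * grad(N_i) . grad(N_j)."""
    x, y = xy[:, 0], xy[:, 1]
    # Signed area from the cross product of two edge vectors.
    area = 0.5 * ((x[1]-x[0])*(y[2]-y[0]) - (x[2]-x[0])*(y[1]-y[0]))
    # Constant shape-function gradients: grad N_i = (b_i, c_i) / (2 * area).
    b = np.array([y[1]-y[2], y[2]-y[0], y[0]-y[1]])
    c = np.array([x[2]-x[1], x[0]-x[2], x[1]-x[0]])
    grads = np.stack([b, c]) / (2.0 * area)   # shape (2, 3)
    return area * grads.T @ grads

K = triangle_stiffness(np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]))
print(K)   # each row sums to zero: a constant field drives no diffusion
```

Assembling these 3x3 blocks into the global system, element by element, is what produces the algebraic equations the abstract refers to.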
Study designs appropriate for the workplace.
Hogue, C J
1986-01-01
Carlo and Hearn have called for "refinement of old [epidemiologic] methods and an ongoing evaluation of where methods fit in the overall scheme as we address the multiple complexities of reproductive hazard assessment." This review is an attempt to bring together the current state-of-the-art methods for problem definition and hypothesis testing available to the occupational epidemiologist. For problem definition, meta-analysis can be utilized to narrow the field of potential causal hypotheses. Passive and active surveillance may further refine issues for analytic research. Within analytic epidemiology, several methods may be appropriate for the workplace setting. Those discussed here may be used to estimate the risk ratio in either a fixed or dynamic population.
Zador, Anthony M.; Dubnau, Joshua; Oyibo, Hassana K.; Zhan, Huiqing; Cao, Gang; Peikon, Ian D.
2012-01-01
Connectivity determines the function of neural circuits. Historically, circuit mapping has usually been viewed as a problem of microscopy, but no current method can achieve high-throughput mapping of entire circuits with single neuron precision. Here we describe a novel approach to determining connectivity. We propose BOINC (“barcoding of individual neuronal connections”), a method for converting the problem of connectivity into a form that can be read out by high-throughput DNA sequencing. The appeal of using sequencing is that its scale—sequencing billions of nucleotides per day is now routine—is a natural match to the complexity of neural circuits. An inexpensive high-throughput technique for establishing circuit connectivity at single neuron resolution could transform neuroscience research. PMID:23109909
Razumikhin-Type Stability Criteria for Differential Equations with Delayed Impulses.
Wang, Qing; Zhu, Quanxin
2013-01-01
This paper studies stability problems of general impulsive differential equations where time delays occur in both differential and difference equations. Based on the method of Lyapunov functions, Razumikhin technique and mathematical induction, several stability criteria are obtained for differential equations with delayed impulses. Our results show that some systems with delayed impulses may be exponentially stabilized by impulses even if the system matrices are unstable. Some less restrictive sufficient conditions are also given to keep the good stability property of systems subject to certain type of impulsive perturbations. Examples with numerical simulations are discussed to illustrate the theorems. Our results may be applied to complex problems where impulses depend on both current and past states.
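The stabilization phenomenon the paper proves can be illustrated numerically: an unstable scalar system x' = a·x (a > 0) is tamed by periodic impulses that reset the state using a delayed value, x(t_k+) = c·x(t_k − d). The parameters and the explicit Euler scheme below are illustrative choices, not the paper's examples:

```python
def simulate(a=0.5, c=0.3, delay=0.1, impulse_period=0.5, t_end=10.0, dt=0.001):
    """Simulate x' = a*x with delayed impulses x(t_k+) = c * x(t_k - delay)."""
    steps = int(t_end / dt)
    delay_steps = int(delay / dt)
    period_steps = int(impulse_period / dt)
    xs = [1.0]
    for k in range(1, steps + 1):
        x = xs[-1] * (1 + a * dt)        # explicit Euler on the unstable flow
        if k % period_steps == 0:        # impulse uses the state `delay` ago
            x = c * xs[max(0, k - delay_steps)]
        xs.append(x)
    return xs

xs = simulate()
print(abs(xs[-1]) < abs(xs[0]))   # → True: the impulses stabilize the system
```

Per impulse period the flow grows by roughly e^(0.25) ≈ 1.28, but the impulse multiplies the (delayed) state by 0.3, so the net per-period factor is well below 1 and the trajectory decays.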
Padilla-Walker, Laura M; Carlo, Gustavo; Nielson, Matthew G
2015-01-01
The current study examined bidirectional, longitudinal links between prosocial and problem behavior. Participants (N = 500) were recruited from a Northwestern city in the United States and assessed for 3 consecutive years from 2009 to 2011 (M(age) of youth at Time 1 = 13.32, SD = 1.05; 52% girls; 67% European American, 33% single-parent families). Results suggested that effects of earlier prosocial behavior toward family and strangers were predictive of fewer problem behaviors 2 years later, while results for prosocial behavior toward friends were more mixed. Results also suggested depression predicted lower prosocial behavior toward family members and anxiety predicted higher prosocial behavior toward friends. Findings show a complex pattern of relations that demonstrate the need to consider targets of helping. © 2015 The Authors. Child Development © 2015 Society for Research in Child Development, Inc.
[Integral obstetrics impeded by history? Midwives and gynaecologists through the ages].
van der Lee, N; Scheele, F
2016-01-01
There is a long and complicated history concerning the interprofessional collaboration between midwives and gynaecologists, which is still evident in current practice. Yet, in the analysis of collaborative problems, history and its lessons are often overlooked. Consequently, less effective solutions to problems may be found, because the root cause of a problem is not addressed. In this historical perspective we show how policies of the respective professions have often focused on self-preservation and competition, rather than on effective collaboration. We also highlight how the independent midwives lost and regained authorisation, status and income. Finally, using a theoretical model for interprofessional collaboration, we reflect on where history impedes the development of integral obstetrics. The focus must shift away from professional self-interest and power struggles, but this proves to be a complex exercise.
Students' conceptual performance on synthesis physics problems with varying mathematical complexity
NASA Astrophysics Data System (ADS)
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-06-01
A body of research on physics problem solving has focused on single-concept problems. In this study we use "synthesis problems" that involve multiple concepts typically taught in different chapters. We use two types of synthesis problems, sequential and simultaneous synthesis tasks. Sequential problems require a consecutive application of fundamental principles, and simultaneous problems require a concurrent application of pertinent concepts. We explore students' conceptual performance when they solve quantitative synthesis problems with varying mathematical complexity. Conceptual performance refers to the identification, follow-up, and correct application of the pertinent concepts. Mathematical complexity is determined by the type and the number of equations to be manipulated concurrently due to the number of unknowns in each equation. Data were collected from written tasks and individual interviews administered to physics major students (N = 179) enrolled in a second year mechanics course. The results indicate that mathematical complexity does not impact students' conceptual performance on the sequential tasks. In contrast, for the simultaneous problems, mathematical complexity negatively influences the students' conceptual performance. This difference may be explained by the students' familiarity with and confidence in particular concepts coupled with cognitive load associated with manipulating complex quantitative equations. Another explanation pertains to the type of synthesis problems, either sequential or simultaneous task. The students split the situation presented in the sequential synthesis tasks into segments but treated the situation in the simultaneous synthesis tasks as a single event.
Analysis of Change in the Wind Speed Ratio according to Apartment Layout and Solutions
Hyung, Won-gil; Kim, Young-Moon; You, Ki-Pyo
2014-01-01
Apartment complexes in various forms are built in downtown areas. The arrangement of an apartment complex has great influence on the wind flow inside it. There are issues of residents' walking due to gust occurrence within apartment complexes, problems with pollutant emission due to airflow congestion, and heat island and cool island phenomena in apartment complexes. Currently, the forms of internal arrangements of apartment complexes are divided into the flat type and the tower type. In the present study, a wind tunnel experiment and computational fluid dynamics (CFD) simulation were performed with respect to internal wind flows in different apartment arrangement forms. Findings of the wind tunnel experiment showed that the internal form and arrangement of an apartment complex had significant influence on its internal airflow. The wind velocity of the buildings increased by 80% at maximum due to the proximity effects between the buildings. The CFD simulation for relaxing such wind flows indicated that the wind velocity reduced by 40% or more at maximum when the paths between the lateral sides of the buildings were extended. PMID:24688430
Applications of genetic programming in cancer research.
Worzel, William P; Yu, Jianjun; Almal, Arpit A; Chinnaiyan, Arul M
2009-02-01
The theory of Darwinian evolution is a fundamental keystone of modern biology. Late in the last century, computer scientists began adapting its principles, in particular natural selection, to complex computational challenges, leading to the emergence of evolutionary algorithms. The conceptual model of selective pressure and recombination in evolutionary algorithms allows scientists to efficiently search high-dimensional spaces for solutions to complex problems. In the last decade, genetic programming has been developed and extensively applied for analysis of molecular data to classify cancer subtypes and characterize the mechanisms of cancer pathogenesis and development. This article reviews current successes using genetic programming and discusses its potential impact in cancer research and treatment in the near future.
Object-oriented Bayesian networks for paternity cases with allelic dependencies
Hepler, Amanda B.; Weir, Bruce S.
2008-01-01
This study extends the current use of Bayesian networks by incorporating the effects of allelic dependencies in paternity calculations. The use of object-oriented networks greatly simplifies the process of building and interpreting forensic identification models, allowing researchers to solve new, more complex problems. We explore two paternity examples: the most common scenario, where DNA evidence is available from the alleged father, the mother, and the child; and a more complex case, where DNA is not available from the alleged father but is available from the alleged father's brother. Object-oriented networks are built, using HUGIN, for each example, incorporating the effects of allelic dependence caused by evolutionary relatedness. PMID:19079769
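For context, the elementary trio calculation that Bayesian-network tools such as these automate and generalize is the single-locus paternity index. The sketch below assumes allele independence, which is precisely the assumption the paper relaxes, and handles only the simple case where the paternal allele is unambiguous:

```python
def paternity_index(child, mother, alleged_father, freqs):
    """Genotypes are 2-tuples of allele labels; freqs maps allele -> population
    frequency. PI = P(evidence | AF is father) / P(evidence | random man)."""
    # Paternal allele: the child allele the mother cannot have passed on.
    paternal = [a for a in child if a not in mother]
    if len(paternal) != 1:
        raise ValueError("ambiguous paternal allele; needs the full network treatment")
    a = paternal[0]
    p_af = alleged_father.count(a) / 2.0   # chance the alleged father transmits a
    return p_af / freqs[a]                 # a random man transmits a with prob freqs[a]

# Child A/B, mother A/A, alleged father B/B; allele B has frequency 0.1.
print(paternity_index(("A", "B"), ("A", "A"), ("B", "B"), {"A": 0.9, "B": 0.1}))   # → 10.0
```

Ambiguous cases (e.g. mother and child sharing both alleles) and dependent allele frequencies are exactly where the hand calculation breaks down and the network formulation pays off.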
Public nutrition in complex emergencies.
Young, Helen; Borrel, Annalies; Holland, Diane; Salama, Peter
Public nutrition is a broad-based, problem-solving approach to addressing malnutrition in complex emergencies that combines analysis of nutritional risk and vulnerability with action-oriented strategies, including policies, programmes, and capacity development. This paper focuses on six broad areas: nutritional assessment, distribution of a general food ration, prevention and treatment of moderate malnutrition, treatment of severe malnutrition in children and adults, prevention and treatment of micronutrient deficiency diseases, and nutritional support for at-risk groups, including infants, pregnant and lactating women, elderly people, and people living with HIV. Learning and documenting good practice from previous emergencies, the promotion of good practice in current emergencies, and adherence to international standards and guidelines have contributed to establishing the field of public nutrition. However, many practical challenges reduce the effectiveness of nutritional interventions in complex emergencies, and important research and programmatic questions remain.
Extracting attosecond delays from spectrally overlapping interferograms
NASA Astrophysics Data System (ADS)
Jordan, Inga; Wörner, Hans Jakob
2018-02-01
Attosecond interferometry is becoming an increasingly popular technique for measuring the dynamics of photoionization in real time. Whereas early measurements focused on atomic systems with very simple photoelectron spectra, the technique is now being applied to more complex systems, including isolated molecules and solids. The increase in complexity translates into augmented spectral congestion, unavoidably resulting in spectral overlap in attosecond interferograms. Here, we discuss currently used methods for phase retrieval and introduce two new approaches for determining attosecond photoemission delays from spectrally overlapping photoelectron spectra. We show that the previously used technique, which consists of spectrally integrating the areas of interest, does not in general provide reliable results. Our methods resolve this problem, thereby opening the technique of attosecond interferometry to complex systems and fully exploiting its specific advantages in spectral resolution compared to attosecond streaking.
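The elementary phase-retrieval step behind such measurements can be sketched for the clean, non-overlapping case: an isolated sideband oscillates as S(τ) = A + B·cos(2ωτ − φ), and φ is recovered from the Fourier component at 2ω. The overlapping-sideband case the paper addresses would mix several such components; all numbers below are illustrative:

```python
import numpy as np

omega = 1.0        # laser angular frequency (arbitrary units)
phi_true = 0.7     # photoemission phase to be recovered
tau = np.linspace(0.0, 20*np.pi, 2000, endpoint=False)   # 20 full 2*omega cycles
signal = 3.0 + 1.5*np.cos(2*omega*tau - phi_true)

# Project onto exp(-i*2*omega*tau); over whole periods only the e^{-i*phi}/2
# term survives, so the argument of the projection is -phi.
projection = np.mean(signal * np.exp(-2j*omega*tau))
phi_est = -np.angle(projection)
print(round(phi_est, 3))   # → 0.7
```

The phase divided by 2ω gives the delay, which is why congested spectra, where several oscillating components fall in the same energy window, bias this simple projection and call for the more careful methods the paper introduces.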
NASA Astrophysics Data System (ADS)
Zobnina, V. G.; Kosevich, M. V.; Chagovets, V. V.; Boryak, O. A.
The problem of elucidating the structure of nanomaterials based on combinations of proteins and polyether polymers is addressed at the monomeric level of single amino acids and oligomers of the polyethers PEG-400 and OEG-5. The efficiency of a combined approach involving experimental electrospray mass spectrometry and computer modeling by molecular dynamics simulation is demonstrated. It is shown that oligomers of polyethers form stable complexes with the amino acids valine, proline, histidine, glutamic acid, and aspartic acid. Molecular dynamics simulation has shown that stabilization of amino acid-polyether complexes is achieved through winding of the polymeric chain around charged groups of the amino acids. Structural motifs revealed for complexes of single amino acids with polyethers can be realized in structures of protein-polyether nanoparticles currently designed for drug delivery.
Practical modeling approaches for geological storage of carbon dioxide.
Celia, Michael A; Nordbotten, Jan M
2009-01-01
The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, with the overall mathematical description of the complete system becoming very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to model geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.
Formative feedback and scaffolding for developing complex problem solving and modelling outcomes
NASA Astrophysics Data System (ADS)
Frank, Brian; Simper, Natalie; Kaupp, James
2018-07-01
This paper discusses the use and impact of formative feedback and scaffolding to develop outcomes for complex problem solving in a required first-year course in engineering design and practice at a medium-sized research-intensive Canadian university. In 2010, the course began to use team-based, complex, open-ended contextualised problems to develop problem solving, communications, teamwork, modelling, and professional skills. Since then, formative feedback has been incorporated into: task and process-level feedback on scaffolded tasks in-class, formative assignments, and post-assignment review. Development in complex problem solving and modelling has been assessed through analysis of responses from student surveys, direct criterion-referenced assessment of course outcomes from 2013 to 2015, and an external longitudinal study. The findings suggest that students are improving in outcomes related to complex problem solving over the duration of the course. Most notably, the addition of new feedback and scaffolding coincided with improved student performance.
Schmidt, Henk G.; Rikers, Remy M. J. P.; Custers, Eugene J. F. M.; Splinter, Ted A. W.; van Saase, Jan L. C. M.
2010-01-01
Contrary to what common sense makes us believe, deliberation without attention has recently been suggested to produce better decisions in complex situations than deliberation with attention. Based on differences between cognitive processes of experts and novices, we hypothesized that experts make in fact better decisions after consciously thinking about complex problems whereas novices may benefit from deliberation-without-attention. These hypotheses were confirmed in a study among doctors and medical students. They diagnosed complex and routine problems under three conditions, an immediate-decision condition and two delayed conditions: conscious thought and deliberation-without-attention. Doctors did better with conscious deliberation when problems were complex, whereas reasoning mode did not matter in simple problems. In contrast, deliberation-without-attention improved novices’ decisions, but only in simple problems. Experts benefit from consciously thinking about complex problems; for novices thinking does not help in those cases. PMID:20354726
Discovering Tradeoffs, Vulnerabilities, and Dependencies within Water Resources Systems
NASA Astrophysics Data System (ADS)
Reed, P. M.
2015-12-01
There is a growing recognition and interest in using emerging computational tools for discovering the tradeoffs that emerge across complex combinations of infrastructure options, adaptive operations, and sign posts. As a field concerned with "deep uncertainties", it is logically consistent to include a more direct acknowledgement that our choices for dealing with computationally demanding simulations, advanced search algorithms, and sensitivity analysis tools are themselves subject to failures that could adversely bias our understanding of how systems' vulnerabilities change with proposed actions. Balancing simplicity versus complexity in our computational frameworks is nontrivial given that we are often exploring high impact irreversible decisions. It is not always clear that accepted models even encompass important failure modes. Moreover, as they become more complex and computationally demanding, the benefits and consequences of simplifications are often untested. This presentation discusses our efforts to address these challenges through our "many-objective robust decision making" (MORDM) framework for the design and management of water resources systems. The MORDM framework has four core components: (1) elicited problem conception and formulation, (2) parallel many-objective search, (3) interactive visual analytics, and (4) negotiated selection of robust alternatives. Problem conception and formulation is the process of abstracting a practical design problem into a mathematical representation. We build on the emerging work in visual analytics to exploit interactive visualization of both the design space and the objective space in multiple heterogeneous linked views that permit exploration and discovery. Many-objective search produces tradeoff solutions from potentially competing problem formulations that can each consider up to ten conflicting objectives based on current computational search capabilities.
Negotiated design selection uses interactive visualization, reformulation, and optimization to discover desirable designs for implementation. Multi-city urban water supply portfolio planning will be used to illustrate the MORDM framework.
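The many-objective search step of MORDM ultimately reduces to identifying the non-dominated (Pareto-optimal) tradeoff solutions among candidates. A minimal sketch of that dominance filter, with hypothetical (cost, risk) scores for candidate water-portfolio designs; all names and numbers are illustrative, not from the MORDM studies:

```python
def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors.

    All objectives are assumed to be minimized: a solution is dominated
    if another is no worse in every objective and strictly better in one.
    """
    front = []
    for i, a in enumerate(solutions):
        dominated = False
        for j, b in enumerate(solutions):
            if i != j and all(bk <= ak for ak, bk in zip(a, b)) \
                    and any(bk < ak for ak, bk in zip(a, b)):
                dominated = True
                break
        if not dominated:
            front.append(a)
    return front

# Three hypothetical portfolio designs scored on (cost, risk):
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0)]
print(pareto_front(designs))  # (3.0, 4.0) is dominated by (2.0, 3.0)
```

Frameworks like MORDM rely on parallel many-objective evolutionary search rather than this brute-force O(n²) filter, but the underlying dominance test is the same.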
Preparing new nurses with complexity science and problem-based learning.
Hodges, Helen F
2011-01-01
Successful nurses function effectively with adaptability, improvability, and interconnectedness, and can recognize emerging and unpredictable complex problems. Preparing new nurses for complexity requires a significant change in the prevalent but dated nursing education models for rising graduates. The science of complexity, coupled with problem-based learning and peer review, contributes a feasible framework for a constructivist learning environment in which to examine real-time systems data; explore uncertainty, inherent patterns, and ambiguity; and develop skills for unstructured problem solving. This article describes a pilot study of a problem-based learning strategy guided by principles of complexity science in a community clinical nursing course. Thirty-five senior nursing students participated during a 3-year period. Assessments included peer review, a final project paper, reflection, and a satisfaction survey. Results included higher-than-expected levels of student satisfaction, increased breadth of analysis of complex data, acknowledgment of communities as complex adaptive systems, and overall higher-level thinking skills than in previous years. 2011, SLACK Incorporated.
NASA Astrophysics Data System (ADS)
Szczęśniak, Dominik; Ennaoui, Ahmed; Ahzi, Saïd
2016-09-01
Recently, the transition metal dichalcogenides have attracted renewed attention due to the potential use of their low-dimensional forms in both nano- and opto-electronics. In such applications, the electronic and transport properties of monolayer transition metal dichalcogenides play a pivotal role. The present paper provides new insight into these essential properties by studying the complex band structures of popular transition metal dichalcogenide monolayers (MX2, where M = Mo, W; X = S, Se, Te) while including spin-orbit coupling effects. The conducted symmetry-based tight-binding calculations show that the analytical continuation from the real band structures to the complex momentum space leads to nonlinear generalized eigenvalue problems. Herein, an efficient method for solving this class of nonlinear problems is presented that yields a complete set of physically relevant eigenvalues. Solutions obtained by this method are characterized and classified into propagating and evanescent states, where the latter manifest not only monotonic but also oscillatory decay character. It is observed that some of the oscillatory evanescent states create characteristic complex loops at the direct band gap of MX2 monolayers, where electrons can directly tunnel between the band gap edges. To describe these tunneling currents, the decay behavior of electronic states in the forbidden energy region is elucidated and its importance within the ballistic transport regime is briefly discussed.
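One standard way to handle the nonlinear eigenvalue problems that arise from such analytical continuation is linearization: a polynomial (here quadratic) eigenvalue problem is recast as a standard eigenproblem of twice the dimension. A hedged sketch of a generic companion linearization, not the symmetry-based method of the paper:

```python
import numpy as np

def quadratic_eig(A0, A1, A2):
    """Solve (A0 + lam*A1 + lam^2*A2) x = 0 via companion linearization.

    Illustrative only: assumes A2 is invertible and builds the 2n x 2n
    companion matrix [[0, I], [-A2^-1 A0, -A2^-1 A1]], whose standard
    eigenvalues are the lam of the quadratic problem.
    """
    n = A0.shape[0]
    A2inv = np.linalg.inv(A2)
    C = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-A2inv @ A0, -A2inv @ A1]])
    lam, vecs = np.linalg.eig(C)
    return lam, vecs[:n]  # eigenvalues and the x-part of each eigenvector

# Scalar check: 1 - 3*lam + 2*lam^2 = 0 has roots 0.5 and 1
lam, _ = quadratic_eig(np.array([[1.0]]), np.array([[-3.0]]), np.array([[2.0]]))
print(np.sort(lam.real))  # approximately [0.5, 1.0]
```

In a complex-band-structure setting the eigenvalue plays the role of a Bloch factor e^{ika}: solutions with modulus one correspond to propagating states, and the rest to evanescent states.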
Explicitly solvable complex Chebyshev approximation problems related to sine polynomials
NASA Technical Reports Server (NTRS)
Freund, Roland
1989-01-01
Explicitly solvable real Chebyshev approximation problems on the unit interval are typically characterized by simple error curves. A similar principle is presented for complex approximation problems with error curves induced by sine polynomials. As an application, some new explicit formulae for complex best approximations are derived.
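For the real case the abstract alludes to, a near-best Chebyshev approximation is easy to construct numerically by interpolating at Chebyshev nodes; it is within a small factor of the true minimax error. A sketch using NumPy (illustrative only; the paper's explicit formulae concern complex approximation with sine-polynomial error curves):

```python
import numpy as np

def cheb_interp(f, degree):
    """Near-minimax polynomial approximation of f on [-1, 1] by
    interpolation at the Chebyshev points of the first kind."""
    k = np.arange(degree + 1)
    nodes = np.cos((2 * k + 1) * np.pi / (2 * (degree + 1)))
    return np.polynomial.chebyshev.Chebyshev.fit(nodes, f(nodes), degree)

p = cheb_interp(np.exp, 5)
x = np.linspace(-1, 1, 1001)
err = np.max(np.abs(np.exp(x) - p(x)))
print(err)  # max error on the grid; small, well below 1e-3 for degree 5
```

The true best approximation would equioscillate, producing the simple error curves the abstract describes; Chebyshev-node interpolation is the usual practical surrogate.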
ERIC Educational Resources Information Center
Nelson, Tenneisha; Squires, Vicki
2017-01-01
Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, A., E-mail: davidsoa@physics.ucla.edu; Tableman, A., E-mail: Tableman@physics.ucla.edu; An, W., E-mail: anweiming@ucla.edu
2015-01-15
For many plasma physics problems, three-dimensional and kinetic effects are very important. However, such simulations are very computationally intensive. Fortunately, there is a class of problems for which there is nearly azimuthal symmetry and the dominant three-dimensional physics is captured by the inclusion of only a few azimuthal harmonics. Recently, it was proposed [1] to model one such problem, laser wakefield acceleration, by expanding the fields and currents in azimuthal harmonics and truncating the expansion. The complex amplitudes of the fundamental and first harmonic for the fields were solved on an r–z grid, and a procedure for calculating the complex current amplitudes for each particle based on its motion in Cartesian geometry was presented using a Marder's correction to maintain the validity of Gauss's law. In this paper, we describe an implementation of this algorithm into OSIRIS using a rigorous charge-conserving current deposition method to maintain the validity of Gauss's law. We show that this algorithm is a hybrid method which uses a particle-in-cell description in r–z and a gridless description in ϕ. We include the ability to keep an arbitrary number of harmonics and higher-order particle shapes. Examples for laser wakefield acceleration, plasma wakefield acceleration, and beam loading are also presented and directions for future work are discussed.
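The core of such a quasi-3D expansion is representing a field on each ring of the r–z grid by a few complex azimuthal amplitudes. A minimal sketch of extracting the m = 0 and m = 1 amplitudes of a sampled field with an FFT; the test field and normalization convention are illustrative, not taken from OSIRIS:

```python
import numpy as np

# Field sampled on a ring in phi: an azimuthally symmetric (m = 0) part
# of amplitude 1.5 plus an m = 1 mode of amplitude 2.0.
nphi = 64
phi = 2 * np.pi * np.arange(nphi) / nphi
field = 1.5 + 2.0 * np.cos(phi)

# Complex amplitude of each azimuthal harmonic m (FFT normalized by nphi).
amps = np.fft.fft(field) / nphi
print(amps[0].real)          # m = 0 (symmetric) amplitude: 1.5
print(2 * np.abs(amps[1]))   # real-field amplitude of the m = 1 mode: 2.0
```

Truncating `amps` after the first harmonic or two is exactly the approximation that makes the quasi-3D algorithm far cheaper than a full 3D simulation.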
2011-12-01
therefore a more general approach uses the pseudo-inverse shown in Equation (12) to obtain the commanded gimbal rate. ...gimbal motor. Approaching the problem from this perspective increases the complexity significantly and the relationship between motor current and...included in this document confirms the equations that Schaub and Junkins developed. The approaches used in the two derivations are sufficiently
1981-02-01
the machine. ARI's efforts in this area focus on human performance problems related to interactions with command and control centers, and on issues...improvement of the user-machine interface. Lacking consistent design principles, current practice results in a fragmented and unsystematic approach to system...complexity in the user-machine interface of BAS, ARI supported this effort for development of an online language for Army tactical intelligence
Contract W911NF-09-1-0384 (Purdue University)
2012-10-27
spin system, Physical Review A, (02 2010): 22324. doi: 10.1103/PhysRevA.81.022324 08/31/2011 8.00 Sabre Kais, Anmer Daskin. Group leaders...billions) and developed new quantum algorithms to solve complex chemistry problems such as global optimization and excited states of molecules. (a) Papers
Current Methods for Evaluation of Physical Security System Effectiveness.
1981-05-01
It also helps the user modify a data set before further processing. (c) Safeguards Engineering and Analysis Data Base (SEAD)--To complete SAFE's...graphic display software in addition to a Fortran compiler, and up to about 35,000 words of storage. For a fairly complex problem, a single run through...operational software. BIBLIOGRAPHY: Lenz, J.E., "The PROSE (Protection System Evaluator) Model," Proc. 1979 Winter Simulation Conference, IEEE, 1979
Weber, Durkheim, and the comparative method.
Kapsis, R E
1977-10-01
This essay compares and contrasts the means by which Durkheim and Weber dealt with methodological issues peculiar to the comparative study of societies, what Smelser has called "the problem of sociocultural variability and complexity." More specifically, it examines how Weber and Durkheim chose appropriate comparative units for their empirical studies. The approaches that Weber and Durkheim brought to the problem of cross-cultural comparison have critical implications for more current procedures used in the comparative study of contemporary and historical societies.
2003 Y-12 National Security Complex Annual Illness and Injury Surveillance Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, Office of Health, Safety and Security, Office of Illness and Injury Prevention Programs
2007-05-23
Annual Illness and Injury Surveillance Program report for 2003 for Y-12. The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The IISP monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.
Coping and Sexual Harassment: How Victims Cope across Multiple Settings.
Scarduzio, Jennifer A; Sheff, Sarah E; Smith, Mathew
2018-02-01
The ways sexual harassment occurs, both online and in face-to-face settings, have become more complicated. Sexual harassment that occurs in cyberspace, or online sexual harassment, adds complexity to the experiences of victims, current research understandings, and the legal dimensions of this phenomenon. Social networking sites (SNS) are a type of social media that offer unique opportunities to users, and sometimes the communication that occurs on SNS can cross the line from flirtation into online sexual harassment. Victims of sexual harassment employ communicative strategies such as coping to make sense of their experiences of sexual harassment. The current study qualitatively examined problem-focused, active emotion-focused, and passive emotion-focused coping strategies employed by sexual harassment victims across multiple settings. We conducted 26 in-depth interviews with victims who had experienced sexual harassment across multiple settings (e.g., face-to-face and SNS). The findings present 16 types of coping strategies: five problem-focused, five active emotion-focused, and six passive emotion-focused. The victims used an average of three types of coping strategies during their experiences. Theoretical implications extend research on passive emotion-focused coping strategies by discussing powerlessness and how victims blame other victims. Furthermore, the findings theoretically reveal that coping is a complex, cyclical process and that victims shift among types of coping strategies over the course of their experience. Practical implications are offered for victims and for SNS.
ELSI: A unified software interface for Kohn–Sham electronic structure solvers
Yu, Victor Wen-zhe; Corsetti, Fabiano; Garcia, Alberto; ...
2017-09-15
Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. As a result, comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.
ELSI: A unified software interface for Kohn-Sham electronic structure solvers
NASA Astrophysics Data System (ADS)
Yu, Victor Wen-zhe; Corsetti, Fabiano; García, Alberto; Huhn, William P.; Jacquelin, Mathias; Jia, Weile; Lange, Björn; Lin, Lin; Lu, Jianfeng; Mi, Wenhui; Seifitokaldani, Ali; Vázquez-Mayagoitia, Álvaro; Yang, Chao; Yang, Haizhao; Blum, Volker
2018-01-01
Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. Comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.
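The problem ELSI abstracts over is the generalized eigenproblem Hc = εSc, with a symmetric (or Hermitian) Hamiltonian H and a positive-definite overlap matrix S. A minimal dense solve on a synthetic instance, standing in for what a library like ELPA does at scale (the matrix contents are random, not physical):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
H = (A + A.T) / 2                  # symmetric "Hamiltonian"
B = rng.standard_normal((n, n))
S = B @ B.T + n * np.eye(n)        # symmetric positive-definite "overlap"

evals, C = eigh(H, S)              # generalized eigenpairs, ascending order

# Verify the defining relation H C = S C diag(evals):
resid = np.max(np.abs(H @ C - S @ C @ np.diag(evals)))
print(resid)  # close to machine precision
```

The point of an interface layer like ELSI is that this one call can be routed to a dense solver, an orbital-minimization method, or PEXSI depending on system size and sparsity, without changing the calling code.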
Vista goes online: Decision-analytic systems for real-time decision-making in mission control
NASA Technical Reports Server (NTRS)
Barry, Matthew; Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath
1994-01-01
The Vista project has centered on the use of decision-theoretic approaches for managing the display of critical information relevant to real-time operations decisions. The Vista-I project originally developed a prototype of these approaches for managing flight control displays in the Space Shuttle Mission Control Center (MCC). The follow-on Vista-II project integrated these approaches into a workstation program which is currently being certified for use in the MCC. To our knowledge, this will be the first application of automated decision-theoretic reasoning techniques for real-time spacecraft operations. We describe the development and capabilities of the Vista-II system, and provide an overview of the application of decision-theoretic reasoning techniques to the problems of managing the complexity of flight controller displays. We discuss the relevance of the Vista techniques within the MCC decision-making environment, focusing on the problems of detecting and diagnosing spacecraft electromechanical subsystem component failures with limited information, and the problem of determining what control actions should be taken in high-stakes, time-critical situations in response to a diagnosis performed under uncertainty. Finally, we outline our current research directions for follow-on projects.
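The decision-theoretic core of systems like Vista is maximization of expected utility over a probability distribution inferred under uncertainty. A toy sketch with hypothetical fault states, actions, and utilities; none of these names or numbers come from the Vista project:

```python
# Probability distribution over diagnoses, as might come from a
# diagnostic inference step (hypothetical values).
p = {"pump_fail": 0.2, "sensor_fail": 0.7, "nominal": 0.1}

# utility[action][state]: how good each action is in each true state.
utility = {
    "abort_burn": {"pump_fail": 0.9, "sensor_fail": 0.2, "nominal": 0.1},
    "continue":   {"pump_fail": 0.0, "sensor_fail": 0.8, "nominal": 1.0},
}

def expected_utility(action):
    return sum(p[s] * utility[action][s] for s in p)

best = max(utility, key=expected_utility)
print(best)  # "continue": expected utility 0.66 vs 0.33 for abort_burn
```

The same machinery extends to display management: the "actions" become choices of what to show a flight controller, and utilities encode the value of the operator's attention under time pressure.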
Transfer path analysis: Current practice, trade-offs and consideration of damping
NASA Astrophysics Data System (ADS)
Oktav, Akın; Yılmaz, Çetin; Anlaş, Günay
2017-02-01
The current practice of experimental transfer path analysis is discussed in the context of trade-offs between accuracy and time cost. An overview of methods that propose solutions for structure-borne noise is given, where the assumptions, drawbacks and advantages of the methods are stated theoretically. The applicability of the methods is also investigated, with the engine-induced structure-borne noise of an automobile taken as a reference problem. For this particular problem, the sources of measurement errors, the processing operations that affect results, and the physical obstacles faced in application are analysed. While an operational measurement is common to all of the stated methods, discrepancies arise when it comes to removal of the source or the need for an external excitation. Depending on the chosen method, promised outcomes such as independent characterisation of the source or obtaining information about mounts also differ. Although many aspects of the problem are reported in the literature, damping and its effects are not considered. The damping effect is embedded in the measured complex frequency response functions, and it needs to be analysed in the post-processing step. The effects of damping, and the reasons and methods to analyse them, are discussed in detail. In this regard, a new procedure, which increases the accuracy of results, is also proposed.
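Transfer path synthesis itself is a short computation: the receiver response at each frequency is the sum over paths of the complex frequency response function times the operational force, and damping enters through the complex (phase) part of each FRF. An illustrative sketch with made-up two-path data:

```python
import numpy as np

freqs = np.array([50.0, 100.0, 150.0])            # Hz (illustrative)

# Complex FRFs H_i(f) for two transfer paths; the imaginary parts carry
# the damping-related phase information discussed in the abstract.
H = np.array([[1.0 + 0.2j, 0.5 + 0.1j, 0.2 + 0.05j],
              [0.8 - 0.1j, 0.4 + 0.3j, 0.1 + 0.02j]])

# Operational forces F_i(f) acting on each path.
F = np.array([[2.0, 1.0, 0.5],
              [1.0, 2.0, 0.2]])

p_total = (H * F).sum(axis=0)   # synthesized receiver response per frequency
contrib = np.abs(H * F)         # per-path contribution magnitudes
print(np.abs(p_total))
```

Ranking `contrib` per frequency is how practitioners identify the dominant path; the accuracy of the whole analysis hinges on how well the measured H captures phase, which is where neglected damping biases the result.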
Motion Artefacts in MRI: a Complex Problem with Many Partial Solutions
Zaitsev, Maxim; Maclaren, Julian; Herbst, Michael
2015-01-01
Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artefacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artefacts, but no single method can be applied in all imaging situations. Instead, a ‘toolbox’ of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artefacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artefacts, with the aim of aiding artefact detection and mitigation in particular clinical situations. PMID:25630632
Motion artifacts in MRI: A complex problem with many partial solutions.
Zaitsev, Maxim; Maclaren, Julian; Herbst, Michael
2015-10-01
Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artifacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artifacts, but no single method can be applied in all imaging situations. Instead, a "toolbox" of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artifacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artifacts, with the aim of aiding artifact detection and mitigation in particular clinical situations. © 2015 Wiley Periodicals, Inc.
21st century toolkit for optimizing population health through precision nutrition.
O'Sullivan, Aifric; Henrick, Bethany; Dixon, Bonnie; Barile, Daniela; Zivkovic, Angela; Smilowitz, Jennifer; Lemay, Danielle; Martin, William; German, J Bruce; Schaefer, Sara Elizabeth
2017-07-05
Scientific, technological, and economic progress over the last 100 years all but eradicated problems of widespread food shortage and nutrient deficiency in developed nations. But now society is faced with a new set of nutrition problems related to energy imbalance and metabolic disease, which require new kinds of solutions. Recent developments in the area of new analytical tools enable us to systematically study large quantities of detailed and multidimensional metabolic and health data, providing the opportunity to address current nutrition problems through an approach called Precision Nutrition. This approach integrates different kinds of "big data" to expand our understanding of the complexity and diversity of human metabolism in response to diet. With these tools, we can more fully elucidate each individual's unique phenotype, or the current state of health, as determined by the interactions among biology, environment, and behavior. The tools of precision nutrition include genomics, metabolomics, microbiomics, phenotyping, high-throughput analytical chemistry techniques, longitudinal tracking with body sensors, informatics, data science, and sophisticated educational and behavioral interventions. These tools are enabling the development of more personalized and predictive dietary guidance and interventions that have the potential to transform how the public makes food choices and greatly improve population health.
Lazarević, Tatjana; Rilak, Ana; Bugarčić, Živadin D
2017-12-15
Metallodrugs offer the potential for unique mechanisms of drug action based on the choice of the metal, its oxidation state, the types and number of coordinated ligands, and the coordination geometry. This review illustrates notable recent progress in the field of medicinal bioinorganic chemistry, as many new approaches to the design of innovative metal-based anticancer drugs are emerging. Current research addressing the problems associated with platinum drugs has focused on other metal-based therapeutics that have different modes of action, and on prodrug and targeting strategies, in an effort to diminish the side-effects of cisplatin chemotherapy. Examples of metal compounds and chelating agents currently in clinical use, clinical trials or preclinical development are highlighted. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Primary health care in the Czech Republic: brief history and current issues
Holcik, Jan; Koupilova, Ilona
2000-01-01
Abstract The objective of this paper is to describe the recent history, current situation and perspectives for further development of the integrated system of primary care in the Czech Republic. The role of primary care in the whole health care system is discussed, and new initiatives aimed at strengthening and integrating primary care are outlined. Changes brought about by the recent reform processes are generally seen as favourable; however, a lack of integration of health services under the current system is causing various kinds of problems. A new strategy for development of primary care in the Czech Republic encourages integration of care and defines primary care as co-ordinated and complex care provided at the level of an individual's first contact with the health care system. PMID:16902697
The potential application of the blackboard model of problem solving to multidisciplinary design
NASA Technical Reports Server (NTRS)
Rogers, James L.
1989-01-01
The potential application of the blackboard model of problem solving to multidisciplinary design is discussed. Multidisciplinary design problems are complex, poorly structured, and lack a predetermined decision path from the initial starting point to the final solution. The final solution is achieved using data from different engineering disciplines. Ideally, for the final solution to be the optimum solution, there must be a significant amount of communication among the different disciplines plus intradisciplinary and interdisciplinary optimization. In reality, this is not what happens in today's sequential approach to multidisciplinary design. Therefore it is highly unlikely that the final solution is the true optimum solution from an interdisciplinary optimization standpoint. A multilevel decomposition approach is suggested as a technique to overcome the problems associated with the sequential approach, but no tool currently exists with which to fully implement this technique. A system based on the blackboard model of problem solving appears to be an ideal tool for implementing this technique because it offers an incremental problem solving approach that requires no a priori determined reasoning path. Thus it has the potential of finding a more optimum solution for the multidisciplinary design problems found in today's aerospace industries.
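The blackboard model the author proposes can be sketched in a few lines: independent knowledge sources watch a shared data store and contribute whenever their preconditions are met, so the reasoning path emerges at run time rather than being fixed in advance. A toy sketch with hypothetical aerodynamics and structures knowledge sources; the disciplines, formulas, and numbers are invented for illustration:

```python
class Blackboard:
    """Shared data store that knowledge sources read and write."""
    def __init__(self, **facts):
        self.data = dict(facts)

def aero_ks(bb):
    # Illustrative lift estimate: 0.5 * rho * v^2 * S * CL
    if "wing_area" in bb.data and "lift" not in bb.data:
        bb.data["lift"] = 0.5 * 1.2 * 50.0**2 * bb.data["wing_area"] * 0.4

def structures_ks(bb):
    # Depends on a fact another knowledge source must contribute first.
    if "lift" in bb.data and "spar_load" not in bb.data:
        bb.data["spar_load"] = bb.data["lift"] / 2

def run(bb, sources):
    """Fire any source whose preconditions hold, until quiescence."""
    changed = True
    while changed:
        before = dict(bb.data)
        for ks in sources:
            ks(bb)
        changed = bb.data != before

bb = Blackboard(wing_area=20.0)
run(bb, [structures_ks, aero_ks])   # registration order does not matter
print(bb.data["lift"], bb.data["spar_load"])
```

Note that `structures_ks` is listed first yet still runs after `aero_ks`: no a priori reasoning path is encoded, which is exactly the property the article argues suits multidisciplinary design.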
An illustration of whole systems thinking.
Kalim, Kanwal; Carson, Ewart; Cramp, Derek
2006-08-01
The complexity of policy-making in the NHS is such that systemic, holistic thinking is needed if the current government's plans are to be realized. This paper describes systems thinking and illustrates its value in understanding the complexity of the diabetes National Service Framework (NSF); its role in identifying problems and barriers previously not predicted; and in reaching conclusions as to how it should be implemented. The approach adopted makes use of soft systems methodology (SSM) devised by Peter Checkland. This analysis reveals issues relating to human communication, information provision and resource allocation needing to be addressed. From this, desirable and feasible changes are explored as means of achieving a more effective NSF, examining possible changes from technical, organizational, economic and cultural perspectives. As well as testing current health policies and plans, SSM can be used to test the feasibility of new health policies. This is achieved by providing a greater understanding and appreciation of what is happening in the real world and how people work. Soft systems thinking is the best approach, given the complexity of health care. It is a flexible, cost-effective solution, which should be a prerequisite before any new health policy is launched.
Applications of neural networks to landmark detection in 3-D surface data
NASA Astrophysics Data System (ADS)
Arndt, Craig M.
1992-09-01
The problem of identifying key landmarks in 3-dimensional surface data is of considerable interest in solving a number of difficult real-world tasks, including object recognition and image processing. The specific problem that we address in this research is to identify the specific landmarks (anatomical) in human surface data. This is a complex task, currently performed visually by an expert human operator. In order to replace these human operators and increase reliability of the data acquisition, we need to develop a computer algorithm which will utilize the interrelations between the 3-dimensional data to identify the landmarks of interest. The current presentation describes a method for designing, implementing, training, and testing a custom architecture neural network which will perform the landmark identification task. We discuss the performance of the net in relationship to human performance on the same task and how this net has been integrated with other AI and traditional programming methods to produce a powerful analysis tool for computer anthropometry.
Nuwer, M R; Sigsbee, B
1998-02-01
Medicare recently announced the adoption of minimum documentation criteria for the neurologic examination. These criteria are added to existing standards for the history and medical decision-making. These criteria will be used in compliance audits by Medicare and other payors. Given the current federal initiative to eliminate fraud in the Medicare program, all neurologists need to comply with these standards. These criteria are for documentation only. Neurologic standards of care require a more complex and diverse examination pertinent to the problem(s) under consideration. Further guidance as to the content of a neurologic evaluation is outlined in the article "Practice guidelines: Neurologic evaluation" (Neurology 1990; 40: 871). The level of history and examination required for specific services is defined in the American Medical Association current procedural terminology book. Documentation standards for examination of children are not yet defined.
Tulakin, A V; Tsyplakova, G V; Ampleeva, G P; Kozyreva, O N; Pivneva, O S; Trukhina, G M
Problems of the hygienic reliability of drinking water use in regions of the Russian Federation are reviewed in this article. Optimization of water use was shown to require consideration of regional peculiarities in the formation of water quality in groundwater and surface sources; the effectiveness of regional water-protection and water-treatment programs; and coordination between economic entities and oversight bodies in managing water quality on the basis of socio-hygienic monitoring. Regional problems requiring hygienic justification include complex hydrological, hydrogeological, climatic and geographical conditions; the severity of anthropogenic pollution of water-supply sources; the natural conditions shaping water quality; and the efficiency of water treatment. Water-quality monitoring needs improvement, including the use of computer technology that enables regional hygienic monitoring and spatial-temporal analysis of water quality, modelling of water-quality management, and prediction of the conditions of water use by the population in each region, taking into account the current health situation. The article demonstrates the practicability of the so-called complex multiple-barrier concept, which combines chemical oxidation with physical methods of drinking water preparation. Further development of legislation for the protection of water bodies from pollution is required, with elevation of the status of sanitary protection zones and timely revision of the regulatory framework establishing sanitary-epidemiological requirements for potable water and drinking water supply.
The problem of providing the population with safe drinking water requires a comprehensive solution within the framework of target programs adopted at the federal and regional levels.
The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems
ERIC Educational Resources Information Center
Andrews, Paul W.; Thomson, J. Anderson, Jr.
2009-01-01
Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis of the complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…
Amir, Lisa H; Jones, Lester E; Buck, Miranda L
2015-03-01
New mothers frequently experience breastfeeding problems, in particular nipple pain. This is often attributed to compression, skin damage, infection or dermatitis. To outline an integrated approach to breastfeeding pain assessment that seeks to enhance current practice. Our clinical reasoning model resolves the complexity of pain into three categories: local stimulation, external influences and central modulation. Tissue pathology, damage or inflammation leads to local stimulation of nociceptors. External influences such as creams and breast pumps, as well as factors related to the mother, the infant and the maternal-infant interaction, may exacerbate the pain. Central nervous system modulation includes the enhancement of nociceptive transmission at the spinal cord and modification of the descending inhibitory influences. A broad range of factors can modulate pain through central mechanisms including maternal illness, exhaustion, lack of support, anxiety, depression or history of abuse. General practitioners (GPs) can use this model to explain nipple pain in complex settings, thus increasing management options for women.
On the adaptive function of children's and adults' false memories.
Howe, Mark L; Wilkinson, Samantha; Garner, Sarah R; Ball, Linden J
2016-09-01
Recent research has shown that memory illusions can successfully prime both children's and adults' performance on complex, insight-based problems (compound remote associates tasks or CRATs). The current research aimed to clarify the locus of these priming effects. Like before, Deese-Roediger-McDermott (DRM) lists were selected to prime subsequent CRATs such that the critical lures were also the solution words to a subset of the CRATs participants attempted to solve. Unique to the present research, recognition memory tests were used and participants were either primed during the list study phase, during the memory test phase, or both. Across two experiments, primed problems were solved more frequently and significantly faster than unprimed problems. Moreover, when participants were primed during the list study phase, subsequent solution times and rates were considerably superior to those produced by those participants who were simply primed at test. Together, these are the first results to show that false-memory priming during encoding facilitates problem-solving in both children and adults.
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Lin, Lin
Many combinatorial optimization problems arising in real-world industrial engineering and operations research are very complex in nature and quite hard to solve by conventional techniques. Since the 1960s, there has been an increasing interest in imitating living beings to solve such hard combinatorial optimization problems. Simulating the natural evolutionary process of human beings results in stochastic optimization techniques called evolutionary algorithms (EAs), which can often outperform conventional optimization methods when applied to difficult real-world problems. In this survey paper, we provide a comprehensive survey of the current state of the art in the use of EAs in manufacturing and logistics systems. To demonstrate that EAs are powerful and broadly applicable stochastic search and optimization techniques, we address the following engineering design problems: transportation planning models, layout design models, and two-stage logistics models in logistics systems; and job-shop scheduling and resource-constrained project scheduling in manufacturing systems.
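As a deliberately tiny illustration of the evolutionary-algorithm machinery such surveys cover, the sketch below evolves bitstrings for the classic "one-max" toy problem (maximize the number of 1-bits). The problem, parameters, and operators are illustrative assumptions, not the survey's models; real EAs for logistics and scheduling use problem-specific encodings and operators.

```python
import random

def evolve_onemax(n_bits=20, pop_size=30, generations=60, seed=42):
    """Minimal genetic algorithm for the toy 'one-max' problem:
    maximize the number of 1-bits in a bitstring."""
    rng = random.Random(seed)
    fitness = sum  # fitness of a bitstring = number of 1s
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random individuals.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(n_bits):                  # bit-flip mutation
                if rng.random() < 1.0 / n_bits:
                    child[i] = 1 - child[i]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = evolve_onemax()
```

The same selection/crossover/mutation loop carries over to the scheduling and logistics problems in the survey once the bitstring is replaced by a permutation or routing encoding.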
NASA Astrophysics Data System (ADS)
Navas, Pedro; Sanavia, Lorenzo; López-Querol, Susana; Yu, Rena C.
2017-12-01
Solving dynamic problems for fluid-saturated porous media in the large-deformation regime is an interesting but complex issue. An implicit time integration scheme is herein developed within the framework of the u-w (solid displacement-relative fluid displacement) formulation of Biot's equations. In particular, water-saturated porous media are considered, and the linearization of the linear momentum equations taking into account all inertia terms for both the solid and fluid phases is presented for the first time. The spatial discretization is carried out through a meshfree method in which the shape functions are based on the principle of local maximum entropy (LME). The current methodology is first validated against the dynamic consolidation of a soil column and the plastic shear band formation in a square domain loaded by a rigid footing. The feasibility of this new numerical approach for solving large-deformation dynamic problems is finally demonstrated through its application to an embankment problem subjected to an earthquake.
NASA Technical Reports Server (NTRS)
Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.
1982-01-01
Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and database structure for three-dimensional computer codes, which will eliminate or improve on page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data at each step. As a result, the number of in-core grid points was increased by 50% to 150,000, with a 10% increase in execution time. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage-limited than compute-rate-limited, but advancements in both storage and speed are essential for realistic three-dimensional flow calculations.
PWHATSHAP: efficient haplotyping for future generation sequencing.
Bracciali, Andrea; Aldinucci, Marco; Patterson, Murray; Marschall, Tobias; Pisanti, Nadia; Merelli, Ivan; Torquati, Massimo
2016-09-22
Haplotype phasing is an important problem in the analysis of genomic information. Given a set of DNA fragments from an individual, it consists of determining which of the possible alleles (alternative forms of a gene) each fragment comes from. Haplotype information is relevant to gene regulation, epigenetics, genome-wide association studies, evolutionary and population studies, and the study of mutations. Haplotyping is currently addressed as an optimisation problem aiming at solutions that minimise, for instance, error correction costs, where costs are a measure of the confidence in the accuracy of the information acquired from DNA sequencing. Solutions typically have exponential computational complexity. WHATSHAP is a recent optimal approach that moves the computational complexity from DNA fragment length to fragment overlap, i.e., coverage, and is hence of particular interest given current trends in sequencing technology, which are producing longer fragments. Given the potential relevance of efficient haplotyping in several analysis pipelines, we have designed and engineered PWHATSHAP, a parallel, high-performance version of WHATSHAP. PWHATSHAP is embedded in a toolkit developed in Python and supports genomics datasets in standard file formats. Building on WHATSHAP, PWHATSHAP exhibits the same complexity, exploring a number of possible solutions that is exponential in the coverage of the dataset. The parallel implementation on multi-core architectures allows for a relevant reduction of the execution time for haplotyping, while the results enjoy the same high accuracy as those provided by WHATSHAP, which increases with coverage. Due to its structure and its management of large datasets, the parallelisation of WHATSHAP posed demanding technical challenges, which have been addressed by exploiting a high-level parallel programming framework.
The result, PWHATSHAP, is a freely available toolkit that improves the efficiency of the analysis of genomics information.
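The general parallelisation pattern behind such tools can be sketched as farming independent haplotype blocks out to worker threads. The per-block "solver" below is a toy majority-vote consensus, an assumption for illustration only; PWHATSHAP's actual per-block computation is the coverage-exponential dynamic program described above, and its parallel framework is not reproduced here.

```python
from concurrent.futures import ThreadPoolExecutor

def phase_block(block):
    """Toy stand-in for a per-block phasing routine: blocks of reads that
    share no variant positions are independent, so they can be processed
    in parallel. Here each read is a string over {'0','1'} and we return
    a per-column majority-vote consensus (purely illustrative)."""
    length = max(len(r) for r in block)
    consensus = []
    for i in range(length):
        ones = sum(1 for r in block if i < len(r) and r[i] == "1")
        zeros = sum(1 for r in block if i < len(r) and r[i] == "0")
        consensus.append("1" if ones >= zeros else "0")
    return "".join(consensus)

# Three independent blocks, farmed out to a worker pool.
blocks = [["0110", "0100", "0110"], ["111", "101"], ["00", "01", "00"]]
with ThreadPoolExecutor(max_workers=3) as pool:
    haplotypes = list(pool.map(phase_block, blocks))
```

Because the blocks share no data, the map over workers needs no synchronisation, which is what makes this decomposition attractive on multi-core architectures.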
Lp-Norm Regularization in Volumetric Imaging of Cardiac Current Sources
Rahimi, Azar; Xu, Jingjia; Wang, Linwei
2013-01-01
Advances in computer vision have substantially improved our ability to analyze the structure and mechanics of the heart. In comparison, our ability to observe and analyze cardiac electrical activity is much more limited. Progress in computationally reconstructing cardiac current sources from noninvasive voltage data sensed on the body surface has been hindered by the ill-posedness of the reconstruction problem and its lack of a unique solution. Common L2- and L1-norm regularizations tend to produce a solution that is either too diffuse or too scattered to reflect the complex spatial structure of the current source distribution in the heart. In this work, we propose a general regularization with an Lp-norm (1 < p < 2) constraint to bridge the gap and balance between an overly smeared and an overly focal solution in cardiac source reconstruction. In a set of phantom experiments, we demonstrate the superiority of the proposed Lp-norm method over its L1 and L2 counterparts in imaging cardiac current sources of increasing extent. Through computer-simulated and real-data experiments, we further demonstrate the feasibility of the proposed method in imaging the complex structure of the excitation wavefront, as well as current sources distributed along a postinfarction scar border. This ability to preserve the spatial structure of the source distribution is important for revealing potential disruptions to normal heart excitation. PMID:24348735
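The L1-focal versus L2-diffuse behaviour the abstract contrasts can be seen in a one-dimensional caricature: the minimizer of (x - b)^2 + lam*|x|^p sets weak coefficients exactly to zero for p = 1, only shrinks them for p = 2, and interpolates for 1 < p < 2. The grid-search solver and the numbers below are assumptions for illustration; the paper's reconstruction solves a full ill-posed inverse problem, not this scalar one.

```python
def shrink(b, lam, p, lo=-10.0, hi=10.0, steps=200000):
    """Scalar prototype of Lp-regularized estimation:
    argmin over x of (x - b)**2 + lam * |x|**p,
    found by brute-force grid search (illustrative only)."""
    best_x, best_f = lo, float("inf")
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        f = (x - b) ** 2 + lam * abs(x) ** p
        if f < best_f:
            best_x, best_f = x, f
    return best_x

# A weak coefficient b = 0.4 under penalty lam = 1:
# L1 zeroes it out (overly focal), L2 only shrinks it toward b/(1+lam)
# (overly smeared), and p = 1.5 lands in between.
weak = [round(shrink(0.4, 1.0, p), 2) for p in (1.0, 1.5, 2.0)]
```

The intermediate value for p = 1.5 is exactly the "bridge" behaviour the Lp-norm constraint is meant to provide between focal and smeared solutions.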
Understanding Wicked Problems: A Key to Advancing Environmental Health Promotion
ERIC Educational Resources Information Center
Kreuter, Marshall W.; De Rosa, Christopher; Howze, Elizabeth H.; Baldwin, Grant T.
2004-01-01
Complex environmental health problems--like air and water pollution, hazardous waste sites, and lead poisoning--are in reality a constellation of linked problems embedded in the fabric of the communities in which they occur. These kinds of complex problems have been characterized by some as "wicked problems" wherein stakeholders may have…
Prescott-Clements, Linda; Voller, Vicky; Bell, Mark; Nestors, Natasha; van der Vleuten, Cees P M
2017-01-01
The successful remediation of clinicians demonstrating poor performance in the workplace is essential to ensure the provision of safe patient care. Clinicians may develop performance problems for numerous reasons, including health, personal factors, the workplace environment, or outdated knowledge and skills. Performance problems are often complex, involving multifactorial issues encompassing knowledge, skills, and professional behaviors. It is important that (where possible and appropriate) clinicians are supported through effective remediation to return them to safe clinical practice. A review of the literature demonstrated that research into remediation is in its infancy, with little currently known about the effectiveness of remediation programs. Current strategies for the development of remediation programs are mostly "intuitive"; a few draw upon established theories to inform their approach. Similarly, although it has been established that identification of the nature and scope of performance problems through assessment is an essential first step in remediation, the need for a more widespread "diagnosis" of why the problems exist is emerging. These reasons for poor performance, particularly in the context of experienced practicing clinicians, are likely to affect the potential success of remediation and should be considered within the "diagnosis." A new model for diagnosing the performance problems of clinicians has been developed, using behavioral change theories to explore known barriers to successful remediation, such as insight, motivation, attitude, self-efficacy, and the working environment, in addition to addressing known deficits in knowledge and skills. This novel approach is described in this article. An initial feasibility study has demonstrated the acceptability and practical implementation of our model.
NOTES: a review of the technical problems encountered and their solutions.
Mintz, Yoav; Horgan, Santiago; Cullen, John; Stuart, David; Falor, Eric; Talamini, Mark A
2008-08-01
Natural orifice translumenal endoscopic surgery (NOTES) is currently being investigated and developed worldwide. In the past few years, multiple groups have confronted this challenge. Many technical problems are encountered in this technique due to the currently available tools for this approach. Some of the unique technical problems in NOTES include: blindly performed primary incisions; uncontrolled pneumoperitoneal pressure; no support for the endoscope in the abdominal cavity; inadequate vision; insufficient illumination; limited retraction and exposure; and the complexity of suturing and performing a safe anastomosis. In this paper, we review the problems encountered in NOTES and provide possible temporary solutions. Acute and survival studies were performed on 15 farm pigs. The hybrid technique approach (i.e., endoscopic surgery with the aid of laparoscopic vision) was used in all cases. Procedures performed included liver biopsies, bilateral tubal ligation, oophorectomy, cholecystectomy, splenectomy, and small bowel resection and anastomosis. All attempted procedures were successfully performed. New methods and techniques were developed to overcome the technical problems. Closure of the gastrotomy was achieved by T-bar sutures and by stapler closure of the stomach incision. Small bowel anastomosis was achieved by the dual-lumen NOTES technique. The hybrid technique serves as a temporary approach to aid in developing the NOTES technique. A rectal or vaginal port of entry enables and facilitates gastrointestinal NOTES by using available laparoscopic instruments. The common operations performed today in the laparoscopic fashion could probably be performed with the NOTES approach. The safety of these procedures, however, is yet to be determined.
From scientific literacy to sustainability literacy: An ecological framework for education
NASA Astrophysics Data System (ADS)
Colucci-Gray, Laura; Camino, Elena; Barbiero, Giuseppe; Gray, Donald
2006-03-01
In this paper, we report some reflections on science and education, in relation to teaching and research in the field of complex and controversial socio-environmental issues. Starting from an examination of the literature on the epistemological aspects of the science of controversial issues, and introducing the perspective of complexity, the article argues for a complexity of content, context, and method in understanding current problems. Focusing on a model of learning which includes dialogical and reflective approaches, the final part of the article reports on aspects of the authors' experimental practice with role-play for dealing with complex issues. The review of the literature and our experience of action research introduce a view of education which promotes young people's awareness of multiple points of view, an ability to establish relationships between processes, scales, and contexts which may be nonlinearly related, and practice with creative and nonviolent forms of interrelations with others. Such an approach in science education is coherent with a scenario of planetary sustainability based on ecological webs and equity principles.
Seo, Jung Hee; Mittal, Rajat
2010-01-01
A new sharp-interface immersed boundary method based approach for the computation of low-Mach number flow-induced sound around complex geometries is described. The underlying approach is based on a hydrodynamic/acoustic splitting technique where the incompressible flow is first computed using a second-order accurate immersed boundary solver. This is followed by the computation of sound using the linearized perturbed compressible equations (LPCE). The primary contribution of the current work is the development of a versatile, high-order accurate immersed boundary method for solving the LPCE in complex domains. This new method applies the boundary condition on the immersed boundary to a high-order by combining the ghost-cell approach with a weighted least-squares error method based on a high-order approximating polynomial. The method is validated for canonical acoustic wave scattering and flow-induced noise problems. Applications of this technique to relatively complex cases of practical interest are also presented. PMID:21318129
Complex biomarker discovery in neuroimaging data: Finding a needle in a haystack☆
Atluri, Gowtham; Padmanabhan, Kanchana; Fang, Gang; Steinbach, Michael; Petrella, Jeffrey R.; Lim, Kelvin; MacDonald, Angus; Samatova, Nagiza F.; Doraiswamy, P. Murali; Kumar, Vipin
2013-01-01
Neuropsychiatric disorders such as schizophrenia, bipolar disorder and Alzheimer's disease are major public health problems. However, despite decades of research, we currently have no validated prognostic or diagnostic tests that can be applied at an individual patient level. Many neuropsychiatric diseases are due to a combination of alterations that occur in a human brain rather than the result of localized lesions. While there is hope that newer imaging technologies such as functional and anatomic connectivity MRI or molecular imaging may offer breakthroughs, the single biomarkers that are discovered using these datasets are limited by their inability to capture the heterogeneity and complexity of most multifactorial brain disorders. Recently, complex biomarkers have been explored to address this limitation using neuroimaging data. In this manuscript we consider the nature of complex biomarkers being investigated in the recent literature and present techniques to find such biomarkers that have been developed in related areas of data mining, statistics, machine learning and bioinformatics. PMID:24179856
An Automated Approach to Very High Order Aeroacoustic Computations in Complex Geometries
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Goodrich, John W.
2000-01-01
Computational aeroacoustics requires efficient, high-resolution simulation tools. And for smooth problems, this is best accomplished with very high order in space and time methods on small stencils. But the complexity of highly accurate numerical methods can inhibit their practical application, especially in irregular geometries. This complexity is reduced by using a special form of Hermite divided-difference spatial interpolation on Cartesian grids, and a Cauchy-Kowalewslci recursion procedure for time advancement. In addition, a stencil constraint tree reduces the complexity of interpolating grid points that are located near wall boundaries. These procedures are used to automatically develop and implement very high order methods (>15) for solving the linearized Euler equations that can achieve less than one grid point per wavelength resolution away from boundaries by including spatial derivatives of the primitive variables at each grid point. The accuracy of stable surface treatments is currently limited to 11th order for grid aligned boundaries and to 2nd order for irregular boundaries.
Stewart, Simon; Riegel, Barbara; Thompson, David R
2016-02-01
There is clear evidence across the globe that the clinical complexity of patients presenting to hospital with the syndrome of heart failure is increasing - not only in terms of the presence of concurrent disease states, but with additional socio-demographic risk factors that complicate treatment. Management strategies that treat heart failure as the main determinant of health outcomes ignore the multiple and complex issues that will inevitably erode the efficacy and efficiency of current heart failure management programmes. This complex problem (or conundrum) requires a different way of thinking about the complex interactions that underpin poor outcomes in heart failure. In this context, we present the COordinated NUrse-led inteNsified Disease management for continuity of caRe for mUltiMorbidity in Heart Failure (CONUNDRUM-HF) matrix that may well inform future research and models of care to achieve better health outcomes in this rapidly increasing patient population. © The European Society of Cardiology 2015.
Associations between Prosocial and Problem Behavior from Early to Late Adolescence.
Padilla-Walker, Laura M; Memmott-Elison, Madison K; Coyne, Sarah M
2018-05-01
Though recent research has highlighted prosocial behavior as negatively associated with problem behavior during adolescence, we know little about how these variables might be associated longitudinally, whether there are bidirectional effects, and whether there might be different patterns of co-occurrence of behaviors for different individuals. Thus, the current study examined relations between prosocial and problem behaviors in three different ways in an attempt to better understand these associations. Participants included 500 adolescents recruited from a Northwestern state in the USA who took part in the study every year from age 12 to 18 (50% female, 67% European American). Growth curve analyses suggested that change in prosocial behavior was negatively associated with change in aggression and delinquency over time. A longitudinal panel model suggested that prosocial behavior and aggression were negatively associated bidirectionally, and that prosocial behavior was negatively associated with delinquency over time. Finally, mixture modeling conducted at ages 12, 15, and 18 revealed heterogeneity in the ways in which prosocial and problem behaviors co-occur. The discussion focuses on the complexity of interrelations between prosocial behavior and problem behavior across adolescence.
Processor farming in two-level analysis of historical bridge
NASA Astrophysics Data System (ADS)
Krejčí, T.; Kruis, J.; Koudelka, T.; Šejnoha, M.
2017-11-01
This contribution presents a processor farming method in connection with a multi-scale analysis. In this method, each macro-scale integration point or each finite element is connected with a certain meso-scale problem represented by an appropriate representative volume element (RVE). The solution of a meso-scale problem then provides the effective parameters needed at the macro-scale. Such an analysis is suitable for parallel computing because the meso-scale problems can be distributed among many processors. The application of the processor farming method to a real-world masonry structure is illustrated by an analysis of Charles Bridge in Prague. The three-dimensional numerical model simulates the coupled heat and moisture transfer of one half of arch No. 3 and is part of a complex hygro-thermo-mechanical analysis which has been developed to determine the influence of climatic loading on the current state of the bridge.
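The farming pattern described above can be sketched as a map of independent meso-scale solves over a worker pool. The per-RVE "solve" below is a toy homogenization (a harmonic mean of local conductivities), an assumption standing in for the full RVE boundary-value problem; the bridge analysis itself distributes far larger coupled problems among processors.

```python
from concurrent.futures import ThreadPoolExecutor

def solve_rve(local_conductivities):
    """Toy meso-scale solve: 'homogenize' one representative volume
    element by returning an effective conductivity as the harmonic mean
    of its local values (purely illustrative)."""
    n = len(local_conductivities)
    return n / sum(1.0 / k for k in local_conductivities)

# One meso-problem per macro integration point, farmed out to workers,
# mirroring how the processor farming scheme distributes RVEs.
rves = [[1.0, 2.0], [2.0, 2.0], [1.0, 4.0]]
with ThreadPoolExecutor(max_workers=3) as pool:
    effective = list(pool.map(solve_rve, rves))
```

Each effective parameter then feeds back into its macro-scale integration point, so the macro solve proceeds once all workers return.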
Three case studies of the GasNet model in discrete domains.
Santos, C L; de Oliveira, P P; Husbands, P; Souza, C R
2001-06-01
A new neural network model, the GasNet, has recently been reported in the literature which, in addition to the traditional electrical, point-to-point communication between units, also uses communication through a diffusible chemical modulator. Here we assess the applicability of this model in three different scenarios: the XOR problem, a food-gathering task for a simulated robot, and a docking task for a virtual spaceship. All of them represent discrete domains, in contrast with the essentially continuous domain in which the GasNet was originally introduced. These scenarios are well-known benchmark problems from the literature and, since they exhibit varying degrees of complexity, they impose distinct performance demands on the GasNet. The experiments were primarily intended to better understand the model by extending the original problem domain in which the GasNet was introduced. The results reported point to some difficulties with the current GasNet model.
Simultaneous personnel and vehicle shift scheduling in the waste management sector.
Ghiani, Gianpaolo; Guerriero, Emanuela; Manni, Andrea; Manni, Emanuele; Potenza, Agostino
2013-07-01
Urban waste management is becoming an increasingly complex task, absorbing a huge amount of resources and having a major environmental impact. The design of a waste management system consists of various activities, one of which is the definition of shift schedules for both personnel and vehicles. This activity has a great impact on tactical and operational costs for companies. In this paper, we propose an integer programming model to find an optimal solution to the integrated problem. The aim is to determine optimal schedules at minimum cost. Moreover, we design a fast and effective heuristic for large-size problems. Both approaches are tested on data from a real-world case in Southern Italy and compared to the current practice of the company managing the service, showing that simultaneously solving these problems can lead to significant monetary savings. Copyright © 2013 Elsevier Ltd. All rights reserved.
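The benefit of solving vehicle and personnel scheduling jointly can be sketched on a hypothetical mini-instance: assigning three vehicles to two shifts where crew (overtime) costs depend on how the vehicle assignment balances the shifts, so neither subproblem can be optimized alone. All numbers are invented for illustration; the paper's model is a full integer program solved at realistic scale, not this exhaustive search.

```python
from itertools import product

VEHICLE_SHIFT_COST = [  # cost[v][s]: vehicle v operating in shift s
    [10, 12],
    [11, 10],
    [9, 14],
]

def crew_cost(assignment):
    # Overtime penalty: 5 per vehicle beyond the first in any shift,
    # coupling the personnel cost to the vehicle-shift decision.
    cost = 0
    for s in (0, 1):
        n = sum(1 for a in assignment if a == s)
        cost += 5 * max(0, n - 1)
    return cost

def total_cost(assignment):
    vehicle = sum(VEHICLE_SHIFT_COST[v][s] for v, s in enumerate(assignment))
    return vehicle + crew_cost(assignment)

# Exhaustive search over all 2**3 joint schedules; an ILP solver
# replaces this loop at realistic instance sizes.
best = min(product((0, 1), repeat=3), key=total_cost)
```

Note that the cheapest vehicle-only assignment would put vehicles 0 and 2 in shift 0 and vehicle 1 in shift 1 regardless of crews; here that happens to survive the crew penalty, but changing the overtime rate shifts the joint optimum, which is exactly why the integrated model can beat sequential planning.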
Engineering Antifragile Systems: A Change In Design Philosophy
NASA Technical Reports Server (NTRS)
Jones, Kennie H.
2014-01-01
While technology has made astounding advances in the last century, problems are confronting the engineering community that must be solved. The cost and schedule of producing large systems are increasing at an unsustainable rate, and these systems often do not perform as intended. New systems are required that may not be achievable by current methods. To solve these problems, NASA is working to infuse concepts from Complexity Science into the engineering process. Some of these problems may be solved by a change in design philosophy. Instead of designing systems to meet known requirements, which will always lead to systems that are fragile to some degree, systems should be designed wherever possible to be antifragile: designing cognitive cyberphysical systems that can learn from their experience, adapt to unforeseen events in their environment, and grow stronger in the face of adversity. Several examples are presented of ongoing research efforts to employ this philosophy.
Cognitive Behavioural Suicide Prevention for Male Prisoners: Case examples
Pratt, Daniel; Gooding, Patricia; Awenat, Yvonne; Eccles, Steve; Tarrier, Nicholas
2015-01-01
Suicide is a serious public health problem but a problem that is preventable. This complex and challenging problem is particularly prevalent amongst prisoners; associated with a five-fold increase in risk compared to the general community. Being in prison can lead people to experience fear, distrust, lack of control, isolation, and shame, which is often experienced as overwhelming and intolerable with some choosing suicide as a way to escape. Few effective psychological interventions exist to prevent suicide although cognitive behaviour therapies appear to offer some promise. Offering cognitive behaviour suicide prevention (CBSP) therapy to high risk prisoners may help to reduce the likelihood of preventable self-inflicted deaths. In this paper we present three cases drawn from a randomised controlled trial designed to investigate the feasibility of CBSP for male prisoners. Implications of the current findings for future research and clinical practice are considered. PMID:27713616
Preparing for Complexity and Wicked Problems through Transformational Learning Approaches
ERIC Educational Resources Information Center
Yukawa, Joyce
2015-01-01
As the information environment becomes increasingly complex and challenging, Library and Information Studies (LIS) education is called upon to nurture innovative leaders capable of managing complex situations and "wicked problems." While disciplinary expertise remains essential, higher levels of mental complexity and adaptive…
Complexity in Nature and Society: Complexity Management in the Age of Globalization
NASA Astrophysics Data System (ADS)
Mainzer, Klaus
The theory of nonlinear complex systems has become a proven problem-solving approach in the natural sciences, from cosmic and quantum systems to cellular organisms and the brain. Even in modern engineering science, self-organizing systems are developed to manage complex networks and processes. It is now recognized that many of our ecological, social, economic, and political problems are also of a global, complex, and nonlinear nature. What are the laws of sociodynamics? Is there a socio-engineering of nonlinear problem solving? What can we learn from nonlinear dynamics for complexity management in social, economic, financial and political systems? Is self-organization an acceptable strategy to handle the challenges of complexity in firms, institutions and other organizations? It is a main thesis of the talk that nature and society are basically governed by nonlinear and complex information dynamics. How computational is sociodynamics? What can we hope for in social, economic and political problem solving in the age of globalization?
G.A.M.E.: GPU-accelerated mixture elucidator.
Schurz, Alioune; Su, Bo-Han; Tu, Yi-Shu; Lu, Tony Tsung-Yu; Lin, Olivia A; Tseng, Yufeng J
2017-09-15
GPU acceleration is useful in solving complex chemical information problems. Identifying unknown structures from the mass spectra of natural product mixtures has been a desirable yet unresolved issue in metabolomics. However, this elucidation process has been hampered by complex experimental data and the inability of instruments to completely separate different compounds. Fortunately, with current high-resolution mass spectrometry, one feasible strategy is to define this problem as extending a scaffold database with sidechains of different probabilities to match the high-resolution mass obtained from a high-resolution mass spectrum. By introducing a dynamic programming (DP) algorithm, it is possible to solve this NP-complete problem in pseudo-polynomial time. However, the running time of the DP algorithm grows by orders of magnitude as the number of mass decimal digits increases, thus limiting the boost in structural prediction capabilities. By harnessing the heavily parallel architecture of modern GPUs, we designed a compute unified device architecture (CUDA)-based GPU-accelerated mixture elucidator (G.A.M.E.) that considerably improves the performance of the DP, allowing up to five decimal digits for input mass data. As exemplified by four testing datasets with verified constitutions from natural products, G.A.M.E. allows for efficient and automatic structural elucidation of unknown mixtures in practical procedures.
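The pseudo-polynomial DP the abstract mentions can be illustrated with an unbounded subset-sum over integer-scaled masses: the table size is proportional to 10^decimals times the target mass, which is why each extra decimal digit multiplies the running time by an order of magnitude. The masses and the decision below are toy assumptions, not G.A.M.E.'s actual scaffold-extension algorithm or its GPU kernel.

```python
def can_compose(target_mass, fragment_masses, decimals=2):
    """Pseudo-polynomial DP deciding whether some multiset of fragment
    masses sums to the target mass (an unbounded subset-sum). Masses
    are scaled to integers by 10**decimals, so the DP table grows by
    10x per extra decimal digit of mass precision."""
    scale = 10 ** decimals
    target = round(target_mass * scale)
    frags = [round(m * scale) for m in fragment_masses]
    reachable = [False] * (target + 1)
    reachable[0] = True  # the empty combination has mass 0
    for t in range(1, target + 1):
        reachable[t] = any(f <= t and reachable[t - f] for f in frags)
    return reachable[target]

# Can these masses be built from sidechains of 14.02 and 18.03 Da?
hit = can_compose(60.09, [14.02, 18.03])   # 3 x 14.02 + 1 x 18.03
miss = can_compose(60.10, [14.02, 18.03])  # no combination matches
```

Each row of the table depends only on earlier entries, which is the independence a GPU implementation exploits when it evaluates many candidate masses in parallel.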
Cultural differences in complex addition: efficient Chinese versus adaptive Belgians and Canadians.
Imbo, Ineke; LeFevre, Jo-Anne
2009-11-01
In the present study, the authors tested the effects of working-memory load on math problem solving in 3 different cultures: Flemish-speaking Belgians, English-speaking Canadians, and Chinese-speaking Chinese currently living in Canada. Participants solved complex addition problems (e.g., 58 + 76) in no-load and working-memory load conditions, in which either the central executive or the phonological loop was loaded. The authors used the choice/no-choice method to obtain unbiased measures of strategy selection and strategy efficiency. The Chinese participants were faster than the Belgians, who were faster and more accurate than the Canadians. The Chinese also required fewer working-memory resources than did the Belgians and Canadians. However, the Chinese chose less adaptively from the available strategies than did the Belgians and Canadians. These cultural differences in math problem solving are likely the result of different instructional approaches during elementary school (practice and training in Asian countries vs. exploration and flexibility in non-Asian countries), differences in the number language, and informal cultural norms and standards. The relevance of being adaptive is discussed as well as the implications of the results in regards to the strategy choice and discovery simulation model of strategy selection (J. Shrager & R. S. Siegler, 1998).
NASA Astrophysics Data System (ADS)
Chernyak, Vladimir Y.; Chertkov, Michael; Bierkens, Joris; Kappen, Hilbert J.
2014-01-01
In stochastic optimal control (SOC) one minimizes the average cost-to-go, that consists of the cost-of-control (amount of efforts), cost-of-space (where one wants the system to be) and the target cost (where one wants the system to arrive), for a system participating in forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We propose derivation of the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest hydrodynamic interpretation and discuss examples, e.g., ergodic control of a particle-within-a-circle, illustrating non-equilibrium space-time complexity.
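For orientation, the standard SOC objective that this work extends can be sketched as follows; the notation (R for the control cost, V for the cost-of-space, φ for the target cost) is an illustrative choice in the path-integral-control style, not necessarily the authors' exact symbols:

```latex
% Hedged sketch: the standard SOC cost-to-go before the authors' extension.
J(x,t) = \min_{u}\;\mathbb{E}\!\left[\phi(x_T)
  + \int_t^T \Big(\tfrac{1}{2}\,u_s^{\top} R\, u_s + V(x_s,s)\Big)\,ds\right],
\qquad dx_s = f(x_s,s)\,ds + g(x_s,s)\big(u_s\,ds + d\xi_s\big),
% whose Hamilton-Jacobi-Bellman equation reads
-\partial_t J = \min_{u}\Big(\tfrac{1}{2}\,u^{\top} R\,u + V
  + (f + g u)^{\top}\nabla_x J
  + \tfrac{1}{2}\operatorname{Tr}\big(g\,\nu\,g^{\top}\nabla_x^{2} J\big)\Big).
```

Here ½uᵀRu is the cost-of-control, V the cost-of-space, and φ the target cost; the paper's generalization adds a vector-potential "cost-of-dynamics" term to this functional while keeping the resulting HJB equation gauge-invariant.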
The artificial object detection and current velocity measurement using SAR ocean surface images
NASA Astrophysics Data System (ADS)
Alpatov, Boris; Strotov, Valery; Ershov, Maksim; Muraviev, Vadim; Feldman, Alexander; Smirnov, Sergey
2017-10-01
Because the water surface covers wide areas, remote sensing is the most appropriate way of getting information about the ocean environment for vessel tracking, security purposes, ecological studies and other applications. Processing of synthetic aperture radar (SAR) images is extensively used for control and monitoring of the ocean surface. Image data can be acquired from Earth observation satellites, such as TerraSAR-X, ERS, and COSMO-SkyMed. Thus, SAR image processing can be used to solve many problems arising in this field of research. This paper discusses some of them, including ship detection, oil pollution control and ocean currents mapping. Due to the complexity of the problem, several specialized algorithms need to be developed. The oil spill detection algorithm consists of the following main steps: image preprocessing, detection of dark areas, parameter extraction and classification. The ship detection algorithm consists of the following main steps: prescreening, land masking, image segmentation combined with parameter measurement, ship orientation estimation and object discrimination. The proposed approach to ocean currents mapping is based on Doppler's law. The results of computer modeling on real SAR images are presented. Based on these results, it is concluded that the proposed approaches can be used in maritime applications.
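As a rough flavor of the "detection of dark areas" step (oil films damp capillary waves and appear dark in SAR backscatter), here is a deliberately simplified thresholding sketch; the threshold rule, factor, and pixel values are assumptions for illustration, not the paper's algorithm:

```python
# Illustrative sketch only (not the authors' method): flag pixels whose
# backscatter intensity falls below a fraction of the scene mean.

def dark_area_mask(image, factor=0.5):
    """Return a binary mask marking pixels darker than factor * scene mean."""
    pixels = [p for row in image for p in row]
    threshold = factor * (sum(pixels) / len(pixels))
    return [[1 if p < threshold else 0 for p in row] for row in image]

scene = [
    [200, 210, 205],
    [198,  40,  45],   # dark patch: oil-spill candidate
    [202,  42, 207],
]
mask = dark_area_mask(scene)
```

A real pipeline would follow this with the parameter-extraction and classification stages the abstract lists, to separate oil slicks from look-alikes such as low-wind areas.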
NASA Technical Reports Server (NTRS)
Benyo, Theresa L.; Jones, William H.
2005-01-01
The development of new ideas is the essence of scientific research. This is frequently done by developing models of physical processes and comparing model predictions with results from experiments. With models becoming ever more complex and data acquisition systems becoming more powerful, the researcher is burdened with wading through data ranging in volume up to many terabytes and beyond. These data often come from multiple, heterogeneous sources, and the methods for searching through them are usually at or near the manual level. In addition, current documentation methods are generally limited to researchers' pen-and-paper-style notebooks. Researchers may want to form constraint-based queries on a body of existing knowledge that is, itself, distributed over many different machines and environments, and from the results of such queries then spawn additional queries, simulations, and data analyses in order to discover new insights into the problem being investigated. Currently, researchers are restricted to working within the boundaries of tools that are inefficient at probing current and legacy data to extend the knowledge of the problem at hand and reveal innovative and efficient solutions. A framework called the Project Integration Architecture is discussed that can address these desired functionalities.
A Knowledge-Based and Model-Driven Requirements Engineering Approach to Conceptual Satellite Design
NASA Astrophysics Data System (ADS)
Dos Santos, Walter A.; Leonor, Bruno B. F.; Stephany, Stephan
Satellite systems are becoming even more complex, making technical issues a significant cost driver. The increasing complexity of these systems makes requirements engineering activities both more important and difficult. Additionally, today's competitive pressures and other market forces drive manufacturing companies to improve the efficiency with which they design and manufacture space products and systems. This imposes a heavy burden on systems-of-systems engineering skills and particularly on requirements engineering which is an important phase in a system's life cycle. When this is poorly performed, various problems may occur, such as failures, cost overruns and delays. One solution is to underpin the preliminary conceptual satellite design with computer-based information reuse and integration to deal with the interdisciplinary nature of this problem domain. This can be attained by taking a model-driven engineering approach (MDE), in which models are the main artifacts during system development. MDE is an emergent approach that tries to address system complexity by the intense use of models. This work outlines the use of SysML (Systems Modeling Language) and a novel knowledge-based software tool, named SatBudgets, to deal with these and other challenges confronted during the conceptual phase of a university satellite system, called ITASAT, currently being developed by INPE and some Brazilian universities.
Word problems: a review of linguistic and numerical factors contributing to their difficulty
Daroczy, Gabriella; Wolska, Magdalena; Meurers, Walt Detmar; Nuerk, Hans-Christoph
2015-01-01
Word problems (WPs) belong to the most difficult and complex problem types that pupils encounter during their elementary-level mathematical development. In the classroom setting, they are often viewed as merely arithmetic tasks; however, recent research shows that a number of linguistic verbal components not directly related to arithmetic contribute greatly to their difficulty. In this review, we will distinguish three components of WP difficulty: (i) the linguistic complexity of the problem text itself, (ii) the numerical complexity of the arithmetic problem, and (iii) the relation between the linguistic and numerical complexity of a problem. We will discuss the impact of each of these factors on WP difficulty and motivate the need for a high degree of control in stimuli design for experiments that manipulate WP difficulty for a given age group. PMID:25883575
NASA Astrophysics Data System (ADS)
Hanke, U.; Modler, K.-H.; Neumann, R.; Fischer, C.
The objective of this paper is to simplify a very complex guidance mechanism currently used for lid separation in a packaging machine. The task of this machine is to pick up a lid from a magazine file, rotate it by 180° and place it on tins. The existing mechanism works successfully but has a very complex construction: it consists of a planetary cam mechanism combined with a toothed gear (with a constant transmission ratio) and a guiding mechanism with a toothed belt and circular pulleys. Such complex constructions are very common in industrial solutions. The idea of the authors is to show a much simpler design that solves the same problem. They developed a guidance mechanism realizing the same function, consisting only of a toothed belt with non-circular pulleys. All parts used are common trade articles.
Kelemen, Arpad; Vasilakos, Athanasios V; Liang, Yulan
2009-09-01
Comprehensive evaluation of common genetic variations through association of single-nucleotide polymorphism (SNP) structure with common complex disease in the genome-wide scale is currently a hot area in human genome research due to the recent development of the Human Genome Project and HapMap Project. Computational science, which includes computational intelligence (CI), has recently become the third method of scientific enquiry besides theory and experimentation. There have been fast growing interests in developing and applying CI in disease mapping using SNP and haplotype data. Some of the recent studies have demonstrated the promise and importance of CI for common complex diseases in genomic association study using SNP/haplotype data, especially for tackling challenges, such as gene-gene and gene-environment interactions, and the notorious "curse of dimensionality" problem. This review provides coverage of recent developments of CI approaches for complex diseases in genetic association study with SNP/haplotype data.
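As a minimal, hedged illustration of the basic association-testing step that genome-wide SNP studies build on (this is a classical statistic, not one of the CI methods the review surveys, and the counts are invented): a Pearson chi-square on a 2x2 allele-count table comparing cases and controls.

```python
# Hedged sketch: Pearson chi-square statistic for a single-SNP association
# test on a 2x2 allele-count table (not from the reviewed methods).

def chi_square_2x2(a, b, c, d):
    """a, b = minor/major allele counts in cases; c, d = in controls.
    Assumes all row and column margins are nonzero."""
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Balanced table -> no association signal; skewed table -> larger statistic.
print(chi_square_2x2(50, 50, 50, 50), chi_square_2x2(60, 40, 40, 60))
```

The "curse of dimensionality" the review mentions arises because a genome-wide scan repeats such tests (or far richer models) over hundreds of thousands of SNPs and their pairwise interactions.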
NASA Astrophysics Data System (ADS)
Watson, Brett; Yeo, Leslie; Friend, James
2010-06-01
Making use of mechanical resonance has many benefits for the design of microscale devices. A key to successfully incorporating this phenomenon in the design of a device is to understand how the resonant frequencies of interest are affected by changes to the geometric parameters of the design. For simple geometric shapes, this is quite easy, but for complex nonlinear designs, it becomes significantly more complex. In this paper, two novel modeling techniques are demonstrated to extract the axial and torsional resonant frequencies of a complex nonlinear geometry. The first decomposes the complex geometry into easy to model components, while the second uses scaling techniques combined with the finite element method. Both models overcome problems associated with using current analytical methods as design tools, and enable a full investigation of how changes in the geometric parameters affect the resonant frequencies of interest. The benefit of such models is then demonstrated through their use in the design of a prototype piezoelectric ultrasonic resonant micromotor which has improved performance characteristics over previous prototypes.
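For the "simple geometric shapes" case the authors contrast with, the axial resonance of a uniform free-free rod has a textbook closed form, f_n = (n/2L)·√(E/ρ). A hedged sketch follows; the titanium material values are approximate assumptions for illustration, not parameters from the paper.

```python
# Hedged sketch: closed-form axial resonance of a uniform free-free rod.
import math

def axial_resonant_freq(length_m, youngs_modulus_pa, density_kg_m3, mode=1):
    """f_n = (n / 2L) * sqrt(E / rho) for the n-th axial mode."""
    wave_speed = math.sqrt(youngs_modulus_pa / density_kg_m3)  # bar wave speed
    return mode * wave_speed / (2.0 * length_m)

# Example: a 10 mm titanium rod (E ~ 105 GPa, rho ~ 4506 kg/m^3)
f1 = axial_resonant_freq(0.010, 105e9, 4506)
```

For complex nonlinear geometries no such formula exists, which is why the paper resorts to component decomposition and scaled finite element models.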
ERIC Educational Resources Information Center
Blackburn, J. Joey; Robinson, J. Shane; Lamm, Alexa J.
2014-01-01
The purpose of this experimental study was to determine the effects of cognitive style and problem complexity on Oklahoma State University preservice agriculture teachers' (N = 56) ability to solve problems in small gasoline engines. Time to solution was operationalized as problem solving ability. Kirton's Adaption-Innovation Inventory was…
Mental workload in decision and control
NASA Technical Reports Server (NTRS)
Sheridan, T. B.
1979-01-01
This paper briefly reviews the problems of defining and measuring the 'mental workload' of aircraft pilots and other human operators of complex dynamic systems. Of the alternative approaches the author indicates a clear preference for the use of subjective scaling. Some recent experiments from MIT and elsewhere are described which utilize subjective mental workload scales in conjunction with human decision and control tasks in the laboratory. Finally a new three-dimensional mental workload rating scale, under current development for use by IFR aircraft pilots, is presented.
Antitumor Activity of Monoterpenes Found in Essential Oils
Sobral, Marianna Vieira; Xavier, Aline Lira; Lima, Tamires Cardoso; de Sousa, Damião Pergentino
2014-01-01
Cancer is a complex genetic disease that is a major public health problem worldwide, accounting for about 7 million deaths each year. Many anticancer drugs currently used clinically have been isolated from plant species or are based on such substances. Accumulating data has revealed anticancer activity in plant-derived monoterpenes. In this review the antitumor activity of 37 monoterpenes found in essential oils is discussed. Chemical structures, experimental models, and mechanisms of action for bioactive substances are presented. PMID:25401162
Breakdowns in Coordination Between Air Traffic Controllers
NASA Technical Reports Server (NTRS)
Bearman, Chris; Orasanu, Judith; Miller, Ronald C.
2011-01-01
This talk outlines the complexity of coordination in air traffic control, introduces the NextGen technologies, identifies common causes of coordination breakdowns in air traffic control and examines whether these causes are likely to be reduced with the introduction of NextGen technologies. While some of the common causes of breakdowns will be reduced in a NextGen environment, this conclusion should be drawn carefully, given the current stage of development of the technologies and the observation that new technologies often shift problems rather than reduce them.
Abat, F; Alfredson, H; Cucchiarini, M; Madry, H; Marmotti, A; Mouton, C; Oliveira, J M; Pereira, H; Peretti, G M; Romero-Rodriguez, D; Spang, C; Stephen, J; van Bergen, C J A; de Girolamo, L
2017-12-01
Chronic tendinopathies represent a major problem in the clinical practice of sports orthopaedic surgeons, sports doctors and other health professionals involved in the treatment of athletes and patients that perform repetitive actions. The lack of consensus relative to the diagnostic tools and treatment modalities represents a management dilemma for these professionals. With this review, the purpose of the ESSKA Basic Science Committee is to establish guidelines for understanding, diagnosing and treating this complex pathology.
Perspectives on the neuroscience of cognition and consciousness.
Werner, Gerhard
2007-01-01
The origin and current use of the concepts of computation, representation and information in Neuroscience are examined and conceptual flaws are identified which vitiate their usefulness for addressing the problem of the neural basis of Cognition and Consciousness. In contrast, a convergence of views is presented to support the characterization of the Nervous System as a complex dynamical system operating in a metastable regime, and capable of evolving to configurations and transitions in phase space with potential relevance for Cognition and Consciousness.
A convolutional neural network neutrino event classifier
Aurisano, A.; Radovic, A.; Rocco, D.; ...
2016-09-01
Here, convolutional neural networks (CNNs) have been widely applied in the computer vision community to solve complex problems in image recognition and analysis. We describe an application of the CNN technology to the problem of identifying particle interactions in sampling calorimeters used commonly in high energy physics and high energy neutrino physics in particular. Following a discussion of the core concepts of CNNs and recent innovations in CNN architectures related to the field of deep learning, we outline a specific application to the NOvA neutrino detector. This algorithm, CVN (Convolutional Visual Network), identifies neutrino interactions based on their topology without the need for detailed reconstruction and outperforms algorithms currently in use by the NOvA collaboration.
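The building block that a network like CVN relies on is the discrete 2D convolution. Below is a self-contained, framework-free sketch of a single "valid" convolution; it is illustrative only (CVN is of course a full deep network, and these pixel values and the kernel are invented):

```python
# Hedged sketch: one "valid" 2D convolution, the core CNN operation.

def conv2d_valid(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(
                image[i + u][j + v] * kernel[u][v]
                for u in range(kh) for v in range(kw)
            )
    return out

# A vertical-edge kernel responds only where pixel values change:
edge = conv2d_valid(
    [[0, 0, 1, 1],
     [0, 0, 1, 1],
     [0, 0, 1, 1]],
    [[-1, 1],
     [-1, 1]],
)
```

The same locality-plus-weight-sharing principle is what lets a CNN pick out track and shower topologies in calorimeter images without explicit reconstruction.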
Some human factors issues in the development and evaluation of cockpit alerting and warning systems
NASA Technical Reports Server (NTRS)
Randle, R. J., Jr.; Larsen, W. E.; Williams, D. H.
1980-01-01
A set of general guidelines is provided for evaluating a newly developed cockpit alerting and warning system in terms of human factors issues. Although the discussion centers around a general methodology, it is directed specifically at the issues involved in alerting systems. An overall statement of the current operational problem is presented. Human factors problems with reference to existing alerting and warning systems are described. The methodology for proceeding through system development to system test is discussed. The differences between traditional human factors laboratory evaluations and those required for the evaluation of complex man-machine systems under development are emphasized. Performance evaluation of the alerting and warning subsystem using a hypothetical sample system is explained.
NASA Technical Reports Server (NTRS)
Lucero, John
2016-01-01
The presentation will provide an overview of the fundamentals and principles of Systems Engineering (SE). This includes understanding the processes that are used to assist the engineer in a successful design, build and implementation of solutions. The context of this presentation will be to describe the involvement of SE throughout the life-cycle of a project from cradle to grave. Due to the ever-growing number of complex technical problems facing our world, a Systems Engineering approach is desirable for many reasons. Given the interdisciplinary technical structure of current systems, the technical processes representing System Design, Technical Management and Product Realization are instrumental in the development and integration of new technologies into mainstream applications. This tutorial will demonstrate the application of SE tools to these types of problems.
On the Complexity of Delaying an Adversary’s Project
2005-01-01
We present interdiction models for such problems and show that the resulting problem complexities run the gamut: polynomially solvable, weakly NP-complete, strongly NP-complete or NP-hard.
Solving the Inverse-Square Problem with Complex Variables
ERIC Educational Resources Information Center
Gauthier, N.
2005-01-01
The equation of motion for a mass that moves under the influence of a central, inverse-square force is formulated and solved as a problem in complex variables. To find the solution, the constancy of angular momentum is first established using complex variables. Next, the complex position coordinate and complex velocity of the particle are assumed…
Bovine pasteurellosis and other bacterial infections of the respiratory tract.
Griffin, Dee
2010-03-01
Despite technological, biologic, and pharmacologic advances, the bacterial component of the bovine respiratory disease (BRD) complex continues to have a major adverse effect on the health and wellbeing of stocker and feeder cattle. Overlooked in this disappointing assessment is evaluation of the effects that working with younger, lighter-weight cattle has on managing the bacterial component of the BRD complex. Most problems associated with BRD come from cattle taken from and commingled with cattle operations that have inconsistent or nonexistent cattle health management. This article reviews the biologic, clinical, and management aspects of Pasteurella multocida, Mannheimia haemolytica, Histophilus somni, and Mycoplasma bovis, primarily as related to current production management considerations of stocker and feeder cattle.
Magnetically tunable 1D Coulomb drag: Theory
NASA Astrophysics Data System (ADS)
Tylan-Tyler, Anthony; Tang, Yuhe; Levy, Jeremy
In this work, we examine the Coulomb drag effect in 1D nanowires in close proximity, focusing on experimental parameters relevant to complex-oxide nanostructures. Previous work on this problem examined Coulomb drag through quantum point contacts, where effective capacitive coupling between the 2D leads of the system generates the drag voltage. In our case, the entire system is composed of 1D components and thus a more careful treatment of the Coulomb interactions is required. This more complex environment then leads to the ability to switch the drag voltage by an applied magnetic field without altering the current supplied to the drive system. We gratefully acknowledge financial support from ONR N00014-15-1-2847 and DOE DE-SC0014417.
van der Wilt, Gert Jan; Kievit, Wietske; Oortwijn, Wija
2017-01-01
A central idea underlying the INTEGRATE-HTA project is that many of the interventions being used in health care are quite complex. By this, we mean that the relation between the delivery of the intervention on the one hand and the onset of (desired and undesired) changes on the other may be less straightforward than hoped for. There may be all sorts of reasons for this, ranging from a lack of resources or skills to perverse incentives and organizational problems. Not identifying such factors and their potential impact may seriously compromise the policy relevance of a health technology assessment (HTA) (1). However, current approaches and methods in HTA do not seem to be adequately geared to deal with this complexity.
Are middle school mathematics teachers able to solve word problems without using variable?
NASA Astrophysics Data System (ADS)
Gökkurt Özdemir, Burçin; Erdem, Emrullah; Örnek, Tuğba; Soylu, Yasin
2018-01-01
Many people consider problem solving to be a complex process in which variables such as x and y are used. However, problems need not be solved only by using variables; problem solving can be rationalized and made easier using practical strategies. Especially when the development of children at younger ages is considered, it is clear that mathematics teachers should be able to solve problems through concrete processes. In this context, middle school mathematics teachers' skills in solving word problems without using variables were examined in the current study. Through the case study method, this study was conducted with 60 middle school mathematics teachers who have different professional experiences in five provinces in Turkey. A test consisting of five open-ended word problems was used as the data collection tool. The content analysis technique was used to analyze the data. As a result of the analysis, it was seen that most of the teachers used a trial-and-error strategy or an area model as the solution strategy. On the other hand, teachers who solved the problems using variables such as x, a, n or symbols such as Δ, □, ○, *, and who fell into error by considering these solutions to be variable-free, were also observed in the study.
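A hedged sketch of what the trial-and-error strategy favored by these teachers looks like in executable form, on an invented example problem ("the sum of two consecutive numbers is 73"): instead of setting up x + (x + 1) = 73, candidates are tested systematically.

```python
# Illustrative guess-and-check (trial-and-error) solution of a word problem
# without variables; the example problem is invented, not from the study.

def consecutive_pair_with_sum(total):
    """Try first = 0, 1, 2, ... until first + (first + 1) hits the total."""
    for first in range(total):
        if first + (first + 1) == total:
            return first, first + 1
    return None  # no consecutive pair sums to an even total

print(consecutive_pair_with_sum(73))
```

The systematic search makes the strategy explicit: each guess is checked against the condition stated in words, with no symbolic manipulation required.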
Improving Search Properties in Genetic Programming
NASA Technical Reports Server (NTRS)
Janikow, Cezary Z.; DeWeese, Scott
1997-01-01
With advancing computer processing capabilities, practical computer applications are mostly limited by the amount of human programming required to accomplish a specific task. This necessary human participation creates many problems, such as dramatically increased cost. To alleviate the problem, computers must become more autonomous. In other words, computers must be capable of programming and reprogramming themselves to adapt to changing environments, tasks, demands, and domains. Evolutionary computation offers potential means, but it must be advanced beyond its current practical limitations. Evolutionary algorithms model nature. They maintain a population of structures representing potential solutions to the problem at hand. These structures undergo a simulated evolution by means of mutation, crossover, and a Darwinian selective pressure. Genetic programming (GP) is the most promising example of an evolutionary algorithm. In GP, the structures that evolve are trees, which is a dramatic departure from previously used representations such as strings in genetic algorithms. The space of potential trees is defined by means of their elements: functions, which label internal nodes, and terminals, which label leaves. By attaching semantic interpretation to those elements, trees can be interpreted as computer programs (given an interpreter), evolved architectures, etc. JSC has begun exploring GP as a potential tool for its long-term project on evolving dextrous robotic capabilities. Last year we identified representation redundancies as the primary source of inefficiency in GP. Subsequently, we proposed a method to use problem constraints to reduce those redundancies, effectively reducing GP complexity. This method was implemented afterwards at the University of Missouri. This summer, we have evaluated the payoff from using problem constraints to reduce search complexity on two classes of problems: learning boolean functions and solving the forward kinematics problem.
We have also developed and implemented methods to use additional problem heuristics to fine-tune the searchable space, and to use typing information to further reduce the search space. Additional improvements have been proposed, but they are yet to be explored and implemented.
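A toy-scale, hedged sketch of the GP machinery described above (not JSC's system): trees built from boolean functions and terminals, evaluated against a target function (here XOR, one of the "learning boolean functions" class), with a crude mutation-only evolutionary loop. All names and parameters are illustrative.

```python
# Hedged toy GP: evolve trees over {and, or, not, x0, x1} toward XOR.
import random

random.seed(1)
FUNCS = [("and", 2), ("or", 2), ("not", 1)]   # internal-node labels
TERMS = ["x0", "x1"]                          # leaf labels

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    name, arity = random.choice(FUNCS)
    return [name] + [random_tree(depth - 1) for _ in range(arity)]

def evaluate(tree, env):
    if isinstance(tree, str):
        return env[tree]
    if tree[0] == "not":
        return not evaluate(tree[1], env)
    a, b = evaluate(tree[1], env), evaluate(tree[2], env)
    return (a and b) if tree[0] == "and" else (a or b)

CASES = [({"x0": a, "x1": b}, a != b) for a in (False, True) for b in (False, True)]

def fitness(tree):
    """Number of XOR truth-table rows matched (max 4)."""
    return sum(evaluate(tree, env) == out for env, out in CASES)

def mutate(tree):
    return random_tree(3) if random.random() < 0.5 else tree

pop = [random_tree() for _ in range(50)]
for _ in range(100):                 # crude mutation-plus-elitism "evolution"
    best = max(pop, key=fitness)
    if fitness(best) == len(CASES):
        break
    pop = [mutate(best) for _ in range(50)]
```

A full GP system would add crossover and the constraint-based pruning of redundant representations that the summary describes; this sketch only shows the tree representation and selective pressure.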
ERIC Educational Resources Information Center
Eseryel, Deniz; Ge, Xun; Ifenthaler, Dirk; Law, Victor
2011-01-01
Following a design-based research framework, this article reports two empirical studies with an educational MMOG, called "McLarin's Adventures," on facilitating 9th-grade students' complex problem-solving skill acquisition in interdisciplinary STEM education. The article discusses the nature of complex and ill-structured problem solving…
ERIC Educational Resources Information Center
Goode, Natassia; Beckmann, Jens F.
2010-01-01
This study investigates the relationships between structural knowledge, control performance and fluid intelligence in a complex problem solving (CPS) task. 75 participants received either complete, partial or no information regarding the underlying structure of a complex problem solving task, and controlled the task to reach specific goals.…
Integrating Visualizations into Modeling NEST Simulations
Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.
2015-01-01
Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching to the visualization tool that assists with the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate different data modalities in order to investigate them effectively. Since a monolithic application that processes and visualizes all data modalities and reflects all combinations of possible workflows in a holistic way is most likely impossible to develop and maintain, a software architecture that offers specialized visualization tools which run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into their modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work. PMID:26733860
Pre-Service Teachers' Free and Structured Mathematical Problem Posing
ERIC Educational Resources Information Center
Silber, Steven; Cai, Jinfa
2017-01-01
This exploratory study examined how pre-service teachers (PSTs) pose mathematical problems for free and structured mathematical problem-posing conditions. It was hypothesized that PSTs would pose more complex mathematical problems under structured posing conditions, with increasing levels of complexity, than PSTs would pose under free posing…
A restricted Steiner tree problem is solved by Geometric Method II
NASA Astrophysics Data System (ADS)
Lin, Dazhi; Zhang, Youlin; Lu, Xiaoxu
2013-03-01
The minimum Steiner tree problem has a wide application background, including transportation systems, communication networks, pipeline design and VLSI, etc. Unfortunately, the computational complexity of the problem is NP-hard, so it is common to consider restricted special cases. In this paper, we first put forward a restricted Steiner tree problem in which the fixed vertices lie on the same side of a line L and we seek a vertex on L such that the length of the tree is minimal. From the definition and the complexity of the Steiner tree problem, we know that this restricted problem is also NP-complete. In Part I, we considered the restricted Steiner tree problem with two fixed vertices. Naturally, we now consider the restricted Steiner tree problem with three fixed vertices, and we again use the geometric method to solve the problem.
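For the two-fixed-vertex case referenced as Part I, the geometric method reduces to the classical reflection construction: reflect one vertex across L and intersect the connecting segment with L. The sketch below is a hedged illustration of that construction (with L taken as the x-axis and the coordinates invented), not the authors' treatment of the three-vertex case.

```python
# Hedged sketch of the reflection construction for two fixed vertices:
# the tree is just the shortest path A--P--B with P constrained to line L.

def best_point_on_axis(ax, ay, bx, by):
    """A = (ax, ay), B = (bx, by), both strictly above the x-axis (L).
    Returns the x-coordinate of the point P on L minimizing |AP| + |PB|."""
    # Reflect B across L to B' = (bx, -by); the straight segment A--B'
    # crosses y = 0 where the parameter t satisfies ay + t*(-by - ay) = 0:
    t = ay / (ay + by)
    return ax + t * (bx - ax)

print(best_point_on_axis(0, 1, 4, 3))
```

Since the straight segment A–B' is the shortest path between A and B', its crossing point with L yields the global minimum of |AP| + |PB|, which is why the construction solves the two-vertex restricted problem exactly.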
Remote control system for high-performance computer simulation of crystal growth by the PFC method
NASA Astrophysics Data System (ADS)
Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei
2017-04-01
Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems and other, often expensive, complex computer systems. Access to such resources is often limited, unstable and accompanied by various administrative problems. In addition, the variety of software and settings across different computing clusters sometimes does not allow researchers to use unified program code; the code has to be adapted for each configuration of the computing complex. The practical experience of the authors has shown that a special control system for computations, with the possibility of remote use, can greatly simplify the implementation of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.
Identification of Yeast V-ATPase Mutants by Western Blot Analysis of Whole Cell Lysates
NASA Astrophysics Data System (ADS)
Parra-Belky, Karlett
2002-11-01
A biochemistry laboratory was designed for an undergraduate course to help students better understand the link between molecular engineering and biochemistry. Students identified unknown yeast strains with high specificity using SDS-PAGE and Western blot analysis of whole cell lysates. This problem-solving exercise is a common application of biochemistry in biotechnology research. Three different strains were used: a wild type and two mutants for the proton pump vacuolar ATPase (V-ATPase). V-ATPases are multisubunit enzymes, and the mutants used were deletion mutants; each lacked one structural gene of the complex. After three three-hour labs, the students easily identified the mutant strains and distinguished them from wild-type cells by analyzing the SDS-PAGE distribution pattern of the proteins. Identifying different subunits of one multimeric protein allowed for discussion of the structure and function of this metabolic enzyme, which captured the interest of the students. The experiment can be adapted to other multimeric protein complexes and improves on previously reported methodology, perhaps because the problem and its solution are representative of the type of techniques currently used in research labs.
Hybrid DG/FV schemes for magnetohydrodynamics and relativistic hydrodynamics
NASA Astrophysics Data System (ADS)
Núñez-de la Rosa, Jonatan; Munz, Claus-Dieter
2018-01-01
This paper presents a high order hybrid discontinuous Galerkin/finite volume scheme for solving the equations of magnetohydrodynamics (MHD) and of special relativistic hydrodynamics (SRHD) on quadrilateral meshes. In this approach, the spatial discretization combines an arbitrary high order discontinuous Galerkin spectral element (DG) method with a finite volume (FV) scheme in order to simulate complex flow problems involving strong shocks. For the time discretization, a fourth order strong stability preserving Runge-Kutta method is used. In the proposed hybrid scheme, a shock indicator is computed at the beginning of each Runge-Kutta stage in order to flag those elements containing shock waves or discontinuities. The DG solution in these troubled elements at the current time step is then projected onto a subdomain composed of finite volume subcells. The DG operator is subsequently applied to the unflagged elements, which, in principle, are oscillation-free, while the troubled elements are evolved with a robust second/third order FV operator. With this approach we are able to numerically simulate very challenging problems in the context of MHD and SRHD in one and two space dimensions and with very high order polynomials. We present convergence tests and a comprehensive one- and two-dimensional testbench for both equation systems, focusing on problems with strong shocks. The presented hybrid approach shows that numerical schemes of very high order of accuracy are able to simulate these complex flow problems in an efficient and robust manner.
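The flagging step can be illustrated with a toy one-dimensional, TVB-minmod-style troubled-cell indicator acting on cell averages. This is a generic stand-in for demonstration, not the specific shock indicator used in the paper.

```python
def minmod(a, b, c):
    # Return the argument of smallest magnitude if all share one sign, else 0.
    if a > 0 and b > 0 and c > 0:
        return min(a, b, c)
    if a < 0 and b < 0 and c < 0:
        return max(a, b, c)
    return 0.0

def flag_troubled_cells(ubar, M=0.0, dx=1.0):
    # Flag cell i when the minmod-limited central slope differs from the
    # unlimited one; slopes below the TVB threshold M*dx^2 are never flagged.
    flags = []
    n = len(ubar)
    for i in range(n):
        dl = ubar[i] - ubar[(i - 1) % n]   # backward difference (periodic)
        dr = ubar[(i + 1) % n] - ubar[i]   # forward difference
        d = 0.5 * (dl + dr)                # central slope estimate
        if abs(d) <= M * dx * dx:
            flags.append(False)
            continue
        flags.append(abs(minmod(d, dl, dr) - d) > 1e-12)
    return flags

flags = flag_troubled_cells([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
```

Cells straddling the two (periodic) jumps are flagged and would be handed to the FV subcell operator, while smooth cells stay with the DG update.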
ERIC Educational Resources Information Center
Scherer, Ronny; Tiemann, Rudiger
2012-01-01
The ability to solve complex scientific problems is regarded as one of the key competencies in science education. Until now, research on problem solving focused on the relationship between analytical and complex problem solving, but rarely took into account the structure of problem-solving processes and metacognitive aspects. This paper,…
ERIC Educational Resources Information Center
Blackburn, J. Joey; Robinson, J. Shane
2016-01-01
The purpose of this experimental study was to assess the effects of cognitive style, problem complexity, and hypothesis generation on the problem solving ability of school-based agricultural education students. Problem solving ability was defined as time to solution. Kirton's Adaption-Innovation Inventory was employed to assess students' cognitive…
Multichromosomal median and halving problems under different genomic distances
Tannier, Eric; Zheng, Chunfang; Sankoff, David
2009-01-01
Background Genome median and genome halving are combinatorial optimization problems that aim at reconstructing ancestral genomes as well as the evolutionary events leading from the ancestor to extant species. Exploring complexity issues is a first step towards devising efficient algorithms. The complexity of the median problem for unichromosomal genomes (permutations) has been settled for both the breakpoint distance and the reversal distance. Although the multichromosomal case has often been assumed to be a simple generalization of the unichromosomal case, it is also a relaxation, so that complexity in this context does not follow from existing results and is open for all distances. Results We settle here the complexity of several genome median and halving problems, including a surprising polynomial result for the breakpoint median and guided halving problems in genomes with circular and linear chromosomes, showing that the multichromosomal problem is actually easier than the unichromosomal problem. Still other variants of these problems are NP-complete, including the DCJ double distance problem, previously mentioned as an open question. We list the remaining open problems. Conclusion This theoretical study clears up a wide swathe of the algorithmic study of genome rearrangements with multiple multichromosomal genomes. PMID:19386099
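For the unichromosomal case mentioned above, the breakpoint distance between two unsigned permutations is easy to compute: cap both with sentinels and count adjacencies of one genome that are absent (in either orientation) from the other. This sketch illustrates only the distance measure itself, not the median or halving algorithms of the paper.

```python
def breakpoint_distance(pi, sigma):
    # Count adjacencies of pi (extended with 0 and n+1 sentinels) that are
    # not adjacencies of sigma in either orientation.
    n = len(pi)
    ext_pi = [0] + list(pi) + [n + 1]
    ext_sigma = [0] + list(sigma) + [n + 1]
    adj = set()
    for a, b in zip(ext_sigma, ext_sigma[1:]):
        adj.add((a, b))
        adj.add((b, a))  # unsigned: orientation does not matter
    return sum(1 for a, b in zip(ext_pi, ext_pi[1:]) if (a, b) not in adj)
```

For example, [2, 1, 3] against the identity has the two breakpoints (0,2) and (1,3).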
NASA Astrophysics Data System (ADS)
Huang, Xingguo; Sun, Hui
2018-05-01
The Gaussian beam method is an important complex geometrical-optics technique for modeling seismic wave propagation and diffraction in subsurface regions with complex geological structure. Current methods for Gaussian beam modeling rely on dynamic ray tracing and evanescent wave tracking. However, the dynamic ray tracing method is based on the paraxial ray approximation, and the evanescent wave tracking method cannot describe strongly evanescent fields. This leads to inaccurate computed wave fields in regions with strongly inhomogeneous media. To address this problem, we compute Gaussian beam wave fields using the complex phase obtained by directly solving the complex eikonal equation. In this method, the fast marching method, which is widely used for phase calculation, is combined with a Gauss-Newton optimization algorithm to obtain the complex phase at the regular grid points. The main theoretical challenge in combining this method with Gaussian beam modeling is the irregular boundary near the curved central ray. To cope with this challenge, we present a non-uniform finite difference operator and a modified fast marching method. The numerical results confirm the validity of the proposed approach.
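The real-valued fast marching step that the authors extend can be sketched in its classical form: a Dijkstra-like sweep over the grid with the first-order upwind quadratic update. The complex-phase and Gauss-Newton extensions of the paper are not reproduced here; this is the standard textbook scheme on a uniform grid.

```python
import heapq

def fast_marching(speed, src, h=1.0):
    # First-arrival travel times T solving |grad T| = 1/speed on a uniform
    # grid via first-order upwind fast marching (Dijkstra-like ordering).
    ny, nx = len(speed), len(speed[0])
    INF = float("inf")
    T = [[INF] * nx for _ in range(ny)]
    T[src[0]][src[1]] = 0.0
    heap = [(0.0, src)]
    done = set()
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if (i, j) in done:
            continue  # stale heap entry
        done.add((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx and (ni, nj) not in done:
                tx = min(T[ni][nj + 1] if nj + 1 < nx else INF,
                         T[ni][nj - 1] if nj - 1 >= 0 else INF)
                ty = min(T[ni + 1][nj] if ni + 1 < ny else INF,
                         T[ni - 1][nj] if ni - 1 >= 0 else INF)
                a, b = sorted((tx, ty))
                f = h / speed[ni][nj]
                if b - a < f:  # two-sided quadratic update
                    t_new = 0.5 * (a + b + (2 * f * f - (b - a) ** 2) ** 0.5)
                else:          # one-sided update
                    t_new = a + f
                if t_new < T[ni][nj]:
                    T[ni][nj] = t_new
                    heapq.heappush(heap, (t_new, (ni, nj)))
    return T

speed = [[1.0] * 5 for _ in range(5)]   # homogeneous medium
T = fast_marching(speed, (0, 0))
```

In a unit-speed medium the arrival time along a grid axis equals the distance, while diagonal cells benefit from the two-sided update.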
Analysis and numerical modelling of eddy current damper for vibration problems
NASA Astrophysics Data System (ADS)
Irazu, L.; Elejabarrieta, M. J.
2018-07-01
This work discusses a contactless eddy current damper, which is used to attenuate structural vibration. Eddy currents can remove energy from dynamic systems without any contact and, thus, without adding mass or modifying the rigidity of the structure. An experimental modal analysis of a cantilever beam in the absence of and under a partial magnetic field is conducted in the 0-1 kHz bandwidth. The results show that the eddy current phenomenon can attenuate the vibration of the entire structure without modifying the natural frequencies or the mode shapes of the structure itself. In this study, a new inverse method to numerically determine the dynamic properties of the contactless eddy current damper is proposed. The proposed inverse method and the eddy current model based on a linear viscous force are validated by a practical application. The numerically obtained transfer function correlates with the experimental one, showing good agreement over the entire 0-1 kHz bandwidth. The proposed method provides an easy and quick tool to model and predict the dynamic behaviour of the contactless eddy current damper, thereby avoiding the use of complex analytical models.
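The linear viscous representation of the damper can be illustrated on a single-degree-of-freedom surrogate of one beam mode: the eddy current force enters only as extra viscous damping, which lowers the resonance peak while leaving the undamped natural frequency sqrt(k/m) untouched. The numeric values below are invented for illustration, not identified from the paper's beam.

```python
import math

def receptance(freq_hz, m, k, c_struct, c_eddy=0.0):
    # Displacement-per-force FRF of a single-DOF model; the contactless
    # eddy current damper is represented as a viscous force F = -c_eddy * v.
    w = 2 * math.pi * freq_hz
    return 1.0 / (k - m * w ** 2 + 1j * w * (c_struct + c_eddy))

m, k = 0.1, 1.0e4                      # kg, N/m (illustrative values)
fn = math.sqrt(k / m) / (2 * math.pi)  # undamped natural frequency, Hz
h0 = abs(receptance(fn, m, k, 0.5))                # without the damper
h1 = abs(receptance(fn, m, k, 0.5, c_eddy=2.0))    # with eddy damping
```

At resonance the magnitude is 1/(w*c), so adding c_eddy shrinks the peak; the stiffness and mass terms, and hence the natural frequency, are unchanged, matching the experimental observation above.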
Clinical Problem Analysis (CPA): A Systematic Approach To Teaching Complex Medical Problem Solving.
ERIC Educational Resources Information Center
Custers, Eugene J. F. M.; Robbe, Peter F. De Vries; Stuyt, Paul M. J.
2000-01-01
Discusses clinical problem analysis (CPA) in medical education, an approach to solving complex clinical problems. Outlines the five step CPA model and examines the value of CPA's content-independent (methodical) approach. Argues that teaching students to use CPA will enable them to avoid common diagnostic reasoning errors and pitfalls. Compares…
Predicting Development of Mathematical Word Problem Solving Across the Intermediate Grades
Tolar, Tammy D.; Fuchs, Lynn; Cirino, Paul T.; Fuchs, Douglas; Hamlett, Carol L.; Fletcher, Jack M.
2012-01-01
This study addressed predictors of the development of word problem solving (WPS) across the intermediate grades. At the beginning of 3rd grade, 4 cohorts of students (N = 261) were measured on computation, language, nonverbal reasoning skills, and attentive behavior and were assessed 4 times from the beginning of 3rd through the end of 5th grade on 2 measures of WPS at low and high levels of complexity. Language skills were related to initial performance at both levels of complexity and did not predict growth at either level. Computational skills had an effect on initial performance in low- but not high-complexity problems and did not predict growth at either level of complexity. Attentive behavior did not predict initial performance but did predict growth for low-complexity problems, whereas it predicted initial performance but not growth for high-complexity problems. Nonverbal reasoning predicted initial performance and growth for low-complexity WPS, but only growth for high-complexity WPS. This evidence suggests that although mathematical structure is fixed, different cognitive resources may act as limiting factors in WPS development when the WPS context is varied. PMID:23325985
Turbomachinery computational fluid dynamics: asymptotes and paradigm shifts.
Dawes, W N
2007-10-15
This paper reviews the development of computational fluid dynamics (CFD) specifically for turbomachinery simulations and with a particular focus on application to problems with complex geometry. The review is structured by considering this development as a series of paradigm shifts, followed by asymptotes. The original S1-S2 blade-blade-throughflow model is briefly described, followed by the development of two-dimensional then three-dimensional blade-blade analysis. This in turn evolved from inviscid to viscous analysis and then from steady to unsteady flow simulations. This development trajectory led over a surprisingly small number of years to an accepted approach: a 'CFD orthodoxy'. A very important current area of intense interest and activity in turbomachinery simulation is in accounting for real geometry effects, not just in the secondary air and turbine cooling systems but also associated with the primary path. The requirements here are threefold: capturing and representing these geometries in a computer model; making rapid design changes to these complex geometries; and managing the very large associated computational models on PC clusters. Accordingly, the challenges in the application of the current CFD orthodoxy to complex geometries are described in some detail. The main aim of this paper is to argue that the current CFD orthodoxy is on a new asymptote and is not in fact suited for application to complex geometries and that a paradigm shift must be sought. In particular, the new paradigm must be geometry centric and inherently parallel without serial bottlenecks. The main contribution of this paper is to describe such a potential paradigm shift, inspired by the animation industry, based on a fundamental shift in perspective from explicit to implicit geometry and then illustrate this with a number of applications to turbomachinery.
Complex Problem Solving in a Workplace Setting.
ERIC Educational Resources Information Center
Middleton, Howard
2002-01-01
Studied complex problem solving in the hospitality industry through interviews with six office staff members and managers. Findings show it is possible to construct a taxonomy of problem types and that the most common approach can be termed "trial and error." (SLD)
NASA Astrophysics Data System (ADS)
Ulrich, T.; Gabriel, A. A.
2016-12-01
The geometry of faults is subject to a large degree of uncertainty. Being buried structures that are not directly observable, their complex shapes may only be inferred from surface traces, if available, or through geophysical methods such as reflection seismology. As a consequence, most studies aiming at assessing the potential hazard of faults rely on idealized fault models based on observable large-scale features. Yet real faults are known to be wavy at all scales, their geometric features presenting similar statistical properties from the micro to the regional scale. The influence of roughness on the earthquake rupture process is currently a driving topic in the computational seismology community. From the numerical point of view, rough-fault problems are challenging: they require optimized codes able to run efficiently on high-performance computing infrastructure while simultaneously handling complex geometries. Physically, simulated ruptures hosted by rough faults appear much closer in complexity to source models inverted from observations. Incorporating fault geometry on all scales may thus be crucial to model realistic earthquake source processes and to estimate seismic hazard more accurately. In this study, we use the software package SeisSol, based on an ADER-Discontinuous Galerkin scheme, to run our numerical simulations. SeisSol solves the spontaneous dynamic earthquake rupture problem and the wave propagation problem with high-order accuracy in space and time, efficiently on large-scale machines. The influence of fault roughness on dynamic rupture style (e.g., onset of supershear transition, rupture front coherence, propagation of self-healing pulses) at different length scales is investigated by analyzing ruptures on faults of varying roughness spectral content. In particular, we investigate the existence of a minimum roughness length scale, in terms of rupture-inherent length scales, below which the rupture ceases to be sensitive to the roughness. Finally, the effect of fault geometry on near-field ground motions is considered. Our simulations feature classical linear slip weakening on the fault and a viscoplastic constitutive model off the fault. The benefits of using a more elaborate fast velocity-weakening friction law will also be considered.
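Self-affine fault roughness of the kind varied in this study can be generated by spectral synthesis: random phases under a power-law amplitude spectrum, with the Hurst exponent controlling the spectral content. The sketch below is a generic 1-D generator using a deliberately naive O(n^2) inverse DFT for self-containedness; it is not the geometry pipeline actually used with SeisSol.

```python
import cmath, math, random

def rough_profile(n=256, hurst=0.8, amp=1e-3, seed=0):
    # Self-affine 1-D fault profile by spectral synthesis: random phases
    # with a power-law amplitude spectrum ~ k^-(hurst + 0.5), conjugate
    # symmetry enforced so the inverse transform is real-valued.
    rng = random.Random(seed)
    spec = [0j] * n
    for k in range(1, n // 2):
        mag = k ** -(hurst + 0.5)
        phase = rng.uniform(0.0, 2.0 * math.pi)
        spec[k] = mag * cmath.exp(1j * phase)
        spec[n - k] = spec[k].conjugate()
    # Naive inverse DFT (O(n^2)); a real code would use an FFT.
    prof = [sum(spec[k] * cmath.exp(2j * math.pi * k * x / n)
                for k in range(n)).real for x in range(n)]
    peak = max(abs(v) for v in prof) or 1.0
    return [amp * v / peak for v in prof]  # normalize peak amplitude to amp

prof = rough_profile()
```

Lowering the Hurst exponent enriches the short-wavelength content, which is exactly the knob varied when studying rupture sensitivity to roughness scales.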
Processing time tolerance-based ACO algorithm for solving job-shop scheduling problem
NASA Astrophysics Data System (ADS)
Luo, Yabo; Waden, Yongo P.
2017-06-01
The Job Shop Scheduling Problem (JSSP) is an NP-hard problem whose uncertainty and complexity cannot be handled by linear methods. Current studies of the JSSP therefore concentrate mainly on improving heuristics for its optimization. However, obstacles to efficient optimization remain, namely low efficiency and poor reliability, which can easily trap the optimization process in local optima. To address this, this paper studies an Ant Colony Optimization (ACO) algorithm combined with constraint handling tactics. The work is subdivided into three parts: (1) analysis of processing time tolerance-based constraint features in the JSSP, performed with a constraint satisfaction model; (2) satisfaction of the constraints using consistency technology and a constraint spreading algorithm to improve the performance of the ACO algorithm, from which the JSSP model based on the improved ACO algorithm is constructed; (3) demonstration of the effectiveness of the proposed method, in terms of reliability and efficiency, through comparative experiments on benchmark problems. The results obtained by the proposed method are better, and the applied technique can be used in optimizing the JSSP.
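The ACO machinery itself, pheromone-weighted roulette construction followed by evaporation and reinforcement, can be sketched on a toy routing instance. The JSSP-specific constraint handling of the paper is not reproduced here, and all parameter values are illustrative.

```python
import math, random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def aco(cities, n_ants=20, n_iter=30, alpha=1.0, beta=3.0, rho=0.5, q=1.0, seed=0):
    # Minimal ACO loop: ants build tours with a pheromone/heuristic roulette
    # wheel, then pheromone evaporates and is reinforced along the best tour.
    rng = random.Random(seed)
    n = len(cities)
    dist = [[math.hypot(xi - xj, yi - yj) for xj, yj in cities] for xi, yi in cities]
    tau = [[1.0] * n for _ in range(n)]   # pheromone matrix
    best_tour, best_len = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            length = tour_length(tour, dist)
            if length < best_len:
                best_tour, best_len = tour, length
        tau = [[(1 - rho) * t for t in row] for row in tau]  # evaporation
        for i in range(n):                                   # reinforcement
            a, b = best_tour[i], best_tour[(i + 1) % n]
            tau[a][b] += q / best_len
            tau[b][a] += q / best_len
    return best_tour, best_len

cities = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]  # unit square
best_tour, best_len = aco(cities)
```

On the unit square the perimeter tour of length 4 is optimal, and the seeded run above recovers it. For a JSSP, the "cities" become operations and the construction step must additionally respect machine and precedence constraints.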
Insight and analysis problem solving in microbes to machines.
Clark, Kevin B
2015-11-01
A key feature for obtaining solutions to difficult problems, insight is oftentimes vaguely regarded as a special discontinuous intellectual process and/or a cognitive restructuring of problem representation or goal approach. However, this nearly century-old state of the art devised by the Gestalt tradition to explain the non-analytical or non-trial-and-error, goal-seeking aptitude of primate mentality tends to neglect problem-solving capabilities of lower animal phyla, Kingdoms other than Animalia, and advancing smart computational technologies built from biological, artificial, and composite media. Attempting to provide an inclusive, precise definition of insight, two major criteria of insight, discontinuous processing and problem restructuring, are here reframed using terminology and statistical mechanical properties of computational complexity classes. Discontinuous processing becomes abrupt state transitions in algorithmic/heuristic outcomes or in types of algorithms/heuristics executed by agents using classical and/or quantum computational models. And problem restructuring becomes combinatorial reorganization of resources, problem-type substitution, and/or exchange of computational models. With insight bounded by computational complexity, humans, ciliated protozoa, and complex technological networks, for example, show insight when restructuring time requirements, combinatorial complexity, and problem type to solve polynomial and nondeterministic polynomial decision problems. Similar effects are expected from other problem types, supporting the idea that insight might be an epiphenomenon of analytical problem solving and consequently a larger information processing framework. Thus, this computational complexity definition of insight improves the power, external and internal validity, and reliability of operational parameters with which to classify, investigate, and produce the phenomenon for computational agents ranging from microbes to man-made devices.
Copyright © 2015 Elsevier Ltd. All rights reserved.
Lusby, Richard Martin; Schwierz, Martin; Range, Troels Martin; Larsen, Jesper
2016-11-01
The aim of this paper is to provide an improved method for solving the so-called dynamic patient admission scheduling (DPAS) problem. This is a complex scheduling problem that involves assigning a set of patients to hospital beds over a given time horizon in such a way that several quality measures reflecting patient comfort and treatment efficiency are maximized. Consideration must be given to uncertainty in the length of stays of patients as well as the possibility of emergency patients. We develop an adaptive large neighborhood search (ALNS) procedure to solve the problem. This procedure utilizes a Simulated Annealing framework. We thoroughly test the performance of the proposed ALNS approach on a set of 450 publicly available problem instances. A comparison with the current state-of-the-art indicates that the proposed methodology provides solutions that are of comparable quality for small and medium sized instances (up to 1000 patients); the two approaches provide solutions that differ in quality by approximately 1% on average. The ALNS procedure does, however, provide solutions in a much shorter time frame. On larger instances (between 1000 and 4000 patients) the improvement in solution quality by the ALNS procedure is substantial, approximately 3-14% on average, and as much as 22% on a single instance. The time taken to find such results is, however, in the worst case, a factor of 12 longer on average than the time limit granted to the current state-of-the-art. The proposed ALNS procedure is an efficient and flexible method for solving the DPAS problem. Copyright © 2016 Elsevier B.V. All rights reserved.
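The ALNS skeleton, destroy and repair operators chosen by adaptive weights under a Simulated Annealing acceptance rule, is sketched below on a deliberately small stand-in objective (single-machine total completion time) rather than the full DPAS model. Operator choices and parameters are illustrative assumptions.

```python
import math, random

def total_completion(seq, p):
    # Sum of job completion times on one machine for sequence seq.
    t = total = 0
    for j in seq:
        t += p[j]
        total += t
    return total

def alns(p, n_iter=300, seed=1):
    # ALNS skeleton: pick a destroy operator by adaptive weight, repair by
    # greedy best-position insertion, accept via Simulated Annealing, and
    # reward operators that discover a new best solution.
    rng = random.Random(seed)
    n = len(p)
    cur = list(range(n))
    best = cur[:]
    destroyers = [
        lambda s: rng.sample(s, 2),               # remove 2 random jobs
        lambda s: rng.sample(s, min(4, len(s))),  # remove up to 4 random jobs
    ]
    weights = [1.0] * len(destroyers)
    T = 10.0
    for _ in range(n_iter):
        k = rng.choices(range(len(destroyers)), weights=weights)[0]
        removed = set(destroyers[k](cur))
        partial = [j for j in cur if j not in removed]
        for j in removed:  # greedy repair: cheapest insertion position
            pos = min(range(len(partial) + 1),
                      key=lambda i: total_completion(partial[:i] + [j] + partial[i:], p))
            partial.insert(pos, j)
        d = total_completion(partial, p) - total_completion(cur, p)
        if d < 0 or rng.random() < math.exp(-d / T):  # SA acceptance
            cur = partial
        if total_completion(cur, p) < total_completion(best, p):
            best = cur[:]
            weights[k] += 1.0  # adaptive reward for the successful operator
        T *= 0.99  # cooling
    return best

p = [5, 3, 8, 1, 9, 2]   # processing times of a toy instance
best = alns(p)
```

In the real DPAS setting the destroy operators would unassign patients and the repair operators would re-place them in beds, but the control loop is identical.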
Airway Problems in Neonates—A Review of the Current Investigation and Management Strategies
Mok, Quen
2017-01-01
Airway problems in the neonatal population are often life threatening and raise challenging issues in diagnosis and management. The airway problems can result from congenital or acquired lesions and can be broadly classified into those causing obstruction or those due to an abnormal “communication” in the airway. Many different investigations are now available to identify the diagnosis and quantify the severity of the problem, and these tests can be simple or invasive. Bronchography and bronchoscopy are essential to determine the extent and severity of the airway problem and to plan treatment strategy. Further imaging techniques help to delineate other commonly associated abnormalities. Echocardiography is also important to confirm any associated cardiac abnormality. In this review, the merits and disadvantages of the various investigations now available to the clinician will be discussed. The current therapeutic strategies are discussed, and the review will focus on the most challenging conditions that cause the biggest management conundrums, specifically laryngotracheal cleft, congenital tracheal stenosis, and tracheobronchomalacia. Management of acquired stenosis secondary to airway injury from endotracheal intubation will also be discussed as this is a common problem. Slide tracheoplasty is the preferred surgical option for long-segment tracheal stenosis, and results have improved significantly. Stents are occasionally required for residual or recurrent stenosis following surgical repair. There is sufficient evidence that a multidisciplinary team approach for managing complex airway issues provides the best results for the patient. There is ongoing progress in the field with newer diagnostic tools as well as development of innovative management techniques, such as biodegradable stents and stem cell-based tracheal transplants, leading to a much better prognosis for these children in the future. PMID:28424763
Computational Analysis of Static and Dynamic Behaviour of Magnetic Suspensions and Magnetic Bearings
NASA Technical Reports Server (NTRS)
Britcher, Colin P. (Editor); Groom, Nelson J.
1996-01-01
Static modelling of magnetic bearings is often carried out using magnetic circuit theory. This theory cannot easily include nonlinear effects such as magnetic saturation or the fringing of flux in air-gaps. Modern computational tools are able to accurately model complex magnetic bearing geometries, provided some care is exercised. In magnetic suspension applications, the magnetic fields are highly three-dimensional and require computational tools for the solution of most problems of interest. The dynamics of a magnetic bearing or magnetic suspension system can be strongly affected by eddy currents. Eddy currents are present whenever a time-varying magnetic flux penetrates a conducting medium. The direction of flow of the eddy current is such as to reduce the rate-of-change of flux. Analytic solutions for eddy currents are available for some simplified geometries, but complex geometries must be solved by computation. It is only in recent years that such computations have been considered truly practical. At NASA Langley Research Center, state-of-the-art finite-element computer codes, 'OPERA', 'TOSCA' and 'ELEKTRA', have recently been installed and applied to magnetostatic and eddy current problems. This paper reviews results of theoretical analyses which suggest general forms of mathematical models for eddy currents, together with computational results. A simplified circuit-based eddy current model is proposed, which appears to predict the observed trends in the case of large eddy current circuits in conducting non-magnetic material. A much more difficult case is seen to be that of eddy currents in magnetic material, or in non-magnetic material at higher frequencies, due to the lower skin depths. Even here, the dissipative behavior has been shown to yield at least somewhat to linear modelling.
Magnetostatic and eddy current computations have been carried out relating to the Annular Suspension and Pointing System, a prototype for a space payload pointing and vibration isolation system, where the magnetic actuator geometry resembles a conventional magnetic bearing. Magnetostatic computations provide estimates of flux density within airgaps and the iron core material, fringing at the pole faces and the net force generated. Eddy current computations provide coil inductance, power dissipation and the phase lag in the magnetic field, all as functions of excitation frequency. Here, the dynamics of the magnetic bearings, notably the rise time of forces with changing currents, are found to be very strongly affected by eddy currents, even at quite low frequencies. Results are also compared to experimental measurements of the performance of a large-gap magnetic suspension system, the Large Angle Magnetic Suspension Test Fixture (LAMSTF). Eddy current effects are again shown to significantly affect the dynamics of the system. Some consideration is given to the ease and accuracy of computation, specifically relating to OPERA/TOSCA/ELEKTRA.
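The circuit-based eddy current modelling mentioned above can be illustrated as a drive coil coupled through a mutual inductance to a shorted secondary loop representing the eddy current path: with rising frequency the driving-point impedance shifts from inductive toward resistive, which is the mechanism behind the force rise-time and phase-lag effects described. All component values below are invented for illustration.

```python
import math

def coil_with_eddy(w, r1, l1, r2, l2, m):
    # Transformer-style lumped model: drive coil (r1, l1) coupled via mutual
    # inductance m to a shorted "eddy" loop (r2, l2). Returns the driving-point
    # impedance Z(jw) = r1 + jwL1 + (wM)^2 / (r2 + jwL2).
    return r1 + 1j * w * l1 + (w * m) ** 2 / (r2 + 1j * w * l2)

r1, l1 = 1.0, 0.1          # drive coil resistance (ohm), inductance (H)
r2, l2, m = 0.5, 0.05, 0.04  # eddy loop parameters, mutual inductance
z_lo = coil_with_eddy(2 * math.pi * 1.0, r1, l1, r2, l2, m)
z_hi = coil_with_eddy(2 * math.pi * 200.0, r1, l1, r2, l2, m)
leff_lo = z_lo.imag / (2 * math.pi * 1.0)    # effective inductance at 1 Hz
leff_hi = z_hi.imag / (2 * math.pi * 200.0)  # effective inductance at 200 Hz
```

As frequency grows the eddy loop shields flux, so the effective inductance falls toward l1 - m*m/l2 while the effective resistance rises, i.e. the field increasingly lags and dissipates.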
Moe, Aubrey M; Breitborde, Nicholas J K; Bourassa, Kyle J; Gallagher, Colin J; Shakeel, Mohammed K; Docherty, Nancy M
2018-06-01
Schizophrenia researchers have focused on phenomenological aspects of the disorder to better understand its underlying nature. In particular, the development of personal narratives, that is, the complexity with which people form, organize, and articulate their "life stories," has recently been investigated in individuals with schizophrenia. However, less is known about how aspects of narrative relate to indicators of neurocognitive and social functioning. The objective of the present study was to investigate the association of linguistic complexity of life-story narratives to measures of cognitive and social problem-solving abilities among people with schizophrenia. Thirty-two individuals with a diagnosis of schizophrenia completed a research battery consisting of clinical interviews, a life-story narrative, neurocognitive testing, and a measure assessing multiple aspects of social problem solving. Narrative interviews were assessed for linguistic complexity using computerized technology. The results indicate differential relationships of linguistic complexity and neurocognition to domains of social problem-solving skills. More specifically, although neurocognition predicted how well one could both describe and enact a solution to a social problem, linguistic complexity alone was associated with accurately recognizing that a social problem had occurred. In addition, linguistic complexity appears to be a cognitive factor that is discernible from other broader measures of neurocognition. Linguistic complexity may be more relevant in understanding earlier steps of the social problem-solving process than more traditional, broad measures of cognition, and thus is relevant in conceptualizing treatment targets. These findings also support the relevance of developing narrative-focused psychotherapies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Variational Integration for Ideal Magnetohydrodynamics and Formation of Current Singularities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Yao
Coronal heating has been a long-standing conundrum in solar physics. Parker's conjecture that spontaneous current singularities lead to nanoflares that heat the corona has been controversial. In ideal magnetohydrodynamics (MHD), can genuine current singularities emerge from a smooth 3D line-tied magnetic field? To numerically resolve this issue, the schemes employed must preserve magnetic topology exactly to avoid artificial reconnection in the presence of (nearly) singular current densities. Structure-preserving numerical methods are favorable for mitigating numerical dissipation, and variational integration is a powerful machinery for deriving them. However, successful applications of variational integration to ideal MHD have been scarce. In this thesis, we develop variational integrators for ideal MHD in Lagrangian labeling by discretizing Newcomb's Lagrangian on a moving mesh using discretized exterior calculus. With the built-in frozen-in equation, the schemes are free of artificial reconnection, hence optimal for studying current singularity formation. Using this method, we first study a fundamental prototype problem in 2D, the Hahm-Kulsrud-Taylor (HKT) problem. It considers the effect of boundary perturbations on a 2D plasma magnetized by a sheared field, and its linear solution is singular. We find that with increasing resolution, the nonlinear solution converges to one with a current singularity. The same signature of current singularity is also identified in other 2D cases with more complex magnetic topologies, such as the coalescence instability of magnetic islands. We then extend the HKT problem to 3D line-tied geometry, which models the solar corona by anchoring the field lines in the boundaries. The effect of such geometry is crucial in the controversy over Parker's conjecture. The linear solution, which is singular in 2D, is found to be smooth. However, with finite amplitude, it can become pathological above a critical system length.
The nonlinear solution turns out smooth for short systems. Nonetheless, the scaling of peak current density vs. system length suggests that the nonlinear solution may become singular at a finite length. With the results in hand, we cannot confirm or rule out this possibility conclusively, since we cannot obtain solutions with system lengths near the extrapolated critical value.
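The appeal of variational integration can be seen in miniature on a harmonic oscillator: Stormer-Verlet arises from discretizing the action rather than the equations of motion, so it preserves the symplectic structure and its energy error stays bounded over arbitrarily long runs instead of drifting. This toy is only an analogy for the moving-mesh MHD integrators of the thesis.

```python
import math

def verlet(q0, p0, dt, n_steps, omega=1.0):
    # Stormer-Verlet (kick-drift-kick), the classic variational integrator,
    # applied to L = p^2/2 - omega^2 q^2 / 2. Being symplectic, it shows
    # bounded energy oscillation rather than secular drift.
    q, p = q0, p0
    for _ in range(n_steps):
        p -= 0.5 * dt * omega ** 2 * q   # half kick
        q += dt * p                      # drift
        p -= 0.5 * dt * omega ** 2 * q   # half kick
    return q, p

q, p = verlet(1.0, 0.0, dt=0.05, n_steps=20000)
energy = 0.5 * p * p + 0.5 * q * q   # exact value is 0.5 for all time
```

After 20000 steps (1000 time units) the energy error remains at the O(dt^2) level, which is the discrete analogue of the exact conservation properties the thesis builds into ideal MHD.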
Xavier, Pascal; Rauly, Dominique; Chamberod, Eric; Martins, Jean M F
2017-04-01
In this work, the problem of intracellular currents in longilinear bacteria, such as Escherichia coli, suspended in a physiological medium and submitted to a harmonic voltage (AC), is analyzed using the Finite-Element-based software COMSOL Multiphysics. Bacterium was modeled as a cylindrical capsule, ended by semi-spheres and surrounded by a dielectric cell wall. An equivalent single-layer cell wall was defined, starting from the well-recognized three-shell modeling approach. The bacterium was considered immersed in a physiological medium, which was also taken into account in the modeling. A new complex transconductance was thus introduced, relating the complex ratio between current inside the bacterium and voltage applied between two parallel equipotential planes, separated by a realistic distance. When voltage was applied longitudinally relative to the bacterium main axis, numerical results in terms of frequency response in the 1-20 MHz range for E. coli cells revealed that transconductance magnitude exhibited a maximum at a frequency depending on the cell wall capacitance. This occurred in spite of the purely passive character of the model and could be explained by an equivalent electrical network giving very similar results and showing special conditions for lateral paths of the currents through the cell wall. It is shown that the main contribution to this behavior is due to the conductive part of the current. Bioelectromagnetics. 38:213-219, 2017. © 2016 Wiley Periodicals, Inc.
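A drastically simplified lumped version of the single-shell picture treats the intracellular path as the cytoplasm resistance in series with the two wall capacitances. It reproduces the capacitance-controlled rise of internal current with frequency, though not the lateral-path maximum found in the full finite-element model; all component values are illustrative only.

```python
import math

def internal_current(f, v=1.0, r_cyt=1e6, c_wall=1e-14):
    # Lumped sketch: cytoplasm resistance in series with the two wall
    # capacitances. The wall blocks DC, so the intracellular current is
    # high-pass, with a corner frequency set by the wall capacitance.
    w = 2 * math.pi * f
    z = r_cyt + 2.0 / (1j * w * c_wall)   # series impedance of the path
    return abs(v / z)

i_lo = internal_current(1e6)   # 1 MHz: wall capacitance dominates
i_hi = internal_current(2e7)   # 20 MHz: current approaches v / r_cyt
```

The magnitude grows monotonically with frequency toward the resistive limit v/r_cyt; reproducing the reported maximum requires the lateral wall paths of the full equivalent network.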
Grossi, Enzo
2006-05-03
In recent years a number of algorithms for cardiovascular risk assessment have been proposed to the medical community. These algorithms consider a number of variables and express their results as the percentage risk of developing a major fatal or non-fatal cardiovascular event in the following 10 to 20 years. The author has identified three major pitfalls of these algorithms, linked to the limitations of the classical statistical approach in dealing with this kind of nonlinear and complex information. The pitfalls are the inability to capture the disease complexity, the inability to capture process dynamics, and the wide confidence interval of individual risk assessment. Artificial intelligence tools can provide a potential advantage in trying to overcome these limitations. The theoretical background and some application examples related to artificial neural networks and fuzzy logic are reviewed and discussed. The use of predictive algorithms to assess individual absolute risk of future cardiovascular events is currently hampered by methodological and mathematical flaws. Newer approaches linked to artificial intelligence, such as fuzzy logic and artificial neural networks, seem better suited to address both the increasing complexity arising from correlations among predisposing factors and occurrence data, and the prediction of future events at the individual level.
NASA Astrophysics Data System (ADS)
Likun, Wang; Weili, Li; Yi, Xue; Chunwei, Guan
2013-11-01
A significant problem in turbogenerators with complex end structures is overheating of local parts caused by end losses in the end region. It is therefore important to investigate the 3-D magnetic field and eddy current loss in the end region. In the end region of a large operating turbogenerator at a thermal power plant, the magnetic leakage field distribution is complex. In this paper, a 3-D mathematical model for calculating the electromagnetic field in the end region of large turbogenerators is given. The influence of the spatial locations of end structures and of the actual shape and material of the end windings, clamping plate, and copper screen is considered. Adopting the time-step finite element (FE) method and taking the nonlinear characteristics of the core into consideration, a 3-D transient magnetic field is calculated. The objective of this paper is to investigate the influence of clamping plate permeability and metal screen structures on the 3-D electromagnetic field distribution and eddy current loss in the end region of a turbogenerator. To reduce the temperature of the copper screen, a hollow metal screen is proposed. The eddy current loss obtained from the 3-D transient magnetic field is used as the heat source for the thermal field of the end region. The calculated temperatures are compared with test data.
Pacemakers and implantable cardioverter-defibrillators in pediatric patients.
Silka, Michael J; Bar-Cohen, Yaniv
2006-11-01
The use of pacemakers and implantable cardioverter-defibrillators (ICDs) in infants, children, and patients with congenital heart disease presents unique challenges and considerations. They include uncommon indications for device implantation, innovative approaches to lead implantation and configuration, and age-dependent and disease-specific aspects of device programming. In this review, the current indications for pacemaker and ICD implantation in young patients are discussed, followed by consideration of the approaches to lead and device placement in very small patients and those with complex congenital heart disease, in whom unique problems may be encountered. The limitations of programmability of current pacemakers and ICDs when used in young patients are discussed, followed by an analysis of long-term device follow-up and potential late complications.
Why the Lack of Academic Literature on Export Controls?
NASA Technical Reports Server (NTRS)
Kremic, Tibor
2001-01-01
Export controls are currently a relevant and dynamic topic. Given the growth of global operations and the high-tech nature of many products, an increase in awareness and understanding of the impacts of export controls is necessary. A structured approach to export controls has been in existence since 1949. Despite over 50 years of history, surprisingly little academic research and literature exists on the topic. This paper explores the current export control environment and possible reasons for the limited academic interest. Five possible reasons are discussed: (1) the dynamic nature of the topic; (2) difficulty in ensuring accurate data; (3) complexity of the problem; (4) relatively small economic impact; and (5) sensitive information. A research approach is recommended that considers these potential obstacles.
Boonen, Anton J. H.; de Koning, Björn B.; Jolles, Jelle; van der Schoot, Menno
2016-01-01
Successfully solving mathematical word problems requires both mental representation skills and reading comprehension skills. In Realistic Math Education (RME), however, students primarily learn to apply the first of these skills (i.e., representational skills) in the context of word problem solving. Given this, it seems legitimate to assume that students from an RME curriculum experience difficulties when asked to solve semantically complex word problems. We investigated this assumption among 80 sixth-grade students who were classified as successful or less successful word problem solvers based on a standardized mathematics test. To this end, students completed word problems that call on both mental representation skills and reading comprehension skills. The results showed that even successful word problem solvers performed poorly on semantically complex word problems, despite adequate performance on semantically less complex word problems. Based on this study, we conclude that reading comprehension skills should be given a (more) prominent role during word problem solving instruction in RME. PMID:26925012
Poot, Antonius J.; den Elzen, Wendy P. J.; Blom, Jeanet W.; Gussekloo, Jacobijn
2014-01-01
Background: Satisfaction is widely used to evaluate and direct delivery of medical care; a complicated relationship exists between patient satisfaction, morbidity and age. This study investigates the relationships between complexity of health problems and level of patient satisfaction of older persons with their general practitioner (GP) and practice. Methods and Findings: This study is embedded in the ISCOPE (Integrated Systematic Care for Older Persons) study. Enlisted patients aged ≥75 years from 59 practices received a written questionnaire to screen for complex health problems (somatic, functional, psychological and social). For 2664 randomly chosen respondents (median age 82 years; 68% female) information was collected on level of satisfaction (satisfied, neutral, dissatisfied) with their GP and general practice, and on demographic and clinical characteristics including complexity of health problems. Of all participants, 4% were dissatisfied with their GP care, 59% neutral and 37% satisfied. Between these three categories no differences were observed in age, gender, country of birth or education level. The percentage of participants dissatisfied with their GP care increased from 0.4% in those with 0 problem domains to 8% in those with 4 domains, i.e. those having complex health problems (p<0.001). Per additional health domain with problems, the risk of being dissatisfied increased 1.7 times (95% CI 1.4–2.14; p<0.001). This was independent of age, gender, and demographic and clinical parameters (adjusted OR 1.4, 95% CI 1.1–1.8; p = 0.021). Conclusion: In older persons, dissatisfaction with general practice is strongly correlated with rising complexity of health problems, independent of age and demographic and clinical parameters. It remains unclear whether complexity of health problems is a patient characteristic influencing the perception of care, or whether the care is unable to handle the demands of these patients.
Prospective studies are needed to investigate the causal associations between care organization, patient characteristics, indicators of quality, and patient perceptions. PMID:24710557
Norberg, Melissa M; Ham, Lindsay S; Olivier, Jake; Zamboanga, Byron L; Melkonian, Alexander; Fugitt, Jessica L
2016-07-02
Pregaming is a high-risk drinking behavior associated with increased alcohol consumption and alcohol-related problems. Quantity of alcohol consumed does not fully explain the level of problems associated with pregaming; yet, limited research has examined factors that may interact with pregaming behavior to contribute to the experience of alcohol-related problems. The current study examined whether the use of two emotion regulation strategies influences pregaming's contribution to alcohol-related problems. Undergraduates (N = 1857) aged 18-25 years attending 19 different colleges completed an online survey in 2008-2009. Linear mixed models were used to test whether emotion regulation strategies moderate the association between pregaming status (pregamers vs. non/infrequent pregamers) and alcohol-related problems, controlling for alcohol consumption, demographic covariates, and site as a random effect. Greater use of cognitive reappraisal was associated with fewer alcohol problems. Expressive suppression interacted with pregaming status: there was no relationship between pregaming status and alcohol problems for students who rarely used expressive suppression, whereas the relationship was statistically significant for students who occasionally to frequently used expressive suppression. Findings suggest that the relationship between pregaming and alcohol-related problems is complex. Accordingly, future studies should utilize event-level methodology to understand how emotion regulation strategies influence alcohol-related problems. Further, clinicians should tailor alcohol treatments to help students increase their use of cognitive reappraisal and decrease their use of suppression.
NASA Technical Reports Server (NTRS)
Schunk, Richard Gregory; Chung, T. J.
2001-01-01
A parallelized version of the Flowfield Dependent Variation (FDV) Method is developed to analyze a problem of current research interest, the flowfield resulting from a triple shock/boundary layer interaction. Such flowfields are often encountered in the inlets of high-speed air-breathing vehicles, including the NASA Hyper-X research vehicle. In order to resolve the complex shock structure and to provide adequate resolution for boundary layer computations of the convective heat transfer from surfaces inside the inlet, models containing over 500,000 nodes are needed. Efficient parallelization of the computation is essential to achieving results in a timely manner. Results from a parallelization scheme based upon multi-threading, as implemented on multiple-processor supercomputers and workstations, are presented.
Tabu Search enhances network robustness under targeted attacks
NASA Astrophysics Data System (ADS)
Sun, Shi-wen; Ma, Yi-lin; Li, Rui-qi; Wang, Li; Xia, Cheng-yi
2016-03-01
We focus on the optimization of network robustness with respect to intentional attacks on high-degree nodes. Given an existing network, this problem can be considered a typical single-objective combinatorial optimization problem. Based on the heuristic Tabu Search optimization algorithm, a link-rewiring method is applied to reconstruct the network while keeping the degree of every node unchanged. Through numerical simulations, a BA scale-free network and two real-world networks are investigated to verify the effectiveness of the proposed optimization method. Meanwhile, we analyze how the optimization affects other topological properties of the networks, including natural connectivity, clustering coefficient and degree-degree correlation. The current results can help to improve the robustness of existing complex real-world systems, as well as provide some insights into the design of robust networks.
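A minimal sketch of the approach described above, assuming a Schneider-style robustness measure R (mean giant-component fraction under repeated removal of the currently highest-degree node) and a simple tabu list over recently tried swaps; the paper's exact move evaluation and parameter settings are not reproduced here.

```python
import random
from collections import deque

def build_adj(edges):
    """Adjacency sets from an iterable of 2-element edges."""
    adj = {}
    for e in edges:
        a, b = tuple(e)
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

def giant_component(alive, adj):
    """Size of the largest connected component restricted to `alive` nodes (BFS)."""
    seen, best = set(), 0
    for s in alive:
        if s in seen:
            continue
        seen.add(s)
        size, queue = 0, deque([s])
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj.get(u, ()):
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

def robustness(adj):
    """Schneider-style R: mean giant-component fraction while deleting the
    currently highest-degree node one at a time (targeted attack)."""
    n = len(adj)
    alive = set(adj)
    total = 0.0
    for _ in range(n - 1):
        target = max(alive, key=lambda u: len(adj[u] & alive))
        alive.discard(target)
        total += giant_component(alive, adj) / n
    return total / n

def tabu_rewire(edges, iters=50, tabu_len=10, seed=0):
    """Degree-preserving double-edge swaps (a,b),(c,d) -> (a,c),(b,d), guided by
    a short tabu list; a candidate swap is kept only if it raises R."""
    rng = random.Random(seed)
    edges = {frozenset(e) for e in edges}
    best_r = robustness(build_adj(edges))
    tabu = deque(maxlen=tabu_len)
    for _ in range(iters):
        e1, e2 = rng.sample(sorted(tuple(sorted(e)) for e in edges), 2)
        (a, b), (c, d) = e1, e2
        new1, new2 = frozenset((a, c)), frozenset((b, d))
        if len(new1) < 2 or len(new2) < 2:      # would create a self-loop
            continue
        if new1 in edges or new2 in edges:      # would create a multi-edge
            continue
        move = (frozenset(e1), frozenset(e2))
        if move in tabu:
            continue
        trial = (edges - {frozenset(e1), frozenset(e2)}) | {new1, new2}
        r = robustness(build_adj(trial))
        if r > best_r:
            edges, best_r = trial, r
        tabu.append(move)
    return edges, best_r
```

Because only improving, non-tabu swaps are accepted, the degree sequence is invariant and R never decreases during the search.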
Dueck, Alexander; Berger, Christoph; Wunsch, Katharina; Thome, Johannes; Cohrs, Stefan; Reis, Olaf; Haessler, Frank
2017-02-01
A more recent branch of research describes the importance of sleep problems in the development and treatment of mental disorders in children and adolescents, such as attention-deficit hyperactivity disorder (ADHD) and mood disorders (MD). Research about clock genes has continued since 2012 with a focus on metabolic processes within all parts of the mammalian body, but particularly within different cerebral regions. Research has focused on complex regulatory circuits involving clock genes themselves and their influence on circadian rhythms of diverse body functions. Current publications on basic research in human and animal models indicate directions for the treatment of mental disorders targeting circadian rhythms and mechanisms. The most significant lines of research are described in this paper.
Auto Draw from Excel Input Files
NASA Technical Reports Server (NTRS)
Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.
2011-01-01
The design process often involves the use of Excel files during project development. To facilitate communication of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated frequently to reflect new input. The problem is that the drawings often lag behind the updates, leading to confusion about the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. More frequent updates of system diagrams can reduce confusion and errors, and are likely to uncover systemic problems earlier in the design cycle, thus reducing rework and redesign.
NASA Astrophysics Data System (ADS)
Bog, Tino; Zander, Nils; Kollmannsberger, Stefan; Rank, Ernst
2018-04-01
The finite cell method (FCM) is a fictitious domain approach that greatly simplifies simulations involving complex structures. Recently, the FCM has been applied to contact problems. The current study continues in this field by extending the concept of weakly enforced boundary conditions to inequality constraints for frictionless contact. Furthermore, it formalizes an approach that automatically recovers high-order contact surfaces of (implicitly defined) embedded geometries by means of an extended Marching Cubes algorithm. To further improve the accuracy of the discretization, irregularities at the boundary of contact zones are treated with multi-level hp-refinements. Numerical results and a systematic study of h-, p- and hp-refinements show that the FCM can efficiently provide accurate results for problems involving contact.
Dynamics of tokamak plasma surface current in 3D ideal MHD model
NASA Astrophysics Data System (ADS)
Galkin, Sergei A.; Svidzinski, V. A.; Zakharov, L. E.
2013-10-01
Interest in the surface current which can arise on a perturbed sharp plasma-vacuum interface in tokamaks was recently generated by a few papers (see and references therein). In dangerous disruption events with plasma-touching-wall scenarios, the surface current can be shared with the wall, leading to strong, damaging forces acting on the wall. A relatively simple analytic definition of the δ-function surface current, proportional to the jump of the tangential component of the magnetic field, nevertheless leads to a complex computational problem on the moving plasma-vacuum interface, requiring the incorporation of nonlinear 3D plasma dynamics even in one-fluid ideal MHD. The Disruption Simulation Code (DSC), which had recently been developed in a fully 3D toroidal geometry with adaptation to the moving plasma boundary, is an appropriate tool for accurate self-consistent δ-function surface current calculation. Progress on the DSC-3D development will be presented. Self-consistent surface current calculation under the nonlinear dynamics of low-m kink modes and vertical displacement events (VDEs) will be discussed. Work is supported by the US DOE SBIR grant #DE-SC0004487.
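The "jump of the tangential component of the magnetic field" defining the δ-function surface current is, in standard ideal-MHD notation (sign convention and normalization may differ from the DSC papers):

```latex
\mathbf{K} \;=\; \frac{1}{\mu_0}\,\hat{\mathbf{n}}\times\left[\mathbf{B}\right],
\qquad
\left[\mathbf{B}\right] \equiv \mathbf{B}_{\mathrm{vac}}-\mathbf{B}_{\mathrm{plasma}},
```

where the unit normal points from the plasma into the vacuum; only the tangential part of the jump contributes to the surface current density K.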
Growth and Development of Three-Dimensional Plant Form.
Whitewoods, Christopher D; Coen, Enrico
2017-09-11
Plants can generate a spectacular array of complex shapes, many of which exhibit elaborate curvature in three dimensions, illustrated for example by orchid flowers and pitcher-plant traps. All of these structures arise through differential growth. Recent findings provide fresh mechanistic insights into how regional cell behaviours may lead to tissue deformations, including anisotropies and curvatures, which shape growing volumes and sheets of cells. Here we review our current understanding of how genes, growth, mechanics, and evolution interact to generate diverse structures. We illustrate problems and approaches with the complex three-dimensional trap of the bladderwort, Utricularia gibba, to show how a multidisciplinary approach can be extended to new model systems to understand how diverse plant shapes can develop and evolve. Copyright © 2017 Elsevier Ltd. All rights reserved.
Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems
NASA Technical Reports Server (NTRS)
Steinthorsson, Erlendur; Modiano, David
1995-01-01
Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.
Anismus: the cause of constipation? Results of investigation and treatment.
Duthie, G S; Bartolo, D C
1992-01-01
Anismus, or failure of the somatic sphincter apparatus to relax at defecation, has been implicated as a major contributor to the problem of obstructed defecation. Current diagnostic methods depend on laboratory measurements of attempted defecation, and the most complex of these, dynamic proctography, has been the mainstay of diagnosis. Using a new computerized ambulatory method of recording sphincter function in these patients at home, we report an 80% reduction in our diagnostic rate, suggesting that conventional tests fail to diagnose this condition accurately, probably because they poorly represent the natural physiology of defecation. Treatment of this distressing condition is more complex, and a variety of surgical and pharmacological measures have failed. Biofeedback retraining of anorectal function in these patients has been very successful and represents the management of choice.
Moving into the 'patient-centred medical home': reforming Australian general practice.
Hayes, Paul; Lynch, Anthony; Stiffe, Jenni
2016-09-01
The Australian healthcare system is a complex network of services and providers funded and administered by federal, state and territory governments, supplemented by private health insurance and patient contributions. The broad geographical range, complexity and increasing demand within the Australian healthcare sector mean health expenditure is high. Aspects of current funding for the healthcare system have attracted criticism from medical practitioners, patients, representative organisations and independent statutory agencies. In response to the problems in primary care funding in Australia, The Royal Australian College of General Practitioners developed the Vision for general practice and a sustainable healthcare system (the Vision). The Vision presents a plan to improve healthcare delivery in Australia through greater quality, access and efficiency by reorienting how general practice services are funded based on the 'patient-centred medical home' model.
Search-free license plate localization based on saliency and local variance estimation
NASA Astrophysics Data System (ADS)
Safaei, Amin; Tang, H. L.; Sanei, S.
2015-02-01
In recent years, the performance and accuracy of automatic license plate number recognition (ALPR) systems have greatly improved; however, the increasing number of applications for such systems has made ALPR research more challenging than ever. The inherent computational complexity of search-dependent algorithms remains a major problem for current ALPR systems. This paper proposes a novel search-free method of localization based on the estimation of saliency and local variance. Gabor functions are then used to validate the choice of candidate license plates. The algorithm was applied to three image datasets with different levels of complexity and the results compared with a number of benchmark methods, particularly in terms of speed. The proposed method outperforms state-of-the-art methods and can be used for real-time applications.
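The local-variance ingredient of such a search-free localizer can be sketched with summed-area tables, which yield every k×k window's variance in constant time after two passes over the image; this is a generic textbook formulation, not the paper's exact estimator.

```python
def integral(img):
    """Summed-area table with a zero top row and left column."""
    h, w = len(img), len(img[0])
    s = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            s[y + 1][x + 1] = img[y][x] + s[y][x + 1] + s[y + 1][x] - s[y][x]
    return s

def window_sum(s, y, x, k):
    """Sum of the k-by-k window whose top-left pixel is (y, x)."""
    return s[y + k][x + k] - s[y][x + k] - s[y + k][x] + s[y][x]

def local_variance(img, k):
    """Variance of every k-by-k window, via tables of values and squares:
    var = E[x^2] - E[x]^2."""
    sq = [[p * p for p in row] for row in img]
    s1, s2 = integral(img), integral(sq)
    h, w, n = len(img), len(img[0]), k * k
    out = []
    for y in range(h - k + 1):
        row = []
        for x in range(w - k + 1):
            m = window_sum(s1, y, x, k) / n
            row.append(window_sum(s2, y, x, k) / n - m * m)
        out.append(row)
    return out
```

Flat background regions score zero, while textured, high-contrast regions (such as plate characters) score high, which is what makes the map useful for candidate selection.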
Understanding the determinants of problem-solving behavior in a complex environment
NASA Technical Reports Server (NTRS)
Casner, Stephen A.
1994-01-01
It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how the complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates of the probable effects of each environmental feature.
NASA Astrophysics Data System (ADS)
Ollé, Mercè; Pacha, Joan R.
1999-11-01
In the present work we use certain isolated symmetric periodic orbits found in some limiting Restricted Three-Body Problems to obtain, by numerical continuation, families of symmetric periodic orbits of the more general Spatial Elliptic Restricted Three-Body Problem. In particular, the Planar Isosceles Restricted Three-Body Problem, the Sitnikov Problem and the MacMillan Problem are considered. A stability study of the periodic orbits of the families obtained, especially focused on detecting transitions to complex instability, is also made.
Centrifugal Compressor Aeroelastic Analysis Code
NASA Astrophysics Data System (ADS)
Keith, Theo G., Jr.; Srivastava, Rakesh
2002-01-01
Centrifugal compressors are very widely used in the turbomachine industry where low mass flow rates are required. Gas turbine engines for tanks, rotorcraft and small jets rely extensively on centrifugal compressors for rugged and compact design. These compressors experience problems related to flowfield unsteadiness, such as stall flutter, separation at the trailing edge over diffuser guide vanes, and tip vortex unsteadiness, leading to rotating stall and surge. Considerable interest exists among small gas turbine engine manufacturers in understanding and eventually eliminating the problems related to centrifugal compressors. The geometric complexity of centrifugal compressor blades and the twisting of the blade passages make linear methods inapplicable. Advanced computational fluid dynamics (CFD) methods are needed for accurate unsteady aerodynamic and aeroelastic analysis of centrifugal compressors. Most current industrial turbomachines and small aircraft engines are designed with a centrifugal compressor. With such a large customer base, and NASA Glenn Research Center being the lead center for turbomachines, it is important that adequate emphasis be placed on this area as well. Currently, this activity is not supported under any project at NASA Glenn.
Towards sustainable groundwater use: Setting long-term goals, backcasting, and managing adaptively
Gleeson, T.; Alley, W.M.; Allen, D.M.; Sophocleous, M.A.; Zhou, Y.; Taniguchi, M.; Vandersteen, J.
2012-01-01
The sustainability of crucial earth resources, such as groundwater, is a critical issue. We consider groundwater sustainability a value-driven process of intra- and intergenerational equity that balances the environment, society, and economy. Synthesizing hydrogeological science and current sustainability concepts, we emphasize three sustainability approaches: setting multigenerational sustainability goals, backcasting, and managing adaptively. As most aquifer problems are long-term problems, we propose that multigenerational goals (50 to 100 years) for water quantity and quality that acknowledge the connections between groundwater, surface water, and ecosystems be set for many aquifers. The goals should be set by a watershed- or aquifer-based community in an inclusive and participatory manner. Policies for shorter time horizons should be developed by backcasting, and measures implemented through adaptive management to achieve the long-term goals. Two case histories illustrate the importance and complexity of a multigenerational perspective and adaptive management. These approaches could transform aquifer depletion and contamination to more sustainable groundwater use, providing groundwater for current and future generations while protecting ecological integrity and resilience. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
Mille, Marie-Laure; Creath, Robert A.; Prettyman, Michelle G.; Johnson Hilliard, Marjorie; Martinez, Katherine M.; MacKinnon, Colum D.; Rogers, Mark W.
2012-01-01
Disorders of posture, balance, and gait are debilitating motor manifestations of advancing Parkinson's disease requiring rehabilitation intervention. These problems often reflect difficulties with coupling or sequencing posture and locomotion during complex whole-body movements linked with falls. Considerable progress has been made in demonstrating the effectiveness of exercise interventions for individuals with Parkinson's disease. However, gaps remain in the evidence base for specific interventions and the optimal content of exercise interventions. Using a conceptual theoretical framework and experimental findings, this perspective and review advances the viewpoint that rehabilitation interventions focused on separate or isolated components of posture, balance, or gait may limit the effectiveness of current clinical practices. It is argued that treatment effectiveness may be improved by directly targeting posture and locomotion coupling problems as causal factors contributing to balance and gait dysfunction. This approach may help advance current clinical practice and improve outcomes in rehabilitation for persons with Parkinson's disease. “...postural activity should be regarded as a function in its own right and not merely as a component of movement...” (James Purdon Martin) PMID:22295253
Complex fuzzy soft expert sets
NASA Astrophysics Data System (ADS)
Selvachandran, Ganeshsree; Hafeed, Nisren A.; Salleh, Abdul Razak
2017-04-01
Complex fuzzy sets and their accompanying theory, although in their infancy, have proven to be superior to classical type-1 fuzzy sets, due to their ability to represent time-periodic problem parameters and capture the seasonality of the fuzziness that exists in the elements of a set. These are important characteristics that are pervasive in most real-world problems. However, there are two major problems inherent in complex fuzzy sets: they lack a sufficient parameterization tool, and they do not have a mechanism to validate the values assigned to the membership functions of the elements in a set. To overcome these problems, we propose the notion of complex fuzzy soft expert sets, a hybrid model of complex fuzzy sets and soft expert sets. This model incorporates the advantages of complex fuzzy sets and soft sets, and has the added advantage of allowing users to know the opinions of all the experts in a single model without the need for any additional cumbersome operations. As such, this model effectively improves the accuracy of representation of problem parameters that are periodic in nature, while having a higher level of computational efficiency than similar models in the literature.
ERIC Educational Resources Information Center
Bogard, Treavor; Liu, Min; Chiang, Yueh-hui Vanessa
2013-01-01
This multiple-case study examined how advanced learners solved a complex problem, focusing on how their frequency and application of cognitive processes contributed to differences in performance outcomes, and developing a mental model of a problem. Fifteen graduate students with backgrounds related to the problem context participated in the study.…
What Causes Care Coordination Problems? A Case for Microanalysis
Zachary, Wayne; Maulitz, Russell Charles; Zachary, Drew A.
2016-01-01
Introduction: Care coordination (CC) is an important fulcrum for pursuing a range of health care goals. Current research and policy analyses have focused on aggregated data rather than on understanding what happens within individual cases. At the case level, CC emerges as a complex network of communications among providers over time, crossing and recrossing many organizational boundaries. Micro-level analysis is needed to understand where and how CC fails, as well as to identify best practices and root causes of problems. Coordination Process Diagramming: Coordination Process Diagramming (CPD) is a new framework for representing and analyzing CC arcs at the micro level, separating an arc into its participants and roles, communication structure, organizational structures, and transitions of care, all on a common time line. Conclusion: Comparative CPD analysis across a sample of CC arcs identifies common CC problems and potential root causes, showing the potential value of the framework. The analyses also suggest intervention strategies that could be applied to attack the root causes of CC problems, including organizational changes, education and training, and additional health information technology development. PMID:27563685
Ying, Wenjun; Henriquez, Craig S
2007-04-01
A novel hybrid finite element method (FEM) for modeling the response of passive and active biological membranes to external stimuli is presented. The method is based on the differential equations that describe the conservation of electric flux and membrane currents. By introducing the electric flux through the cell membrane as an additional variable, the algorithm decouples the linear partial differential equation part from the nonlinear ordinary differential equation part that defines the membrane dynamics of interest. This conveniently results in two subproblems: a linear interface problem and a nonlinear initial value problem. The linear interface problem is solved with a hybrid FEM. The initial value problem is integrated by a standard ordinary differential equation solver such as the Euler and Runge-Kutta methods. During time integration, these two subproblems are solved alternatively. The algorithm can be used to model the interaction of stimuli with multiple cells of almost arbitrary geometries and complex ion-channel gating at the plasma membrane. Numerical experiments are presented demonstrating the uses of the method for modeling field stimulation and action potential propagation.
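The alternating solution of the two subproblems can be illustrated with a toy operator-splitting loop. The passive membrane model and the constant-current "interface solve" below are simplified stand-ins for the paper's hybrid FEM and ion-channel gating dynamics:

```python
def membrane_rhs(v, i_stim, tau=1.0, v_rest=0.0, c_m=1.0):
    """Toy passive membrane: dv/dt = -(v - v_rest)/tau + I/C.
    A stand-in for the nonlinear gating models the method supports."""
    return -(v - v_rest) / tau + i_stim / c_m

def step(v, solve_linear_interface, dt, t):
    # Subproblem 1: the linear interface solve yields the membrane current
    i_stim = solve_linear_interface(v, t)
    # Subproblem 2: advance the membrane ODE (forward Euler)
    return v + dt * membrane_rhs(v, i_stim)

# Hypothetical interface solve returning a constant stimulus current
v = 0.0
for n in range(100):
    v = step(v, lambda v, t: 1.0, dt=0.01, t=n * 0.01)
print(round(v, 3))  # relaxes toward the steady state v = tau * I / C = 1.0
```

A standard ODE integrator (Runge-Kutta, as in the abstract) would simply replace the Euler update; the structure of alternating the two subproblems per time step is unchanged.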
A network flow model for load balancing in circuit-switched multicomputers
NASA Technical Reports Server (NTRS)
Bokhari, Shahid H.
1990-01-01
In multicomputers that utilize circuit switching or wormhole routing, communication overhead depends largely on link contention - the variation due to distance between nodes is negligible. This has a major impact on the load balancing problem. In this case, there are some nodes with excess load (sources) and others with deficit load (sinks) and it is required to find a matching of sources to sinks that avoids contention. The problem is made complex by the hardwired routing on currently available machines: the user can control only which nodes communicate but not how the messages are routed. Network flow models of message flow in the mesh and the hypercube were developed to solve this problem. The crucial property of these models is the correspondence between minimum cost flows and correctly routed messages. To solve a given load balancing problem, a minimum cost flow algorithm is applied to the network. This permits one to determine efficiently a maximum contention free matching of sources to sinks which, in turn, tells one how much of the given imbalance can be eliminated without contention.
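The matching of overloaded nodes to underloaded ones can be illustrated with a toy minimum-cost assignment. A brute-force search over permutations stands in here for the minimum cost flow algorithm applied to the mesh and hypercube network models, and the contention-cost matrix is invented:

```python
from itertools import permutations

def min_cost_matching(cost):
    """Brute-force minimum-cost matching of sources to sinks.
    cost[i][j] = contention cost of routing source i's excess to sink j.
    (A stand-in for the min-cost network-flow solver in the abstract.)"""
    n = len(cost)
    best = None
    for perm in permutations(range(n)):
        c = sum(cost[i][perm[i]] for i in range(n))
        if best is None or c < best[0]:
            best = (c, perm)
    return best

# Invented costs for 3 sources (rows) and 3 sinks (columns)
cost = [
    [1, 2, 3],
    [2, 1, 3],
    [3, 2, 1],
]
total, assignment = min_cost_matching(cost)
print(total, assignment)  # 3 (0, 1, 2)
```

In the real models, the correspondence between minimum cost flows and correctly routed messages lets the same machinery also decide how much of a given imbalance can be eliminated without contention.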
Woolcock, Michael
2018-06-01
In rich and poor countries alike, a core challenge is building the state's capability for policy implementation. Delivering high-quality public health and health care-affordably, reliably and at scale, for all-exemplifies this challenge, since doing so requires deftly integrating refined technical skills (surgery), broad logistics management (supply chains, facilities maintenance), adaptive problem solving (curative care), and resolving ideological differences (who pays? who provides?), even as the prevailing health problems themselves only become more diverse, complex, and expensive as countries become more prosperous. However, the current state of state capability in developing countries is demonstrably alarming, with the strains and demands only likely to intensify in the coming decades. Prevailing "best practice" strategies for building implementation capability-copying and scaling putative successes from abroad-are too often part of the problem, while individual training ("capacity building") and technological upgrades (e.g. new management information systems) remain necessary but deeply insufficient. An alternative approach is outlined, one centered on building implementation capability by working iteratively to solve problems nominated and prioritized by local actors.
Evolution, opportunity and challenges of transboundary water and energy problems in Central Asia.
Guo, Lidan; Zhou, Haiwei; Xia, Ziqiang; Huang, Feng
2016-01-01
Central Asia is one of the regions suffering the most prominent transboundary water and energy problems in the world. Effective transboundary water-energy resource management and cooperation are closely related to socioeconomic development and stability across Central Asia. Like Central Asia, Northwest China has an arid climate, is experiencing a water shortage, and now faces an imbalanced supply-demand relationship between water and energy resources. These issues in Northwest China and Central Asia pose severe challenges to the implementation of the Silk Road Economic Belt strategy. Based on an analysis of water and energy distribution characteristics in Central Asia and the demand characteristics of different countries, the complexity of local transboundary water problems was explored by reviewing the corresponding historical problems of the countries involved, correlated energy issues, and the evolution of inter-country water-energy cooperation. With reference to the experiences and lessons of the five countries, the contradictions, opportunities, challenges and strategies for transboundary water-energy cooperation between China and Central Asia were discussed in the context of promoting the Silk Road Economic Belt, based on current cooperation conditions.
A mixed-mode crack analysis of rectilinear anisotropic solids using conservation laws of elasticity
NASA Technical Reports Server (NTRS)
Wang, S. S.; Yau, J. F.; Corten, H. T.
1980-01-01
A very simple and convenient method of analysis for studying two-dimensional mixed-mode crack problems in rectilinear anisotropic solids is presented. The analysis is formulated on the basis of the conservation laws of anisotropic elasticity and of fundamental relationships in anisotropic fracture mechanics. The problem is reduced to a system of linear algebraic equations in the mixed-mode stress intensity factors. One of the salient features of the present approach is that it determines the mixed-mode stress intensity solutions directly from conservation integrals evaluated along a path removed from the crack-tip region, without the need to solve the corresponding complex near-field boundary value problem. Several examples with solutions available in the literature are solved to verify the accuracy of the current analysis. The method is further shown to be superior to other approaches in its numerical simplicity and computational efficiency. Solutions of more complicated and practical engineering problems, involving a crack emanating from a circular hole in composites, are also presented to illustrate the capability of this method.
Students' explanations in complex learning of disciplinary programming
NASA Astrophysics Data System (ADS)
Vieira, Camilo
Computational Science and Engineering (CSE) has been called the third pillar of science and a set of important skills for solving the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative for solving complex problems that require processing large amounts of data or representing complex phenomena that are not easy to experiment with. Despite the relevance of CSE, current professionals and scientists are not well prepared to take advantage of this set of tools and methods. Computation is usually taught in isolation from the engineering disciplines, and therefore engineers do not know how to exploit CSE affordances. This dissertation introduces computational tools and methods contextualized within the Materials Science and Engineering curriculum. Considering that learning how to program is a complex task, the dissertation explores effective pedagogical practices that can support students' disciplinary and computational learning. Two case studies are evaluated to identify the characteristics of effective worked examples in the context of CSE. Specifically, this dissertation explores students' explanations of these worked examples in two engineering courses with different levels of transparency: a programming course in materials science and engineering (glass box) and a thermodynamics course involving computational representations (black box). Results from this study suggest that students benefit in different ways from writing in-code comments. These benefits include, but are not limited to: connecting individual lines of code to the overall problem, getting familiar with the syntax, learning effective algorithm design strategies, and connecting computation with their discipline. Students in the glass box context generated higher-quality explanations than students in the black box context. These explanations are related to students' prior experiences.
Specifically, students with low programming ability engaged in a more thorough explanation process than students with high ability. The dissertation concludes by proposing an adaptation of the instructional principles of worked examples for the context of CSE education.
Sabia, Michael; Hirsh, Robert A; Torjman, Marc C; Wainer, Irving W; Cooper, Niti; Domsky, Richard; Goldberg, Michael E
2011-06-01
Historically, complex regional pain syndrome (CRPS) was poorly defined, which meant that scientists and clinicians faced much uncertainty in the study, diagnosis, and treatment of the syndrome. The problem could be attributed to nonspecific diagnostic criteria, unknown pathophysiologic causes, and limited treatment options. The two forms of CRPS are still painful, debilitating disorders whose sufferers carry heavy emotional burdens. Current research has shown that CRPS I and CRPS II are distinct processes, distinguished by the presence or absence of a partial nerve lesion. Ketamine has been the focus of various studies on the treatment of CRPS; however, data from evidence-based studies are currently incomplete. The question of why ketamine controls the symptoms of a subset of patients with CRPS and not others remains to be answered. A possible explanation for this phenomenon is pharmacogenetic differences that may exist across patient populations. This review summarizes important translational work recently published on the treatment of CRPS using ketamine. © Springer Science+Business Media, LLC 2011
Merging Digital Medicine and Economics: Two Moving Averages Unlock Biosignals for Better Health.
Elgendi, Mohamed
2018-01-06
Algorithm development in digital medicine necessitates ongoing knowledge and skills updating to match the current demands and constant progression of the field. In today's chaotic world, there is an increasing trend to seek simple solutions for complex problems that can increase efficiency, reduce resource consumption, and improve scalability. This desire has spilled over into the world of science and research, where many disciplines have taken to investigating and applying more simplistic approaches. Interestingly, a review of current literature and research efforts suggests that the learning and teaching principles in digital medicine continue to push towards the development of sophisticated algorithms with a limited scope and have not fully embraced or encouraged a shift towards simpler solutions that yield equal or better results. This short note aims to demonstrate that within the world of digital medicine and engineering, simpler algorithms can offer effective and efficient solutions where traditionally more complex algorithms have been used. Moreover, the note demonstrates that bridging different research disciplines is very beneficial and yields valuable insights and results.
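The two-moving-average idea can be sketched simply: flag samples where a short moving average rises above a long one. The window lengths and the synthetic signal below are illustrative, not the published values:

```python
import math

def moving_average(x, w):
    """Centered moving average with edge truncation."""
    n = len(x)
    out = []
    for i in range(n):
        lo, hi = max(0, i - w // 2), min(n, i + w // 2 + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def two_ma_events(x, w_short=5, w_long=21):
    """Flag candidate events where the short MA exceeds the long MA."""
    ma_s = moving_average(x, w_short)
    ma_l = moving_average(x, w_long)
    return [s > l for s, l in zip(ma_s, ma_l)]

# Synthetic "biosignal": flat baseline with one broad bump at t = 0.5
x = [math.exp(-((i / 199 - 0.5) ** 2) / 0.001) for i in range(200)]
mask = two_ma_events(x)
print(mask[100], mask[10])  # True at the bump's centre, False on the baseline
```

The short average tracks the event's shape while the long average tracks the local baseline, so their crossing marks the event without any tuning beyond the two window lengths.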
Is the biggest security threat to medical information simply a lack of understanding?
Williams, Patricia A H
2011-01-01
Connecting Australian health services under the e-health initiative is a major focus in the current health environment. Many issues are presented as key to its success, including solving issues of confidentiality and privacy. However, the main problem may lie not in sharing information but in the fact that the point of origin of such records is still relatively insecure. This paper highlights why this may be the case. Research into the security of medical information has shown that many primary healthcare providers are unable to create an environment with effective information security. Numerous factors contribute to this complex situation, including a trustful environment, the resultant security culture, and the capability of individual healthcare organisations. Further, the growing importance of new directions in the use of patient information is considered. This paper discusses these issues and positions them within the complex environment that is healthcare. In our current health system infrastructure, the points of origin of patient information are the most vulnerable. This, entwined with progressively new uses of this information, exposes additional security concerns, such as re-identification of information, that require attention.
The Complex Route to Success: Complex Problem-Solving Skills in the Prediction of University Success
ERIC Educational Resources Information Center
Stadler, Matthias J.; Becker, Nicolas; Greiff, Samuel; Spinath, Frank M.
2016-01-01
Successful completion of a university degree is a complex matter. Based on considerations regarding the demands of acquiring a university degree, the aim of this paper was to investigate the utility of complex problem-solving (CPS) skills in the prediction of objective and subjective university success (SUS). The key finding of this study was that…
Efficient solvers for coupled models in respiratory mechanics.
Verdugo, Francesc; Roth, Christian J; Yoshihara, Lena; Wall, Wolfgang A
2017-02-01
We present efficient preconditioners for one of the most physiologically relevant pulmonary models currently available. Our underlying motivation is to enable the efficient simulation of such a lung model on high-performance computing platforms in order to assess mechanical ventilation strategies and contribute to the design of more protective patient-specific ventilation treatments. The system of linear equations to be solved using the proposed preconditioners is essentially the monolithic system arising in fluid-structure interaction (FSI) extended by additional algebraic constraints. The introduction of these constraints leads to a saddle point problem that cannot be solved with the usual FSI preconditioners available in the literature. The key ingredient in this work is to use the idea of the semi-implicit method for pressure-linked equations (SIMPLE) to eliminate the saddle point structure, resulting in a standard FSI problem that can be treated with available techniques. The numerical examples show that the resulting preconditioners approach the optimal performance of multigrid methods, even though the lung model is a complex multiphysics problem. Moreover, the preconditioners are robust enough to deal with physiologically relevant simulations involving complex real-world patient-specific lung geometries. The same approach is applicable to other challenging biomedical applications where coupling between flow and tissue deformations is modeled with additional algebraic constraints. Copyright © 2016 John Wiley & Sons, Ltd.
Enhancements of evolutionary algorithm for the complex requirements of a nurse scheduling problem
NASA Astrophysics Data System (ADS)
Tein, Lim Huai; Ramli, Razamin
2014-12-01
Over the years, nurse scheduling has been a noticeable problem, aggravated by the global nurse turnover crisis: the more dissatisfied nurses are with their working environment, the more likely they are to leave. Undesirable work schedules are partly responsible for that working condition, and there is a lack of complementary requirements between the head nurse's liability and the nurses' needs. In particular, because nurse preferences weigh heavily, the challenge in nurse scheduling is to encourage tolerance between both parties during shift assignment in real working scenarios. Flexibility in shift assignment is hard to achieve while satisfying diverse nurse requests and upholding imperative ward coverage. Hence, an Evolutionary Algorithm (EA) is proposed to cater for this complexity in a nurse scheduling problem (NSP). The restrictions of the EA are discussed, and enhancements to the EA operators are suggested so that the EA has the characteristics of a flexible search. This paper considers three types of constraints, namely hard, semi-hard and soft constraints, which are handled by the EA with enhanced parent selection and specialized mutation operators. These operators, and the EA as a whole, contribute to efficient constraint handling and fitness computation, as well as flexibility in the search, corresponding to the principles of exploration and exploitation.
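A minimal EA for a toy NSP can illustrate the penalty-weighted handling of hard and soft constraints and a specialized (slot-level) mutation operator. The roster dimensions, constraint weights, and selection scheme below are invented for illustration and are much simpler than the paper's:

```python
import random

random.seed(0)
NURSES, DAYS, SHIFTS = 4, 7, 3  # shift 0=morning, 1=evening, 2=off

def fitness(roster):
    """Penalty-based fitness: hard constraints (ward coverage) weighted
    far above soft ones (workload preference). Weights are illustrative."""
    penalty = 0
    for d in range(DAYS):
        working = sum(1 for n in range(NURSES) if roster[n][d] != 2)
        if working < 2:                 # hard: at least 2 nurses per day
            penalty += 100 * (2 - working)
    for n in range(NURSES):             # soft: at most 5 working days
        days_on = sum(1 for d in range(DAYS) if roster[n][d] != 2)
        penalty += max(0, days_on - 5)
    return -penalty

def mutate(roster):
    # Specialized mutation: reassign a single nurse-day slot
    r = [row[:] for row in roster]
    n, d = random.randrange(NURSES), random.randrange(DAYS)
    r[n][d] = random.randrange(SHIFTS)
    return r

pop = [[[random.randrange(SHIFTS) for _ in range(DAYS)]
        for _ in range(NURSES)] for _ in range(20)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]
best = max(pop, key=fitness)
print(fitness(best))
```

Crossover and the enhanced parent selection of the paper are omitted; elitist truncation plus mutation is enough to show how the weighted penalties steer the search toward feasible, preference-respecting rosters.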
Stochastic volatility models and Kelvin waves
NASA Astrophysics Data System (ADS)
Lipton, Alex; Sepp, Artur
2008-08-01
We use stochastic volatility models to describe the evolution of an asset price, its instantaneous volatility and its realized volatility. In particular, we concentrate on the Stein and Stein model (SSM) (1991) for the stochastic asset volatility and the Heston model (HM) (1993) for the stochastic asset variance. By construction, the volatility is not sign definite in SSM and is non-negative in HM. It is well known that both models produce closed-form expressions for the prices of vanilla options via the Lewis-Lipton formula. However, the numerical pricing of exotic options by means of finite difference and Monte Carlo methods is much more complex for HM than for SSM. Until now, this complexity was considered an acceptable price to pay for ensuring that the asset volatility is non-negative. We argue that negative stochastic volatility is a psychological rather than a financial or mathematical problem, and advocate using SSM rather than HM in most applications. We extend SSM by adding volatility jumps and obtain a closed-form expression for the density of the asset price and its realized volatility. We also show that the current method of choice for solving pricing problems with stochastic volatility (via the affine ansatz for the Fourier-transformed density function) can be traced back to the Kelvin method designed in the 19th century for studying wave motion problems arising in fluid dynamics.
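An Euler Monte Carlo path of the Stein and Stein model illustrates the point about sign: the Ornstein-Uhlenbeck volatility may go negative without harming the price dynamics. All parameter values below are illustrative:

```python
import math
import random

random.seed(1)

def simulate_ssm(s0=100.0, v0=0.2, kappa=2.0, theta=0.2, eta=0.3,
                 r=0.0, t=1.0, n_steps=252):
    """One Euler path of the Stein-Stein model: the instantaneous
    volatility follows an Ornstein-Uhlenbeck process and is free to
    change sign, as the abstract notes."""
    dt = t / n_steps
    s, v = s0, v0
    for _ in range(n_steps):
        dw1 = random.gauss(0.0, math.sqrt(dt))
        dw2 = random.gauss(0.0, math.sqrt(dt))
        s += r * s * dt + v * s * dw1          # price uses signed volatility
        v += kappa * (theta - v) * dt + eta * dw2  # OU: not sign definite
    return s, v

s_t, v_t = simulate_ssm()
print(s_t, v_t)
```

Only v^2 enters the variance of the price increments, which is why a sign-changing volatility is harmless in pricing, in line with the authors' argument for preferring SSM in simulation-based methods.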
Advanced Computational Aeroacoustics Methods for Fan Noise Prediction
NASA Technical Reports Server (NTRS)
Envia, Edmane (Technical Monitor); Tam, Christopher
2003-01-01
Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence the intensity of the radiated noise, so the precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry: unstructured grids and body-fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance, and for acoustic computation it is not clear that the currently used data transfer methods are sufficiently accurate not to contaminate the very small amplitude acoustic disturbances. In CFD, low order schemes are invariably used in conjunction with unstructured grids; however, low order schemes are known to be numerically dispersive and dissipative, and such errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high order unstructured grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A DRP scheme on an unstructured grid has been developed; it is constructed in the wave number space, and its characteristics can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated and can be improved by adopting an upwinding strategy.
Problems in particle theory. Technical report - 1993--1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adler, S.L.; Wilczek, F.
This report is a progress report on the work of two principal investigators in the broad area of particle physics theory, covering their personal work, that of their coworkers, and their proposed work for the future. One author has worked in the past on various topics in field theory and particle physics, among them current algebras, the physics of neutrino induced reactions, quantum electrodynamics (including strong magnetic field processes), the theory of the axial-vector current anomaly, topics in quantum gravity, and nonlinear models for quark confinement. While much of his work has been analytical, all of the projects listed above (except for the work on gravity) had phases which required considerable computer work as well. Over the next several years, he proposes to continue or initiate research on the following problems: (1) Acceleration algorithms for the Monte Carlo analysis of lattice field and gauge theories, and more generally, new research in computational neuroscience and pattern recognition. (2) Construction of quaternionic generalizations of complex quantum mechanics and field theory, and their application to composite models of quarks and leptons, and to the problem of unifying quantum theories of matter with general relativity. The other author has worked on problems in exotic quantum statistics and its applications to condensed matter systems. His work has also continued on the quantum theory of black holes. This has evolved toward understanding properties of quantum field theory and string theory in incomplete regions of flat space.
Rogers, S J; Parcel, T L; Menaghan, E G
1991-06-01
We assess the impact of maternal sense of mastery and maternal working conditions on maternal perceptions of children's behavior problems as a means to study the transmission of social control across generations. We use a sample of 521 employed mothers and their four- to six-year-old children from the National Longitudinal Survey's Youth Cohort in 1986. Regarding working conditions, we consider the mother's hourly wage, work hours, and job content, including involvement with things (vs. people), the requisite level of physical activity, and occupational complexity. We also consider maternal and child background and current family characteristics, including marital status, family size, and home environment. Maternal mastery was related to fewer reported behavior problems among children. Lower involvement with people and higher involvement with things, as well as low physical activity, were significantly related to higher levels of perceived problems. In addition, recent changes in maternal marital status, including maternal marriage or remarriage, increased reports of problems; stronger home environments had the opposite effect. We interpret these findings as suggesting how maternal experiences of control in the workplace and personal resources of control can influence the internalization of control in children.
Development of Six Sigma methodology for CNC milling process improvements
NASA Astrophysics Data System (ADS)
Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab
2017-10-01
Quality and productivity play an important role in any organization, especially in manufacturing sectors, where they drive the profit that leads to a company's success. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves identifying problems in the production of the “Khufi” product and proposing an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem identified relates to the function of the product: the parts cannot be assembled properly because a product dimension is out of specification. Six Sigma was used as the methodology to study and improve the identified problems. Six Sigma is a highly statistical, data-driven approach to solving complex business problems. It uses a methodical five-phase approach of define, measure, analyze, improve and control (DMAIC) to understand the process and the variables that affect it, so that the process can be optimized. Finally, the root cause of and solution to the “Khufi” production problem were identified and implemented, and the product subsequently met the fitting specification.
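In the measure and analyze phases of DMAIC, capability indices such as Cp and Cpk quantify how well a dimension stays within specification. A minimal sketch with hypothetical milled-dimension data and spec limits (not the paper's measurements):

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp and Cpk, the capability indices used in Six Sigma's
    measure/analyze phases. Cp compares spec width to process spread;
    Cpk also penalizes an off-centre mean."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical milled-dimension measurements (mm) against a 10.0 +/- 0.1 spec
dims = [10.02, 9.98, 10.01, 10.03, 9.99, 10.00, 10.02, 9.97, 10.01, 10.00]
cp, cpk = process_capability(dims, lsl=9.9, usl=10.1)
print(cp, cpk)
```

A Cpk well below Cp signals a centering problem rather than excessive spread, which is the kind of diagnosis that feeds the improve phase.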
Chen, Qing; Xu, Pengfei; Liu, Wenzhong
2016-01-01
Computer vision, as a fast, low-cost, noncontact, online monitoring technology, has become an important tool for inspecting product quality, particularly on large-scale assembly production lines. However, current industrial vision systems are far from satisfactory in the intelligent perception of complex grain images, which comprise a large number of locally homogeneous fragmentations or patches without a distinct foreground and background. We attempt to solve this problem through statistical modeling of the spatial structures of grain images. We first present a physical explanation indicating that the spatial structures of complex grain images follow a representative Weibull distribution, according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on a sparse multikernel least squares support vector machine is proposed to solve the low-confidence classification problem posed by imbalanced data distributions. The proposed method is applied on the assembly line of a food-processing enterprise to classify (or identify) automatically the production quality of rice. Experiments on this real application case, compared with commonly used methods, illustrate the validity of our method. PMID:26986726
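The claim that spatial-structure statistics follow a Weibull distribution can be checked with a probability-plot fit of the shape parameter. The least-squares estimator and the synthetic "filter response" data below are a simplified stand-in for fitting the paper's multiscale Gaussian derivative responses:

```python
import math
import random

random.seed(0)

def fit_weibull_shape(samples):
    """Estimate the Weibull shape parameter k by least squares on the
    linearized empirical CDF: ln(-ln(1-F)) = k*ln(x) - k*ln(lam)."""
    xs = sorted(samples)
    n = len(xs)
    pts = [(math.log(x), math.log(-math.log(1 - (i + 0.5) / n)))
           for i, x in enumerate(xs) if x > 0]
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    return (sum((px - mx) * (py - my) for px, py in pts)
            / sum((px - mx) ** 2 for px, py in pts))

# Synthetic "gradient magnitudes" drawn from a Weibull with shape 1.5
data = [random.weibullvariate(1.0, 1.5) for _ in range(5000)]
k_hat = fit_weibull_shape(data)
print(round(k_hat, 2))  # close to the true shape 1.5
```

On real grain images, the same fit applied to filter-response magnitudes would test how closely the sequential-fragmentation prediction holds.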
Integrating complexity into data-driven multi-hazard supply chain network strategies
Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.
2013-01-01
Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium- to long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimulus from its environment. CAS modeling is an effective method of managing the complexities associated with SCN restoration after large-scale disasters. In order to populate the data space, large data sets are required; currently, access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build a SCN restoration model, examine the inherent problems associated with these data, and understand the complexity that arises from integrating them.
Expert systems for superalloy studies
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Kaukler, William F.
1990-01-01
There are many areas in science and engineering which require knowledge of an extremely complex foundation of experimental results in order to design methodologies for developing new materials or products. Superalloys fit well into this discussion in the sense that they are complex combinations of elements which exhibit certain characteristics. Obviously, the use of superalloys in high performance, high temperature systems such as the Space Shuttle Main Engine is of interest to NASA. The superalloy manufacturing process is complex, and implementing an expert system within the design process requires some thought as to how and where it should be done. A major motivation is to develop a methodology to assist metallurgists in the design of superalloy materials using current expert systems technology. Hydrogen embrittlement is disastrous to rocket engines, and the relevant heuristics can be very complex. Attacking this problem as one module in the overall design process represents a significant step forward. Reflecting the objectives of the first phase of implementation, the expert system was designated the Hydrogen Environment Embrittlement Expert System (HEEES).
A Java application for tissue section image analysis.
Kamalov, R; Guillaud, M; Haskins, D; Harrison, A; Kemp, R; Chiu, D; Follen, M; MacAulay, C
2005-02-01
The medical industry has taken advantage of Java and Java technologies over the past few years, in large part due to the language's platform-independence and object-oriented structure. As such, Java provides powerful and effective tools for developing tissue section analysis software. The background and execution of this development are discussed in this publication. Object-oriented structure allows for the creation of "Slide", "Unit", and "Cell" objects to simulate the corresponding real-world objects. Different functions may then be created to perform various tasks on these objects, thus facilitating the development of the software package as a whole. At the current time, substantial parts of the initially planned functionality have been implemented. Getafics 1.0 is fully operational and currently supports a variety of research projects; however, there are certain features of the software that currently introduce unnecessary complexity and inefficiency. In the future, we hope to include features that obviate these problems.
Information processing of earth resources data
NASA Technical Reports Server (NTRS)
Zobrist, A. L.; Bryant, N. A.
1982-01-01
Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.
Methods and compositions for efficient nucleic acid sequencing
Drmanac, Radoje
2006-07-04
Disclosed are novel methods and compositions for rapid and highly efficient nucleic acid sequencing based upon hybridization with two sets of small oligonucleotide probes of known sequences. Extremely large nucleic acid molecules, including chromosomes and non-amplified RNA, may be sequenced without prior cloning or subcloning steps. The methods of the invention also solve various current problems associated with sequencing technology such as, for example, high noise to signal ratios and difficult discrimination, attaching many nucleic acid fragments to a surface, preparing many, longer or more complex probes and labelling more species.
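Sequencing by hybridization reconstructs a sequence from the set of short probes that bind it. A toy greedy reconstruction, assuming each k-mer occurs exactly once and the overlaps form a simple chain (real probe data are far noisier, which is the discrimination problem the patent addresses):

```python
def assemble_from_probes(kmers):
    """Greedy chain reconstruction from the k-mer probes that hybridized.
    Toy version: assumes each k-mer occurs once and overlaps are unambiguous."""
    k = len(next(iter(kmers)))
    suffixes = {km[1:] for km in kmers}
    # The starting k-mer is the one whose prefix is no other k-mer's suffix
    start = next(km for km in kmers if km[:-1] not in suffixes)
    seq = start
    remaining = set(kmers) - {start}
    while remaining:
        nxt = next(km for km in remaining if km[:-1] == seq[-(k - 1):])
        seq += nxt[-1]
        remaining.remove(nxt)
    return seq

probes = {"GATT", "ATTA", "TTAC", "TACA"}  # the 4-mers of "GATTACA"
print(assemble_from_probes(probes))  # GATTACA
```

Each probe extends the growing sequence by one base via its (k-1)-base overlap, which is why hybridization with short probes of known sequence suffices to read a much longer molecule.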
Methods and compositions for efficient nucleic acid sequencing
Drmanac, Radoje
2002-01-01
Disclosed are novel methods and compositions for rapid and highly efficient nucleic acid sequencing based upon hybridization with two sets of small oligonucleotide probes of known sequences. Extremely large nucleic acid molecules, including chromosomes and non-amplified RNA, may be sequenced without prior cloning or subcloning steps. The methods of the invention also solve various current problems associated with sequencing technology such as, for example, high noise to signal ratios and difficult discrimination, attaching many nucleic acid fragments to a surface, preparing many, longer or more complex probes and labelling more species.
Safe, Healthy Birth: What Every Pregnant Woman Needs to Know
Lothian, Judith A.
2009-01-01
In spite of technology and medical science's ability to manage complex health problems, the current maternity care environment has increased risks for healthy women and their babies. It comes as a surprise to most women that standard maternity care does not reflect best scientific evidence. In this column, evidence-based maternity care practices are discussed with an emphasis on the practices that increase safety for mother and baby, and what pregnant women need to know in order to have safe, healthy births is described. PMID:19750214
Bria, W F
1993-11-01
We have discussed several important transitions now occurring in PCIS that promise to improve the utility and availability of these systems for the average physician. Charles Babbage conceived the first computers as "thinking machines" that would extend our ability to grapple with ever more complex problems. If current trends continue, we will finally witness the evolution of patient care computing from information icons of the few into clinical instruments that improve the quality of medical decision making and care for all patients.
Magnetic resonance imaging of pelvic floor dysfunction.
Lalwani, Neeraj; Moshiri, Mariam; Lee, Jean H; Bhargava, Puneet; Dighe, Manjiri K
2013-11-01
Pelvic floor dysfunction is largely a complex problem of multiparous and postmenopausal women and is associated with pelvic floor or organ descent. Physical examination can underestimate the extent of the dysfunction and misdiagnose the disorders. Functional magnetic resonance (MR) imaging is emerging as a promising tool to evaluate the dynamics of the pelvic floor and use for surgical triage and operative planning. This article reviews the anatomy and pathology of pelvic floor dysfunction, typical imaging findings, and the current role of functional MR imaging. Copyright © 2013 Elsevier Inc. All rights reserved.
Swat, M J; Moodie, S; Wimalaratne, S M; Kristensen, N R; Lavielle, M; Mari, A; Magni, P; Smith, M K; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, A C; Kaye, R; Keizer, R; Kloft, C; Kok, J N; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, H B; Parra-Guillen, Z P; Plan, E; Ribba, B; Smith, G; Trocóniz, I F; Yvon, F; Milligan, P A; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N
2015-06-01
The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps.
Swat, MJ; Moodie, S; Wimalaratne, SM; Kristensen, NR; Lavielle, M; Mari, A; Magni, P; Smith, MK; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, AC; Kaye, R; Keizer, R; Kloft, C; Kok, JN; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, HB; Parra-Guillen, ZP; Plan, E; Ribba, B; Smith, G; Trocóniz, IF; Yvon, F; Milligan, PA; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N
2015-01-01
The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps. PMID:26225259
NASA Technical Reports Server (NTRS)
Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.
1987-01-01
The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large across the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling-direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.
Allen, D
1999-01-01
An update is provided on the barriers confronting the development of an effective HIV vaccine. These issues include political and organizational problems, inadequate research funding, pharmaceutical company reluctance to do vaccine research, and the scientific and testing complexities that must be overcome. Two preventive vaccines (Wyeth-Ayerst DNA and AIDSVAX), and two treatment vaccines (Wyeth-Ayerst DNA and Remune) currently in human trials in the United States are described, along with the rationale behind them.
Rohan, Kelly J.
2005-01-01
Seasonal affective disorder (SAD), characterized by fall/winter major depression with spring/summer remission, is a prevalent mental health problem. SAD etiology is not certain, but available models focus on neurotransmitters, hormones, circadian rhythm dysregulation, genetic polymorphisms, and psychological factors. Light therapy is established as the best available treatment for SAD. Alternative and/or supplementary approaches involving medications, cognitive-behavioral therapy, and exercise are currently being developed and evaluated. Given the complexity of the disorder, interdisciplinary research stands to make a significant contribution to advancing our understanding of SAD conceptualization and treatment. PMID:21179639
Yang, Caijun; Wu, Lina; Cai, Wenfang; Zhu, Wenwen; Shen, Qian; Li, Zongjie; Fang, Yu
2016-01-01
Drug shortages are a complex global problem. The aim of this study was to analyze, characterize, and assess drug shortages, and to identify possible solutions, in Shaanxi Province, western China. A qualitative methodological approach was conducted during May-June 2015 and December 2015-January 2016. Semi-structured interviews were performed to gather information from representatives of hospital pharmacists, wholesalers, pharmaceutical producers, and local health authorities. Thirty participants took part in the study. Eight traditional Chinese medicines and 87 types of biologicals and chemicals were reported to be in short supply. Most were essential medicines. Five main determinants of drug shortages were detected: prices set too low, low market demand, Good Manufacturing Practice (GMP) issues, raw-material issues, and approval issues for imported drugs. Five different solutions were proposed by the participants: 1) let the market decide the drug price; 2) establish an information platform; 3) establish a reserve system; 4) enhance communication among the three parties in the supply chain; and 5) improve hospital inventory management. Western China is currently experiencing a serious drug shortage, and numerous reasons for it were identified; most shortages in China currently stem from prices that are set too low. To solve this problem, all of the stakeholders, especially the government, need to participate in managing drug shortages.
Current state of herbicides in herbicide-resistant crops.
Green, Jerry M
2014-09-01
Current herbicide and herbicide trait practices are changing in response to the rapid spread of glyphosate-resistant weeds. Growers urgently needed glyphosate when glyphosate-resistant crops became available because weeds were becoming widely resistant to most commonly used selective herbicides, making weed management too complex and time consuming for large farm operations. Glyphosate made weed management easy and efficient by controlling all emerged weeds at a wide range of application timings. However, the intensive use of glyphosate over wide areas and concomitant decline in the use of other herbicides led eventually to the widespread evolution of weeds resistant to glyphosate. Today, weeds that are resistant to glyphosate and other herbicide types are threatening current crop production practices. Unfortunately, all commercial herbicide modes of action are over 20 years old and have resistant weed problems. The severity of the problem has prompted the renewal of efforts to discover new weed management technologies. One technology will be a new generation of crops with resistance to glyphosate, glufosinate and other existing herbicide modes of action. Other technologies will include new chemical, biological, cultural and mechanical methods for weed management. From the onset of commercialization, growers must now preserve the utility of new technologies by integrating their use with other weed management technologies in diverse and sustainable systems. © 2014 Society of Chemical Industry.
Stamovlasis, Dimitrios; Tsaparlis, Georgios
2003-07-01
The present study examines the role of limited human channel capacity from a science education perspective. A model of science problem solving has previously been validated by applying concepts and tools of complexity theory (the working-memory random-walk method). The method correlated the subjects' rank-order achievement scores in organic-synthesis chemistry problems with the subjects' working memory capacity. In this work, we apply the same nonlinear approach to a different data set, taken from chemical-equilibrium problem solving. In contrast to the organic-synthesis problems, these problems are algorithmic, require numerical calculations, and have a complex logical structure. As a result, these problems cause deviations from the model and affect the pattern observed with the nonlinear method. In addition to Baddeley's working memory capacity, Pascual-Leone's mental (M-) capacity is examined by the same random-walk method. As the complexity of the problem increases, the fractal dimension of the working memory random walk shows a sudden drop, while the fractal dimension of the M-capacity random walk decreases linearly. A review of the basic features of the two capacities and their relation is included. The method and findings have consequences for problem solving not only in chemistry and science education, but also in other disciplines.
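The random-walk construction named in the abstract can be sketched in a few lines. The mapping of outcomes to ±1 steps and the variance-scaling estimator of the fractal dimension below are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def score_walk(outcomes):
    """Map a binary success/failure sequence onto a cumulative random walk."""
    steps = np.where(np.asarray(outcomes) > 0, 1, -1)
    return np.cumsum(steps)

def hurst_exponent(walk, max_lag=50):
    """Estimate the Hurst exponent H from the variance of increments:
    var(tau) ~ tau^(2H).  The fractal dimension of the walk is D = 2 - H,
    so a drop in D corresponds to a rise in H (more persistent walks).
    """
    lags = np.arange(1, max_lag + 1)
    variances = [np.var(walk[lag:] - walk[:-lag]) for lag in lags]
    slope = np.polyfit(np.log(lags), np.log(variances), 1)[0]
    return slope / 2.0
```

An uncorrelated outcome sequence yields H near 0.5 (D near 1.5); streaky, persistent performance pushes H above 0.5, which is one way a sudden drop in fractal dimension with problem complexity can be read.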
Gakh, Andrei A.; Sachleben, Richard A.; Bryan, Jeff C.
1997-11-01
The race to create smaller devices is fueling much of the research in electronics. The competition has intensified with the advent of microelectromechanical systems (MEMS), in which miniaturization is already reaching the dimensional limits imposed by the physics of current lithographic techniques. Also, in the realm of biochemistry, evidence is accumulating that certain enzyme complexes are capable of very sophisticated modes of motion. Complex synergistic biochemical assemblies driven by sophisticated biomechanical processes are quite common. Their biochemical functions are based on the interplay of mechanical and chemical processes, including allosteric effects, and the complexity of this interplay far exceeds that of typical chemical reactions. Understanding the behavior of artificial molecular devices, as well as of complex natural molecular biomechanical systems, is difficult. Fortunately, the problem can be addressed by direct molecular engineering of simple molecular systems that mimic desired mechanical or electronic devices. These molecular systems are called technomimetics (the name is derived, by analogy, from biomimetics). Several classes of molecular systems that can mimic mechanical, electronic, or other features of macroscopic devices have been successfully synthesized by conventional chemical methods during the past two decades. In this article we discuss only one class of such model devices: molecular gearing systems.
Biclustering Protein Complex Interactions with a Biclique FindingAlgorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Chris; Zhang, Anne Ya; Holbrook, Stephen
2006-12-01
Biclustering has many applications in text mining, web clickstream mining, and bioinformatics. When data entries are binary, the tightest biclusters become bicliques. We propose a flexible and highly efficient algorithm to compute bicliques. We first generalize the Motzkin-Straus formalism for computing the maximal clique from an L1 constraint to an Lp constraint, which enables us to provide a generalized Motzkin-Straus formalism for computing maximal-edge bicliques. By adjusting parameters, the algorithm can favor biclusters with more rows and fewer columns, or vice versa, increasing the flexibility of the targeted biclusters. We then propose an algorithm to solve the generalized Motzkin-Straus optimization problem. The algorithm is provably convergent and has a computational complexity of O(|E|), where |E| is the number of edges. It relies on a matrix-vector multiplication and runs efficiently on most current computer architectures. Using this algorithm, we bicluster the yeast protein complex interaction network. We find that biclustering protein complexes at the protein level does not clearly reflect the functional linkage among protein complexes in many cases, while biclustering at the protein domain level can reveal many underlying linkages. We show several new biologically significant results.
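The iteration at the heart of such a method can be illustrated for the p = 2 special case, where the relaxation reduces to a HITS-style alternating power iteration; this is a simplified sketch under that assumption, not the paper's generalized Lp algorithm:

```python
import numpy as np

def bicluster_scores(A, iters=50):
    """Alternating power iteration for the p = 2 case of the biclique
    relaxation: maximize x.T @ A @ y subject to unit-norm x and y.
    Each step is a single matrix-vector product, so the per-iteration
    cost is O(|E|) for a sparse binary adjacency matrix A.
    """
    m, n = A.shape
    x, y = np.ones(m), np.ones(n)
    for _ in range(iters):
        x = A @ y
        x /= np.linalg.norm(x)
        y = A.T @ x
        y /= np.linalg.norm(y)
    # Large entries of x (rows) and y (columns) mark a dense bicluster.
    return x, y
```

On a binary matrix with a planted all-ones block, the iteration concentrates the row and column scores on that block, which is the sense in which thresholding the scores recovers a biclique.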
NASA Astrophysics Data System (ADS)
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-12-01
We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics major students from a second year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students' mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students' simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students' formulation and combination of equations. Several reasons may explain this difference, including the students' different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.
What happens in Josephson junctions at high critical current densities
NASA Astrophysics Data System (ADS)
Massarotti, D.; Stornaiuolo, D.; Lucignano, P.; Caruso, R.; Galletti, L.; Montemurro, D.; Jouault, B.; Campagnano, G.; Arani, H. F.; Longobardi, L.; Parlato, L.; Pepe, G. P.; Rotoli, G.; Tagliacozzo, A.; Lombardi, F.; Tafuri, F.
2017-07-01
The impressive advances in materials science and nanotechnology are increasingly promoting the use of exotic barriers and/or superconductors, thus paving the way to new families of Josephson junctions. Semiconducting, ferromagnetic, topological-insulator and graphene barriers are leading to unconventional and anomalous aspects of the Josephson coupling, which may help address some key problems in solid state physics. However, the complexity of the layout and of the competing physical processes occurring in the junctions is posing novel questions about the interpretation of their phenomenology. We classify some significant behaviors of hybrid and unconventional junctions in terms of their first imprinting, i.e., current-voltage curves, and propose a phenomenological approach to describe some features of junctions characterized by relatively high critical current densities Jc. Accurate arguments on the distribution of switching currents provide quantitative criteria for understanding the physical processes occurring in high-Jc junctions. These notions are universal and apply to all kinds of junctions.
ERIC Educational Resources Information Center
Hay, M. Cameron
2017-01-01
Undergraduate student learning focuses on the development of disciplinary strength in majors and minors so that students gain depth in particular fields, foster individual expertise, and learn problem solving from disciplinary perspectives. However, the complexities of real-world problems do not respect disciplinary boundaries. Complex problems…
The Process of Solving Complex Problems
ERIC Educational Resources Information Center
Fischer, Andreas; Greiff, Samuel; Funke, Joachim
2012-01-01
This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…
Communities of Practice: A New Approach to Solving Complex Educational Problems
ERIC Educational Resources Information Center
Cashman, J.; Linehan, P.; Rosser, M.
2007-01-01
Communities of Practice offer state agency personnel a promising approach for engaging stakeholder groups in collaboratively solving complex and, often, persistent problems in special education. Communities of Practice can help state agency personnel drive strategy, solve problems, promote the spread of best practices, develop members'…
6 Essential Questions for Problem Solving
ERIC Educational Resources Information Center
Kress, Nancy Emerson
2017-01-01
One of the primary expectations that the author has for her students is for them to develop greater independence when solving complex and unique mathematical problems. The story of how the author supports her students as they gain confidence and independence with complex and unique problem-solving tasks, while honoring their expectations with…
Students' and Teachers' Conceptual Metaphors for Mathematical Problem Solving
ERIC Educational Resources Information Center
Yee, Sean P.
2017-01-01
Metaphors are regularly used by mathematics teachers to relate difficult or complex concepts in classrooms. A complex topic of concern in mathematics education, and most STEM-based education classes, is problem solving. This study identified how students and teachers contextualize mathematical problem solving through their choice of metaphors.…
ERIC Educational Resources Information Center
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-01-01
We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking,…
Grim, K.C.; Fairbrother, A.; Monfort, S.; Tan, S.; Rattner, B.A.; Gerould, S.; Beasley, V.; Aguirre, A.; Rowles, T.
2007-01-01
On March 13-15, 2007, nearly 50 scientists and administrators from the US and Canada participated in a Smithsonian-sponsored Wildlife Toxicology Workshop. Invitees were from academic, government, conservation, and private organizations and were selected to represent the diverse disciplines that encompass wildlife toxicology. The workshop addressed scientific and policy issues, strengths and weaknesses of current research strategies, interdisciplinary and science-based approaches to the study of complex contaminant issues, mechanisms for disseminating data to policy-makers, and the development of a partner network to meet the challenges facing wildlife toxicology over the next decade. Prior to the meeting, participants were asked to submit the issues they deemed of highest concern, which shaped four thematic groups for discussion: Wildlife Toxicology in Education, Risk Assessment, Multiple Stressors/Complex Mixtures, and Sub-Lethal to Population-Level Effects. From these discussion groups, 18 problem statements were developed and prioritized, outlining what were deemed the most important issues to address now and in the future. Along with each problem statement, participants developed potential solutions and action steps geared to move each issue forward. The workshop served as a stepping stone for action in the field of wildlife toxicology. These problem statements and the resulting action items are presented to the interdisciplinary wildlife toxicology community for adoption, and future work and action items in these areas are encouraged. The workshop outcome aims to generate conversation and collaboration that will lead to the development of innovative research, future mechanisms for funding, workshops, working groups, and listservs within the wildlife toxicology community.
The effects of monitoring environment on problem-solving performance.
Laird, Brian K; Bailey, Charles D; Hester, Kim
2018-01-01
While effective and efficient solving of everyday problems is important in business domains, little is known about the effects of workplace monitoring on problem-solving performance. In a laboratory experiment, we explored the monitoring environment's effects on an individual's propensity to (1) establish pattern solutions to problems, (2) recognize when pattern solutions are no longer efficient, and (3) solve complex problems. Under three work monitoring regimes-no monitoring, human monitoring, and electronic monitoring-114 participants solved puzzles for monetary rewards. Based on research related to worker autonomy and theory of social facilitation, we hypothesized that monitored (versus non-monitored) participants would (1) have more difficulty finding a pattern solution, (2) more often fail to recognize when the pattern solution is no longer efficient, and (3) solve fewer complex problems. Our results support the first two hypotheses, but in complex problem solving, an interaction was found between self-assessed ability and the monitoring environment.
ERIC Educational Resources Information Center
de Leeuw, L.
Sixty-four fifth and sixth-grade pupils were taught number series extrapolation by either an algorithm, fully prescribed problem-solving method or a heuristic, less prescribed method. The trained problems were within categories of two degrees of complexity. There were 16 subjects in each cell of the 2 by 2 design used. Aptitude Treatment…
Merkes, Monika; Lewis, Virginia; Canaway, Rachel
2010-12-03
The co-occurrence of mental illness and substance use problems (referred to as "comorbidity" in this paper) is common, and is often reported by service providers as the expectation rather than the exception. Despite this, many different treatment service models are being used in the alcohol and other drugs (AOD) and mental health (MH) sectors to treat this complex client group. While there is abundant literature in the area of comorbidity treatment, no agreed overarching framework to describe the range of service delivery models is apparent internationally or at the national level. The aims of the current research were to identify and describe elements of good practice in current service models of treatment of comorbidity in Australia. The focus of the research was on models of service delivery. The research did not aim to measure the client outcomes achieved by individual treatment services, but sought to identify elements of good practice in services. Australian treatment services were identified to take part in the study through a process of expert consultation. The intent was to look for similarities in the delivery models being implemented across a diverse set of services that were perceived to be providing good quality treatment for people with comorbidity problems. A survey was designed based on a concept map of service delivery devised from a literature review. Seventeen Australian treatment services participated in the survey, which explored the context in which services operate, inputs such as organisational philosophy and service structure, policies and procedures that guide the way in which treatment is delivered by the service, practices that reflect the way treatment is provided to clients, and client impacts. The treatment of people with comorbidity of mental health and substance use disorders presents complex problems that require strong but flexible service models. 
While the treatment services included in this study reflected the diversity of settings and approaches described in the literature, the research found that they shared a range of common characteristics. These referred to: service linkages; workforce; policies, procedures and practices; and treatment.
Identification of QRS complex in non-stationary electrocardiogram of sick infants.
Kota, S; Swisher, C B; Al-Shargabi, T; Andescavage, N; du Plessis, A; Govindan, R B
2017-08-01
Due to the high frequency of routine interventions in an intensive care setting, electrocardiogram (ECG) recordings from sick infants are highly non-stationary, with recurrent changes in the baseline, alterations in the morphology of the waveform, and attenuations of the signal strength. Current methods lack reliability in identifying QRS complexes (a marker of individual cardiac cycles) in the non-stationary ECG. In the current study we address this problem by proposing a novel approach to QRS complex identification. Our approach employs lowpass filtering, half-wave rectification, and the use of instantaneous Hilbert phase to identify QRS complexes in the ECG. We demonstrate the application of this method using ECG recordings from eight preterm infants undergoing intensive care, as well as from 18 normal adult volunteers available via a public database. We compared our approach to the commonly used approaches, including Pan and Tompkins (PT), gqrs, wavedet, and wqrs, for identifying QRS complexes and then compared each with manually identified QRS complexes. For preterm infants, a comparison between the QRS complexes identified by our approach and those identified through manual annotations yielded sensitivity and positive predictive values of 99% and 99.91%, respectively. The comparison metrics for each method are as follows: PT (sensitivity: 84.49%, positive predictive value: 99.88%), gqrs (85.25%, 99.49%), wavedet (95.24%, 99.86%), and wqrs (96.99%, 96.55%). Thus, the sensitivity values of the four methods previously described are lower than the sensitivity of the method we propose; however, the positive predictive values of these other approaches are comparable to those of our method, with the exception of the wqrs approach, which yielded a slightly lower value. For adult ECG, our approach yielded a sensitivity of 99.78%, whereas PT yielded 99.79%. The positive predictive value was 99.42% for both our approach as well as for PT. 
We propose a novel method for identifying QRS complexes that outperforms common currently available tools for non-stationary ECG data in infants. For stationary ECG our proposed approach and the PT approach perform equally well. The ECG acquired in a clinical environment may be prone to issues related to non-stationarity, especially in critically ill patients. The approach proposed in this report offers superior reliability in these scenarios. Copyright © 2017 Elsevier Ltd. All rights reserved.
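The pipeline named in the abstract (lowpass filtering, half-wave rectification, instantaneous Hilbert phase) can be sketched in a few lines. This is a minimal illustration of the general idea, not the authors' implementation; the filter order, cutoff, and phase-wrap detection rule are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_qrs(ecg, fs, cutoff=20.0):
    """Sketch of a Hilbert-phase beat detector (illustrative parameters).

    Steps: zero-phase lowpass filter, half-wave rectification, then
    locate beats where the instantaneous Hilbert phase wraps from
    +pi to -pi (roughly one wrap per cardiac cycle on a clean signal).
    """
    b, a = butter(4, cutoff / (fs / 2.0), btype="low")
    filtered = filtfilt(b, a, np.asarray(ecg, dtype=float))
    rectified = np.maximum(filtered, 0.0)          # half-wave rectification
    phase = np.angle(hilbert(rectified - rectified.mean()))
    return np.where(np.diff(phase) < -np.pi)[0]    # sample indices of wraps
```

On a clean periodic signal the detector returns about one index per cycle; handling real neonatal ECG would require the additional robustness the paper evaluates.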
NASA Astrophysics Data System (ADS)
Zittersteijn, M.; Vananti, A.; Schildknecht, T.; Dolado Perez, J. C.; Martinot, V.
2016-11-01
Currently, several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). The MTT problem quickly becomes an NP-hard combinatorial optimization problem, meaning that the effort required to solve it increases exponentially with the number of tracked objects. In an attempt to find an approximate solution of sufficient quality, several Population-Based Meta-Heuristic (PBMH) algorithms are implemented and tested on simulated optical measurements. These first results show that one of the tested algorithms, namely the Elitist Genetic Algorithm (EGA), consistently displays the desired behavior of finding good approximate solutions before reaching the optimum. The results further suggest that the algorithm possesses a polynomial time complexity, as the computation times are consistent with a polynomial model. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the association and orbit determination problems simultaneously and is able to efficiently process large data sets with minimal manual intervention.
Guidance for modeling causes and effects in environmental problem solving
Armour, Carl L.; Williamson, Samuel C.
1988-01-01
Environmental problems are difficult to solve because their causes and effects are not easily understood. When attempts are made to analyze causes and effects, the principal challenge is organizing information into a framework that is logical, technically defensible, and easy to understand and communicate. When decisionmakers attempt to solve complex problems before an adequate cause-and-effect analysis is performed, there are serious risks: greater reliance on subjective reasoning, a lessened chance of scoping an effective problem-solving approach, impaired recognition of the need for supplemental information to attain understanding, an increased chance of making unsound decisions, and a lessened chance of gaining approval and financial support for a program. Cause-and-effect relationships can be modeled. This type of modeling has been applied to various environmental problems, including cumulative impact assessment (Dames and Moore 1981; Meehan and Weber 1985; Williamson et al. 1987; Raley et al. 1988) and evaluation of the effects of quarrying (Sheate 1986). This guidance for field users was written because of the current interest in documenting cause-effect logic as a part of ecological problem solving. Principal literature sources relating to the modeling approach are: Riggs and Inouye (1975a, b), Erickson (1981), and United States Office of Personnel Management (1986).
Enhanced algorithms for stochastic programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishna, Alamuru S.
1993-09-01
In this dissertation, we present some of the recent advances made in solving two-stage stochastic linear programming problems of large size and complexity. Decomposition and sampling are two fundamental components of techniques to solve stochastic optimization problems, and we describe improvements to current techniques in both areas. We studied different ways of using importance sampling in the context of stochastic programming by varying the choice of approximation functions used in the method. We concluded that approximating the recourse function by a computationally inexpensive piecewise-linear function is highly efficient: it reduces the problem from finding the mean of a computationally expensive function to finding that of a computationally inexpensive one. We then implemented various variance reduction techniques to estimate the mean of the piecewise-linear function. This method achieved similar variance reductions in orders of magnitude less time than applying variance-reduction techniques directly to the given problem. In solving a stochastic linear program, the expected-value problem is usually solved before the stochastic one, both to provide a starting point and to speed up the algorithm by making use of the information obtained from its solution. We have devised a new decomposition scheme to improve the convergence of this algorithm.
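The core idea, estimating the mean of an expensive recourse function through a cheap piecewise-linear approximation, can be sketched as a control variate in one dimension. The recourse stand-in f, the knot grid, and the standard-normal input distribution below are all illustrative assumptions:

```python
import numpy as np

def control_variate_mean(f, knots, samples):
    """Estimate E[f(X)] with a piecewise-linear surrogate g of f:
    E[f] = E[f - g] + E[g].  The residual f - g is estimated by Monte
    Carlo (small variance because g tracks f closely), while E[g] is
    computed once by cheap dense quadrature against the N(0,1) weight.
    Returns the estimate and the residual standard deviation.
    """
    g = lambda x: np.interp(x, knots, f(knots))   # cheap piecewise-linear surrogate
    grid = np.linspace(knots[0], knots[-1], 200001)
    w = np.exp(-grid ** 2 / 2.0)                  # unnormalized N(0,1) weight
    e_g = np.sum(g(grid) * w) / np.sum(w)         # E[g]; tails beyond knots ignored
    residual = f(samples) - g(samples)
    return residual.mean() + e_g, residual.std()
```

Because the Monte Carlo part now only has to estimate the small residual f - g, its variance is orders of magnitude below that of sampling f directly, which is the effect described in the abstract.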
A prediction model to forecast the cost impact from a break in the production schedule
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1977-01-01
The losses experienced after a break or stoppage in the sequence of a production cycle present an extremely complex situation involving numerous variables, some of uncertain quantity and quality. There are no discrete formulas to define the losses during a gap in production. The techniques employed are therefore based on a prediction or forecast of the losses that take place, given the conditions which exist in the production environment. Such parameters as learning curve slope, number of predecessor units, and length of time the production sequence is halted are used in formulating a prediction model. The pertinent current publications related to this subject are few in number, but are reviewed to provide an understanding of the problem. Example problems are illustrated together with appropriate trend curves to show the approach. Solved problems are also given to show the application of the models to actual cases of production breaks in the real world.
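The three ingredients the abstract names (learning-curve slope, number of predecessor units, and length of the break) can be combined in a minimal sketch. The linear "learning loss" rule below is a placeholder assumption for illustration, not the report's calibrated model.

```python
import math

def unit_cost(n, first_unit_cost, slope):
    # Crawford learning curve: cost of the n-th unit for a given slope
    # (e.g. slope = 0.85 for an 85% curve)
    b = math.log(slope) / math.log(2.0)
    return first_unit_cost * n ** b

def effective_units_after_break(predecessor_units, break_months, loss_per_month=0.04):
    # hypothetical rule: each month of stoppage forfeits a fixed fraction
    # of accumulated experience (an assumption, not the report's model)
    retained = max(0.0, 1.0 - loss_per_month * break_months)
    return predecessor_units * retained

def restart_cost(predecessor_units, break_months, first_unit_cost, slope):
    # cost of the first unit built after the break, at the regressed
    # position on the learning curve
    n_eff = effective_units_after_break(predecessor_units, break_months)
    return unit_cost(n_eff + 1.0, first_unit_cost, slope)
```

The cost impact of the break is then the gap between `restart_cost` and the cost the next unit would have had without the stoppage.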
Scout: high-performance heterogeneous computing made simple
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice
2011-01-26
Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.
NASA Technical Reports Server (NTRS)
Sartori, Michael A.; Passino, Kevin M.; Antsaklis, Panos J.
1992-01-01
In rule-based AI planning, expert, and learning systems, it is often the case that the left-hand-sides of the rules must be repeatedly compared to the contents of some 'working memory'. The traditional approach to solve such a 'match phase problem' for production systems is to use the Rete Match Algorithm. Here, a new technique using a multilayer perceptron, a particular artificial neural network model, is presented to solve the match phase problem for rule-based AI systems. A syntax for premise formulas (i.e., the left-hand-sides of the rules) is defined, and working memory is specified. From this, it is shown how to construct a multilayer perceptron that finds all of the rules which can be executed for the current situation in working memory. The complexity of the constructed multilayer perceptron is derived in terms of the maximum number of nodes and the required number of layers. A method for reducing the number of layers to at most three is also presented.
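A threshold-unit construction in the spirit of the abstract can be sketched for conjunctive premises: each rule becomes one unit whose activation reaches its threshold exactly when the premise holds. The fact encoding below is an assumption for illustration, not the paper's premise syntax.

```python
import numpy as np

# toy rule base over four working-memory facts [A, B, C, D]:
# +1 = fact required, -1 = fact forbidden, 0 = don't care
W = np.array([
    [1, 1,  0, 0],   # rule 0: A AND B
    [1, 0, -1, 0],   # rule 1: A AND NOT C
    [0, 0,  1, 1],   # rule 2: C AND D
])
theta = (W == 1).sum(axis=1)  # threshold = number of required facts per rule

def match(working_memory):
    # one layer of threshold units: a rule's activation reaches its
    # threshold exactly when all required facts are present (each adds +1)
    # and no forbidden fact is present (each would subtract 1)
    activation = W @ np.asarray(working_memory)
    return np.flatnonzero(activation >= theta).tolist()
```

For example, with working memory `[1, 1, 0, 0]` (A and B present) rules 0 and 1 fire; with `[0, 0, 1, 1]` only rule 2 fires. One such layer suffices for pure conjunctions; the paper's layered construction handles general premise formulas.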
Complex networks for data-driven medicine: the case of Class III dentoskeletal disharmony
NASA Astrophysics Data System (ADS)
Scala, A.; Auconi, P.; Scazzocchio, M.; Caldarelli, G.; McNamara, JA; Franchi, L.
2014-11-01
In the last decade, the availability of innovative algorithms derived from complexity theory has inspired the development of highly detailed models in various fields, including physics, biology, ecology, economy, and medicine. Due to the availability of novel and ever more sophisticated diagnostic procedures, all biomedical disciplines face the problem of using the increasing amount of information concerning each patient to improve diagnosis and prevention. In particular, in the discipline of orthodontics the current diagnostic approach based on clinical and radiographic data is problematic due to the complexity of craniofacial features and to the numerous interacting co-dependent skeletal and dentoalveolar components. In this study, we demonstrate the capability of computational methods such as network analysis and module detection to extract organizing principles in 70 patients with excessive mandibular skeletal protrusion with underbite, a condition known in orthodontics as Class III malocclusion. Our results could possibly constitute a template framework for organising the increasing amount of medical data available for patients’ diagnosis.
Increasingly automated procedure acquisition in dynamic systems
NASA Technical Reports Server (NTRS)
Mathe, Nathalie; Kedar, Smadar
1992-01-01
Procedures are widely used by operators for controlling complex dynamic systems. Currently, most development of such procedures is done manually, consuming a large amount of paper, time, and manpower in the process. While automated knowledge acquisition is an active field of research, not much attention has been paid to the problem of computer-assisted acquisition and refinement of complex procedures for dynamic systems. This paper presents the Procedure Acquisition for Reactive Control Assistant (PARC), which is designed to assist users in more systematically and automatically encoding and refining complex procedures. PARC is able to elicit knowledge interactively from the user during operation of the dynamic system. We categorize procedure refinement into two stages: diagnosis (diagnose the failure and choose a repair) and repair (plan and perform the repair). The basic approach taken in PARC is to assist the user in all steps of this process by providing increased levels of assistance with layered tools. We illustrate the operation of PARC in refining procedures for the control of a robot arm.
Snowden, Thomas J; van der Graaf, Piet H; Tindall, Marcus J
2017-07-01
Complex models of biochemical reaction systems have become increasingly common in the systems biology literature. The complexity of such models can present a number of obstacles for their practical use, often making problems difficult to intuit or computationally intractable. Methods of model reduction can be employed to alleviate the issue of complexity by seeking to eliminate those portions of a reaction network that have little or no effect upon the outcomes of interest, hence yielding simplified systems that retain an accurate predictive capacity. This review paper seeks to provide a brief overview of a range of such methods and their application in the context of biochemical reaction network models. To achieve this, we provide a brief mathematical account of the main methods including timescale exploitation approaches, reduction via sensitivity analysis, optimisation methods, lumping, and singular value decomposition-based approaches. Methods are reviewed in the context of large-scale systems biology type models, and future areas of research are briefly discussed.
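As a minimal illustration of the timescale-exploitation methods the review covers, the sketch below eliminates a fast intermediate from a linear chain A -> B -> C via a quasi-steady-state assumption; the rate constants are arbitrary assumptions, not drawn from any model in the review.

```python
import numpy as np

def full_model(k1, k2, a0=1.0, t_end=5.0, dt=1e-3):
    # explicit-Euler integration of the full chain A -> B -> C
    a, b, c = a0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a, b, c = (a - dt * k1 * a,
                   b + dt * (k1 * a - k2 * b),
                   c + dt * k2 * b)
    return c

def reduced_model(k1, a0=1.0, t_end=5.0):
    # quasi-steady-state reduction: when k2 >> k1 the intermediate B
    # equilibrates essentially instantly, leaving the one-step model
    # A -> C with the closed-form solution below
    return a0 * (1.0 - np.exp(-k1 * t_end))
```

With `k2 = 100 * k1` the reduced model reproduces the full model's output of C to within about 1%, while removing one state variable and the stiffness caused by the fast rate.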
A Complex Network Approach to Stylometry
Amancio, Diego Raphael
2015-01-01
Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems proved useful to create several language models. Despite the large amount of studies devoted to represent texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex networks methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents. PMID:26313921
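A minimal version of the networked text representation used in this line of work, the word-adjacency model, can be sketched as follows; the topological feature shown (mean degree) is only one of the many network properties such a study could feed into a classifier.

```python
import re
from collections import defaultdict

def adjacency_network(text):
    # word-adjacency network: nodes are word types, and an undirected
    # edge links two words that appear consecutively in the text
    words = re.findall(r"[a-z']+", text.lower())
    edges = defaultdict(set)
    for w1, w2 in zip(words, words[1:]):
        if w1 != w2:
            edges[w1].add(w2)
            edges[w2].add(w1)
    return edges

def mean_degree(edges):
    # a simple topological feature that can complement traditional
    # frequency-based stylometric features
    return sum(len(nbrs) for nbrs in edges.values()) / max(len(edges), 1)
```

Features like this describe how an author connects words rather than merely which words they use, which is the complementarity the paper exploits in its hybrid classifiers.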
Automated Approach to Very High-Order Aeroacoustic Computations. Revision
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Goodrich, John W.
2001-01-01
Computational aeroacoustics requires efficient, high-resolution simulation tools. For smooth problems, this is best accomplished with very high-order in space and time methods on small stencils. However, the complexity of highly accurate numerical methods can inhibit their practical application, especially in irregular geometries. This complexity is reduced by using a special form of Hermite divided-difference spatial interpolation on Cartesian grids, and a Cauchy-Kowalewski recursion procedure for time advancement. In addition, a stencil constraint tree reduces the complexity of interpolating grid points that are located near wall boundaries. These procedures are used to automatically develop and implement very high-order methods (order > 15) for solving the linearized Euler equations that can achieve less than one grid point per wavelength resolution away from boundaries by including spatial derivatives of the primitive variables at each grid point. The accuracy of stable surface treatments is currently limited to 11th order for grid-aligned boundaries and to 2nd order for irregular boundaries.
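For the model advection equation u_t = -c u_x, the Cauchy-Kowalewski procedure replaces each time derivative with a spatial one, d^k u/dt^k = (-c)^k d^k u/dx^k, and sums a temporal Taylor series from the spatial derivatives stored at a grid point. This is a one-equation sketch of the idea, not the paper's linearized-Euler implementation.

```python
def ck_advance(u_spatial_derivs, c, dt, order):
    # Cauchy-Kowalewski time advance for u_t = -c u_x:
    #   u(t + dt) = sum_k (dt^k / k!) * d^k u/dt^k
    #             = sum_k ((-c*dt)^k / k!) * d^k u/dx^k
    # u_spatial_derivs[k] holds the k-th spatial derivative at the point.
    u_new, factorial = 0.0, 1.0
    for k in range(order + 1):
        if k > 0:
            factorial *= k
        u_new += (-c * dt) ** k * u_spatial_derivs[k] / factorial
    return u_new
```

Because each extra spatial derivative buys one more term of the time series, the temporal order tracks the spatial order automatically, which is what lets such schemes reach very high order without multi-stage time stepping.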
Nielsen, H Bjørn; Almeida, Mathieu; Juncker, Agnieszka Sierakowska; Rasmussen, Simon; Li, Junhua; Sunagawa, Shinichi; Plichta, Damian R; Gautier, Laurent; Pedersen, Anders G; Le Chatelier, Emmanuelle; Pelletier, Eric; Bonde, Ida; Nielsen, Trine; Manichanh, Chaysavanh; Arumugam, Manimozhiyan; Batto, Jean-Michel; Quintanilha Dos Santos, Marcelo B; Blom, Nikolaj; Borruel, Natalia; Burgdorf, Kristoffer S; Boumezbeur, Fouad; Casellas, Francesc; Doré, Joël; Dworzynski, Piotr; Guarner, Francisco; Hansen, Torben; Hildebrand, Falk; Kaas, Rolf S; Kennedy, Sean; Kristiansen, Karsten; Kultima, Jens Roat; Léonard, Pierre; Levenez, Florence; Lund, Ole; Moumen, Bouziane; Le Paslier, Denis; Pons, Nicolas; Pedersen, Oluf; Prifti, Edi; Qin, Junjie; Raes, Jeroen; Sørensen, Søren; Tap, Julien; Tims, Sebastian; Ussery, David W; Yamada, Takuji; Renault, Pierre; Sicheritz-Ponten, Thomas; Bork, Peer; Wang, Jun; Brunak, Søren; Ehrlich, S Dusko
2014-08-01
Most current approaches for analyzing metagenomic data rely on comparisons to reference genomes, but the microbial diversity of many environments extends far beyond what is covered by reference databases. De novo segregation of complex metagenomic data into specific biological entities, such as particular bacterial strains or viruses, remains a largely unsolved problem. Here we present a method, based on binning co-abundant genes across a series of metagenomic samples, that enables comprehensive discovery of new microbial organisms, viruses and co-inherited genetic entities and aids assembly of microbial genomes without the need for reference sequences. We demonstrate the method on data from 396 human gut microbiome samples and identify 7,381 co-abundance gene groups (CAGs), including 741 metagenomic species (MGS). We use these to assemble 238 high-quality microbial genomes and identify affiliations between MGS and hundreds of viruses or genetic entities. Our method provides the means for comprehensive profiling of the diversity within complex metagenomic samples.
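The core co-abundance idea, grouping genes whose abundance profiles rise and fall together across many samples, can be sketched with a greedy correlation-based grouping. The 0.9 threshold and lowest-index seed rule below are simplifications for illustration, not the paper's canopy-clustering procedure.

```python
import numpy as np

def coabundance_bins(abundance, r_min=0.9):
    # abundance: genes x samples matrix of abundance profiles.
    # Greedy binning: the lowest-indexed unassigned gene seeds a bin,
    # and every unassigned gene whose profile correlates with the seed
    # at Pearson r >= r_min joins that bin.
    corr = np.corrcoef(abundance)
    unassigned = set(range(abundance.shape[0]))
    bins = []
    while unassigned:
        seed = min(unassigned)
        members = [g for g in sorted(unassigned) if corr[seed, g] >= r_min]
        bins.append(members)
        unassigned -= set(members)
    return bins
```

Genes from the same genome share an abundance profile across samples (the genome's abundance), so such bins approximate genomes or other co-inherited entities without any reference database.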
In Vitro/In Vivo Evaluation of Dexamethasone--PAMAM Dendrimer Complexes for Retinal Drug Delivery.
Yavuz, Burçin; Pehlivan, Sibel Bozdağ; Vural, İmran; Ünlü, Nurşen
2015-11-01
Current treatment options for diabetic retinopathy (DR) have side effects because of invasive application, and topical application does not generally result in therapeutic levels in the target tissue. Therefore, improving drug delivery to the retina following topical administration might be a solution to DR treatment problems. The purpose of this study was to investigate the complexation effects of poly(amidoamine) (PAMAM) dendrimers on ocular absorption of dexamethasone (DEX). Using different PAMAM generations, complex formulations were prepared and characterized. Formulations were evaluated in terms of cytotoxicity and cell permeability, as well as ex vivo transport across ocular tissues. The ocular pharmacokinetic properties of DEX formulations were studied in Sprague-Dawley rats following topical and subconjunctival applications, to evaluate the effect of PAMAM on retinal delivery of DEX. Methyl-thiazol-tetrazolium (MTT) assay indicated that all groups resulted in cell viability comparable to DEX solution (87.5%), with the cell viability being the lowest for the G3 complex at 73.5%. Transport study results showed that dendrimer complexation increases DEX transport across both cornea and sclera tissues. The in vivo results also indicated that the anionic DEX-PAMAM complex formulations in particular reached higher DEX concentrations in ocular tissues than the plain DEX suspension. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
Vaseem, Mohammad; McKerricher, Garret; Shamim, Atif
2016-01-13
Currently, silver-nanoparticle-based inkjet ink is commercially available. This type of ink has several serious problems such as a complex synthesis protocol, high cost, high sintering temperatures (∼200 °C), particle aggregation, nozzle clogging, poor shelf life, and jetting instability. For the emerging field of printed electronics, these shortcomings in conductive inks are barriers for their widespread use in practical applications. Formulating particle-free silver inks has potential to solve these issues and requires careful design of the silver complexation. The ink complex must meet various requirements, such as in situ reduction, optimum viscosity, storage and jetting stability, smooth uniform sintered films, excellent adhesion, and high conductivity. This study presents a robust formulation of silver-organo-complex (SOC) ink, where complexing molecules act as reducing agents. The 17 wt % silver loaded ink was printed and sintered on a wide range of substrates with uniform surface morphology and excellent adhesion. The jetting stability was monitored for 5 months to confirm that the ink was robust and highly stable with consistent jetting performance. Radio frequency inductors, which are highly sensitive to metal quality, were demonstrated as a proof of concept on flexible PEN substrate. This is a major step toward producing high-quality electronic components with a robust inkjet printing process.
NASA Astrophysics Data System (ADS)
Bonner, J.
2006-05-01
Differences in energy partitioning of seismic phases from earthquakes and explosions provide the opportunity for event identification. In this talk, I will briefly review teleseismic Ms:mb and P/S ratio techniques that help identify events based on differences in compressional, shear, and surface wave energy generation from explosions and earthquakes. With the push to identify smaller yield explosions, the identification process has become increasingly complex as varied types of explosions, including chemical, mining, and nuclear, must be identified at regional distances. Thus, I will highlight some of the current views and problems associated with the energy partitioning of seismic phases from single- and delay-fired chemical explosions. One problem yet to have a universally accepted answer is whether the explosion and earthquake populations, based on the Ms:mb discriminants, should be separated at smaller magnitudes. I will briefly describe the datasets and theory that support either converging or parallel behavior of these populations. Also, I will discuss improvements to the currently used methods that will better constrain this problem in the future. I will also discuss the role of regional P/S ratios in identifying explosions. In particular, recent datasets from South Africa, Scandinavia, and the Western United States collected from earthquakes, single-fired chemical explosions, and/or delay-fired mining explosions have provided new insight into regional P, S, Lg, and Rg energy partitioning. Data from co-located mining and chemical explosions suggest that some mining explosions may be used for limited calibration of regional discriminants in regions where no historic explosion data is available.
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Transformations of software design and code may lead to reduced errors
NASA Technical Reports Server (NTRS)
Connelly, E. M.
1983-01-01
The capability of programmers and non-programmers to specify problem solutions by developing example solutions, and of programmers additionally by writing computer programs, was investigated; each method of specification was carried out at various levels of problem complexity. The level of difficulty of each problem was reflected by the number of steps needed by the user to develop a solution. Machine processing of the user inputs permitted inferences to be developed about the algorithms required to solve a particular problem. The interactive feedback of processing results led users to a more precise definition of the desired solution. Two participant groups (programmers and bookkeepers/accountants) working with three levels of problem complexity and three levels of processor complexity were used. The experimental task required specification of a logic for solution of a Navy task force problem.
Hoskinson, A-M; Caballero, M D; Knight, J K
2013-06-01
If students are to successfully grapple with authentic, complex biological problems as scientists and citizens, they need practice solving such problems during their undergraduate years. Physics education researchers have investigated student problem solving for the past three decades. Although physics and biology problems differ in structure and content, the instructional purposes align closely: explaining patterns and processes in the natural world and making predictions about physical and biological systems. In this paper, we discuss how research-supported approaches developed by physics education researchers can be adopted by biologists to enhance student problem-solving skills. First, we compare the problems that biology students are typically asked to solve with authentic, complex problems. We then describe the development of research-validated physics curricula emphasizing process skills in problem solving. We show that solving authentic, complex biology problems requires many of the same skills that practicing physicists and biologists use in representing problems, seeking relationships, making predictions, and verifying or checking solutions. We assert that acquiring these skills can help biology students become competent problem solvers. Finally, we propose how biology scholars can apply lessons from physics education in their classrooms and inspire new studies in biology education research.
Analogy as a strategy for supporting complex problem solving under uncertainty.
Chan, Joel; Paletz, Susannah B F; Schunn, Christian D
2012-11-01
Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.
Toward the establishment of design guidelines for effective 3D perspective interfaces
NASA Astrophysics Data System (ADS)
Fitzhugh, Elisabeth; Dixon, Sharon; Aleva, Denise; Smith, Eric; Ghrayeb, Joseph; Douglas, Lisa
2009-05-01
The propagation of information operation technologies, with correspondingly vast amounts of complex network information to be conveyed, significantly impacts operator workload. Information management research is rife with efforts to develop schemes to aid operators to identify, review, organize, and retrieve the wealth of available data. Data may take on such distinct forms as intelligence libraries, logistics databases, operational environment models, or network topologies. Increased use of taxonomies and semantic technologies opens opportunities to employ network visualization as a display mechanism for diverse information aggregations. The broad applicability of network visualizations is still being tested, but in current usage, the complexity of densely populated abstract networks suggests the potential utility of 3D. Employment of 2.5D in network visualization, using classic perceptual cues, creates a 3D experience within a 2D medium. It is anticipated that use of 3D perspective (2.5D) will enhance user ability to visually inspect large, complex, multidimensional networks. Current research for 2.5D visualizations demonstrates that display attributes, including color, shape, size, lighting, atmospheric effects, and shadows, significantly impact operator experience. However, guidelines for utilization of attributes in display design are limited. This paper discusses pilot experimentation intended to identify potential problem areas arising from these cues and determine how best to optimize perceptual cue settings. Development of optimized design guidelines will ensure that future experiments, comparing network displays with other visualizations, are not confounded or impeded by suboptimal attribute characterization. Current experimentation is anticipated to support development of cost-effective, visually effective methods to implement 3D in military applications.
Multi-Criteria Approach in Multifunctional Building Design Process
NASA Astrophysics Data System (ADS)
Gerigk, Mateusz
2017-10-01
The paper presents a new approach to the multifunctional building design process and defines problems related to the design of complex multifunctional buildings. Contemporary urban areas are characterized by very intensive use of space. Today, buildings are built bigger and contain more diverse functions to meet the needs of a large number of users in one volume. These trends show the need to treat design objects as an organized structure that must meet current design criteria. The design process viewed as a complex system is a theoretical model, which is the basis for optimizing solutions over the entire life cycle of the building. From the concept phase through the exploitation phase to the disposal phase, multipurpose spaces should guarantee aesthetics, functionality, system efficiency, system safety, and environmental protection in the best possible way. The result of the analysis of the design process is presented as a theoretical model of the multifunctional structure. Expressing the multi-criteria model in the form of a Cartesian product makes it possible to create a holistic representation of the designed building as a graph model. The proposed network is the theoretical basis that can be used in the design process of complex engineering systems. The systematic multi-criteria approach makes it possible to maintain control over the entire design process and to provide the best possible performance. With respect to current design requirements, there are no established design rules for multifunctional buildings in relation to their operating phase. Enriching the basic criteria with a functional-flexibility criterion makes it possible to extend the exploitation phase, which brings advantages on many levels.
High-Accuracy, Compact Scanning Method and Circuit for Resistive Sensor Arrays.
Kim, Jong-Seok; Kwon, Dae-Yong; Choi, Byong-Deok
2016-01-26
The zero-potential scanning circuit is widely used as a read-out circuit for resistive sensor arrays because it removes a well-known problem: crosstalk current. Zero-potential scanning circuits can be divided into two groups based on the type of row driver. One type is a row driver using digital buffers. It can be easily implemented because of its simple structure, but we found that it can cause a large read-out error which originates from the on-resistance of the digital buffers used in the row driver. The other type is a row driver composed of operational amplifiers. It reads the sensor resistance very accurately, but it uses a large number of operational amplifiers to drive the rows of the sensor array; it therefore severely increases the power consumption, cost, and system complexity. To resolve the inaccuracy or high-complexity problems found in those previous circuits, we propose a new row driver which uses only one operational amplifier to drive all rows of a sensor array with high accuracy. Measurement results with the proposed circuit driving a 4 × 4 resistor array show that the maximum error is only 0.1%, remarkably reduced from the 30.7% of the previous counterpart.
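The buffer on-resistance error the authors identify can be seen with a simple voltage-divider approximation: the buffer's on-resistance sits in series with the selected sensor, so the read current is attenuated. This single-element model is an illustration of the mechanism, not the paper's full array analysis.

```python
def readout_error(r_sensor, r_on):
    # relative read-out error when the row driver's digital buffer adds
    # on-resistance r_on in series with the sensor being measured
    # (simplified single-element voltage-divider view, an assumption)
    return r_on / (r_sensor + r_on)
```

For example, a 900 ohm sensor read through a buffer with 100 ohm on-resistance already incurs a 10% error, which is why low-resistance sensors are the worst case for the digital-buffer driver.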
NASA Astrophysics Data System (ADS)
Vickers, Ken
2005-03-01
The education and training of the workforce needed to assure the global competitiveness of American industry in high-technology areas, along with the proper role of various disciplines in that educational process, is currently being re-examined. Several academic areas in science and engineering have reported results from such studies that revealed several broad themes of educational need spanning and crossing the boundaries of science and engineering. They included greater attention to and development of team-building skills, personal or interactive skills, creative ability, and a business or entrepreneurial wherewithal. We report in this paper the results of a fall 2000 Department of Education FIPSE grant to implement changes in a graduate physics program to address these issues. The proposal's goals were to produce next-generation physics graduate students trained to evaluate and overcome complex technical problems through participation in courses emphasizing the commercialization of technology research; to produce next-generation physics graduates who have learned to work with their student colleagues for their mutual success in an industrial-like group setting; and, finally, to produce graduates who can lead interdisciplinary groups in solving complex problems in their career field.