Science.gov

Sample records for analyzing complex problems

  1. Analyzing the many skills involved in solving complex physics problems

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Wieman, Carl E.

    2015-05-01

    We have empirically identified over 40 distinct sub-skills that affect a person's ability to solve complex problems in many different contexts. The identification of so many sub-skills explains why it has been so difficult to teach or assess problem solving as a single skill. The existence of these sub-skills is supported by several studies comparing a wide range of individuals' strengths and weaknesses in these sub-skills, their "problem solving fingerprint," while solving different types of problems including a classical mechanics problem, quantum mechanics problems, and a complex trip-planning problem with no physics. We see clear differences in the problem solving fingerprint of physics and engineering majors compared to the elementary education majors that we tested. The implications of these findings for guiding the teaching and assessing of problem solving in physics instruction are discussed.

  2. The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems

    ERIC Educational Resources Information Center

    Andrews, Paul W.; Thomson, J. Anderson, Jr.

    2009-01-01

    Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis, and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…

  3. Analyzing elastoplastic large deformation problems with the complex variable element-free Galerkin method

    NASA Astrophysics Data System (ADS)

    Li, D. M.; Liew, K. M.; Cheng, Y. M.

    2014-06-01

    Using the complex variable moving least-squares (CVMLS) approximation, a complex variable element-free Galerkin (CVEFG) method for two-dimensional elastoplastic large deformation problems is presented. This meshless method has higher computational precision and efficiency because, in the CVMLS approximation, the trial function of a two-dimensional problem is formed with a one-dimensional basis function. For two-dimensional elastoplastic large deformation problems, the Galerkin weak form is employed to obtain the equation system, and the penalty method is used to impose essential boundary conditions. The corresponding formulae of the CVEFG method for two-dimensional elastoplastic large deformation problems are then derived. In comparison with the conventional EFG method, our study shows that the CVEFG method has higher precision and efficiency. For illustration purposes, a few selected numerical examples are presented to demonstrate the advantages of the CVEFG method.
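
    The penalty treatment of essential boundary conditions mentioned here has a compact weak-form statement. The display below is a standard textbook form in generic notation, a sketch rather than the paper's exact formulation:

      % Galerkin weak form with a penalty term enforcing u = \bar{u} on \Gamma_u:
      \int_{\Omega} \delta\varepsilon^{T}\sigma \, d\Omega
        + \alpha \int_{\Gamma_{u}} \delta u^{T}\,(u - \bar{u}) \, d\Gamma
        = \int_{\Omega} \delta u^{T} b \, d\Omega
        + \int_{\Gamma_{t}} \delta u^{T}\,\bar{t} \, d\Gamma

    where alpha is a large penalty parameter, b the body force, and t-bar the prescribed traction on the natural boundary. Taking alpha large enforces the essential condition without enlarging the set of unknowns, which is why it pairs naturally with meshless trial functions.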

  4. The bright side of being blue: Depression as an adaptation for analyzing complex problems

    PubMed Central

    Andrews, Paul W.; Thomson, J. Anderson

    2009-01-01

    Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression. PMID:19618990

  5. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

    According to Complexity Theory, there is complexity in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory based on Axiomatic Design; however, they focus on reducing complexity, and none addresses a method for analyzing the complexity present in a system. This paper therefore puts forth a method of analyzing complexity that seeks to make up for this deficiency. To develop the method, which is based on additional effect, the paper introduces two concepts: ideal effect and additional effect. The method combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ) and helps designers analyze complexity through additional effects. A case study shows the application of the process.

  6. Analyzing Static Loading of Complex Structures

    NASA Technical Reports Server (NTRS)

    Gallear, D. C.

    1986-01-01

    Critical loading conditions are determined from analysis of each structural element. The Automated Thrust Structures Loads and Stresses (ATLAS) system is a series of programs developed to analyze elements of a complex structure under static-loading conditions. ATLAS calculates internal loads, beam-bending loads, column- and web-buckling loads, beam and panel stresses, and beam-corner stresses. The programs are written in FORTRAN IV and Assembler for batch execution.

  7. The Politics of Analyzing Social Problems.

    ERIC Educational Resources Information Center

    Ross, Robert; Staines, Graham L.

    Two crucial processes are discussed: (1) that through which social problems become public issues; and (2) that through which conflicts between competing diagnoses of, and responses to, publicly recognized social problems are resolved. Regularities in these transformations are conceptualized as follows: groups differ in their definitions of social…

  8. Analyzing and Detecting Problems in Systems of Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Ackermann, Christopher; Stratton, William C.; Sibol, Deane E.; Godfrey, Sally

    2008-01-01

    Many software systems are evolving into complex systems of systems (SoS) for which inter-system communication is mission-critical. Evidence indicates that transmission failures and performance issues are not uncommon occurrences. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach to addressing such problems. In this paper, we present an approach for analyzing inter-system communications with the goal of uncovering both transmission errors and performance problems. Our approach consists of a visualization component and an evaluation component. While the visualization of the observed communication aims to facilitate understanding, the evaluation component automatically checks the conformance of an observed communication (actual) to a desired one (planned). The actual and the planned communications are represented as sequence diagrams, and the evaluation algorithm checks the conformance of the actual to the planned diagram. We have applied our approach to the communication of aerospace systems and were successful in detecting and resolving even subtle and long-standing transmission problems.
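
    The conformance check described here can be caricatured in a few lines. The sketch below is not the SARP tool: the message names are hypothetical, and the comparison is a simple exact-order check rather than full sequence-diagram conformance:

      def check_conformance(planned, actual):
          """Compare an observed message trace against the planned one.
          An exact-order check; real sequence-diagram conformance is richer."""
          for i, (p, a) in enumerate(zip(planned, actual)):
              if p != a:
                  return f"mismatch at step {i}: planned {p}, observed {a}"
          if len(actual) < len(planned):
              return f"transmission gap: {len(planned) - len(actual)} planned messages unseen"
          if len(actual) > len(planned):
              return f"unexpected extra traffic after step {len(planned)}"
          return "conformant"

      # Hypothetical (sender, receiver, message) triples.
      planned = [("GND", "SAT", "CMD_UPLINK"), ("SAT", "GND", "ACK"),
                 ("SAT", "GND", "TELEMETRY")]
      actual = [("GND", "SAT", "CMD_UPLINK"), ("SAT", "GND", "TELEMETRY")]
      print(check_conformance(planned, actual))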

  9. Analyzing Complex Metabolomic Networks: Experiments and Simulation

    NASA Astrophysics Data System (ADS)

    Steuer, R.; Kurths, J.; Fiehn, O.; Weckwerth, W.

    2002-03-01

    In recent years, remarkable advances in molecular biology have enabled us to measure the behavior of the complex regulatory networks underlying biological systems. In particular, high-throughput techniques, such as gene expression arrays, allow fast acquisition of a large number of simultaneously measured variables. Similar to gene expression, the analysis of metabolomic datasets results in a huge number of metabolite co-regulations: metabolites are the end products of cellular regulatory processes, and their levels can be regarded as the ultimate response to genetic or environmental changes. In this presentation we focus on the topological description of such networks, using both experimental data and simulations. In particular, we discuss the possibility of deducing novel links between metabolites using concepts from (nonlinear) time series analysis and information theory.

  10. Software Analyzes Complex Systems in Real Time

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real-time and limited hardware environments, and to be utilized by non-expert-system applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Neptune Voyager Encounter. It is also finding applications outside of the Space Agency.

  11. Quantum Computing: Solving Complex Problems

    ScienceCinema

    DiVincenzo, David [IBM Watson Research Center]

    2009-09-01

    One of the motivating ideas of quantum computation was that there could be a new kind of machine that would solve hard problems in quantum mechanics. There has been significant progress towards the experimental realization of these machines (which I will review), but there are still many questions about how such a machine could solve computational problems of interest in quantum physics. New categorizations of the complexity of computational problems have now been invented to describe quantum simulation. The bad news is that some of these problems are believed to be intractable even on a quantum computer, falling into a quantum analog of the NP class. The good news is that there are many other new classifications of tractability that may apply to several situations of physical interest.

  12. Quantum Computing: Solving Complex Problems

    SciTech Connect

    DiVincenzo, David

    2007-04-12

    One of the motivating ideas of quantum computation was that there could be a new kind of machine that would solve hard problems in quantum mechanics. There has been significant progress towards the experimental realization of these machines (which I will review), but there are still many questions about how such a machine could solve computational problems of interest in quantum physics. New categorizations of the complexity of computational problems have now been invented to describe quantum simulation. The bad news is that some of these problems are believed to be intractable even on a quantum computer, falling into a quantum analog of the NP class. The good news is that there are many other new classifications of tractability that may apply to several situations of physical interest.

  14. Analyzing the Origins of Childhood Externalizing Behavioral Problems

    ERIC Educational Resources Information Center

    Barnes, J. C.; Boutwell, Brian B.; Beaver, Kevin M.; Gibson, Chris L.

    2013-01-01

    Drawing on a sample of twin children from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B; Snow et al., 2009), the current study analyzed 2 of the most prominent predictors of externalizing behavioral problems (EBP) in children: (a) parental use of spankings and (b) childhood self-regulation. A variety of statistical techniques were…

  15. Analyzing the origins of childhood externalizing behavioral problems.

    PubMed

    Barnes, J C; Boutwell, Brian B; Beaver, Kevin M; Gibson, Chris L

    2013-12-01

    Drawing on a sample of twin children from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B; Snow et al., 2009), the current study analyzed 2 of the most prominent predictors of externalizing behavioral problems (EBP) in children: (a) parental use of spankings and (b) childhood self-regulation. A variety of statistical techniques were employed, and, overall, the findings can be summarized into 2 points. First, the results show that the relationships among spanking, self-regulation, and EBP are highly nuanced in that multiple explanations for their intercorrelations appear to fit the data (e.g., bidirectional relationships and shared methods variance). Second, genetic influences accounted for variance in each variable (EBP, spankings received, self-regulation) and even explained a portion of the covariance among the different variables. Thus, research that does not consider genetic influences when analyzing these associations runs a risk of model misspecification. PMID:23477531

  16. Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map

    ERIC Educational Resources Information Center

    Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng

    2004-01-01

    This paper proposes a methodology to calculate both the difficulty of basic problems and the difficulty of solving a problem. The difficulty of a problem is calculated according to the process of constructing it, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…

  17. Analyzing Large Protein Complexes by Structural Mass Spectrometry

    PubMed Central

    Kirshenbaum, Noam; Michaelevski, Izhak; Sharon, Michal

    2010-01-01

    Living cells control and regulate their biological processes through the coordinated action of a large number of proteins that assemble themselves into an array of dynamic, multi-protein complexes [1]. To gain a mechanistic understanding of the various cellular processes, it is crucial to determine the structure of such protein complexes and reveal how their structural organization dictates their function. Many aspects of multi-protein complexes are, however, difficult to characterize, due to their heterogeneous nature, asymmetric structure, and dynamics. Therefore, new approaches are required for the study of the tertiary levels of protein organization. One of the emerging structural biology tools for analyzing macromolecular complexes is mass spectrometry (MS) [2-5]. This method yields information on the complex's protein composition, subunit stoichiometry, and structural topology. The power of MS derives from its high sensitivity and, as a consequence, low sample requirement, which enables examination of protein complexes expressed at endogenous levels. Another advantage is the speed of analysis, which allows monitoring of reactions in real time. Moreover, the technique can simultaneously measure the characteristics of separate populations co-existing in a mixture. Here, we describe a detailed protocol for the application of structural MS to the analysis of large protein assemblies. The procedure begins with the preparation of gold-coated capillaries for nanoflow electrospray ionization (nESI). It then continues with sample preparation, emphasizing buffer conditions which should be compatible with nESI on the one hand and maintain complexes intact on the other. We then explain, step by step, how to optimize the experimental conditions for high mass measurements and acquire MS and tandem MS spectra. Finally, we chart the data processing and analyses that follow. Rather than attempting to characterize every aspect of protein assemblies, this protocol…

  18. Analyzing patterns in experts' approaches to solving experimental problems

    NASA Astrophysics Data System (ADS)

    Čančula, Maja Poklinek; Planinšič, Gorazd; Etkina, Eugenia

    2015-04-01

    We report detailed observations of three pairs of expert scientists and a pair of advanced undergraduate students solving an experimental optics problem. Using a new method ("transition graphs") of visualizing sequences of logical steps, we were able to compare the groups and identify patterns that could not be found using previously existing methods. While the problem solving of undergraduates significantly differed from that of experts at the beginning of the process, it gradually became more similar to the expert problem solving. We mapped problem solving steps and their sequence to the elements of an approach to teaching and learning physics called Investigative Science Learning Environment (ISLE), and we speculate that the ISLE educational framework closely represents the actual work of physicists.
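
    The "transition graph" idea lends itself to a small illustration: code each logical step, then count step-to-step transitions. The step labels below are hypothetical stand-ins, not the authors' coding scheme:

      from collections import Counter

      # Hypothetical coding of one solver's sequence of logical steps.
      steps = ["observe", "hypothesize", "design_test", "observe", "hypothesize",
               "design_test", "evaluate", "hypothesize", "design_test", "evaluate"]

      # A transition graph is the weighted set of step-to-step transitions;
      # counting bigrams gives its edge weights.
      transitions = Counter(zip(steps, steps[1:]))
      for (a, b), w in transitions.most_common():
          print(f"{a} -> {b}: {w}")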

  19. Special Education Provision in Nigeria: Analyzing Contexts, Problems, and Prospects

    ERIC Educational Resources Information Center

    Obiakor, Festus E.; Offor, MaxMary Tabugbo

    2011-01-01

    Nigeria has made some efforts to educate all of its citizenry, including those with disabilities. And, it has struggled to make sure that programs are available to those who need them. However, its traditional, sociocultural, and educational problems have prevented some programmatic consistency and progress. As a result, the special education…

  20. Program for Analyzing Flows in a Complex Network

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
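
    The coupled conservation equations solved here lend themselves to a compact illustration of the Newton-Raphson step. The sketch below is not GFSSP code: the three-branch network, the square-law branch resistance, and all numbers are assumptions for illustration.

      import numpy as np

      # Two boundary pressures (Pa) and three branch conductances for an
      # in -> node1 -> node2 -> out pipe network; all values are illustrative.
      P_in, P_out = 2.0e5, 1.0e5
      C = np.array([1e-3, 1e-3, 2e-3])

      def branch_flow(c, dp):
          # Square-law resistance, a common simplification: Q = C*sign(dp)*sqrt(|dp|)
          return c * np.sign(dp) * np.sqrt(abs(dp))

      def residual(p):
          # Mass conservation at the two interior nodes: inflow - outflow = 0
          q1 = branch_flow(C[0], P_in - p[0])
          q2 = branch_flow(C[1], p[0] - p[1])
          q3 = branch_flow(C[2], p[1] - P_out)
          return np.array([q1 - q2, q2 - q3])

      p = np.array([1.8e5, 1.2e5])               # initial pressure guess
      for _ in range(50):
          r = residual(p)
          if np.max(np.abs(r)) < 1e-8:
              break
          J = np.empty((2, 2))                   # finite-difference Jacobian
          for j in range(2):
              step = np.zeros(2); step[j] = 1.0  # 1 Pa perturbation
              J[:, j] = residual(p + step) - r
          p -= np.linalg.solve(J, r)             # Newton-Raphson update
      print("interior node pressures (Pa):", p)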

  1. Complex Problem Solving--More than Reasoning?

    ERIC Educational Resources Information Center

    Wustenberg, Sascha; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This study investigates the internal structure and construct validity of Complex Problem Solving (CPS), which is measured by a "Multiple-Item-Approach." It is tested, if (a) three facets of CPS--"rule identification" (adequateness of strategies), "rule knowledge" (generated knowledge) and "rule application" (ability to control a system)--can be…

  2. Selecting model complexity in learning problems

    SciTech Connect

    Buescher, K.L.; Kumar, P.R.

    1993-10-01

    To learn (or generalize) from noisy data, one must resist the temptation to pick a model for the underlying process that overfits the data. Many existing techniques solve this problem at the expense of requiring the evaluation of an absolute, a priori measure of each model's complexity. We present a method that does not. Instead, it uses a natural, relative measure of each model's complexity. This method first creates a pool of "simple" candidate models using part of the data and then selects from among these by using the rest of the data.
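
    A minimal sketch of the split-data idea the abstract describes; the polynomial model pool and the even/odd split are illustrative assumptions, not the authors' algorithm:

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.linspace(0.0, 1.0, 40)
      y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

      # Part of the data builds a pool of "simple" candidate models...
      fit_idx, sel_idx = np.arange(0, 40, 2), np.arange(1, 40, 2)
      pool = [np.polyfit(x[fit_idx], y[fit_idx], deg) for deg in range(1, 10)]

      # ...and the held-out part selects among them, discouraging overfitting.
      errors = [np.mean((np.polyval(c, x[sel_idx]) - y[sel_idx]) ** 2) for c in pool]
      best = pool[int(np.argmin(errors))]
      print("selected polynomial degree:", len(best) - 1)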

  3. Refined scale-dependent permutation entropy to analyze systems complexity

    NASA Astrophysics Data System (ADS)

    Wu, Shuen-De; Wu, Chiu-Wen; Humeau-Heurtier, Anne

    2016-05-01

    Multiscale entropy (MSE) has become a prevailing method to quantify the complexity of systems. Unfortunately, MSE has a temporal complexity in O(N^2), which is unrealistic for long time series. Moreover, MSE relies on the sample entropy computation, which is length-dependent and leads to large variance and possibly undefined entropy values for short time series. Here, we introduce a new multiscale complexity measure, the refined scale-dependent permutation entropy (RSDPE). Through the processing of different kinds of synthetic data and real signals, we show that RSDPE behaves much like MSE. Furthermore, RSDPE has a temporal complexity in O(N). Finally, RSDPE has the advantage of being much less length-dependent than MSE. From all this, we conclude that RSDPE outperforms MSE in terms of computational cost and computational accuracy.
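
    RSDPE builds on ordinal-pattern (permutation) entropy, which is easy to state in code. The sketch below is plain normalized permutation entropy, the O(N) building block, not the refined scale-dependent variant itself; the order m and delay tau are conventional parameter choices:

      import math
      from collections import Counter

      import numpy as np

      def permutation_entropy(x, m=3, tau=1):
          """Normalized permutation entropy of a 1-D series (order m, delay tau)."""
          # Count ordinal patterns of m points spaced tau apart -- one O(N) pass.
          patterns = Counter(
              tuple(np.argsort(x[i:i + m * tau:tau]))
              for i in range(len(x) - (m - 1) * tau)
          )
          total = sum(patterns.values())
          h = -sum((c / total) * math.log(c / total) for c in patterns.values())
          return h / math.log(math.factorial(m))   # scale into [0, 1]

      rng = np.random.default_rng(1)
      print(permutation_entropy(rng.standard_normal(1000)))                 # near 1
      print(permutation_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))  # much lower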

  4. Analyzing Complex and Structured Data via Unsupervised Learning Techniques

    NASA Astrophysics Data System (ADS)

    Polsterer, Kai Lars; Gieseke, Fabian; Gianniotis, Nikos; Kügler, Dennis

    2015-08-01

    In recent decades, more and more dedicated all-sky surveys have created an enormous amount of data which is publicly available on the internet. The resulting datasets contain spatial, spectral, and temporal information which exhibits complex structures in their respective domains. The capability to deal with morphological features, spectral signatures, and complex time-series data has become very important but is still a challenging task. A common approach when processing this kind of structured data is to extract representative features and use those for further analysis. We present unsupervised learning approaches that help to visualize and cluster these complex datasets by, e.g., deriving rotation- and translation-invariant prototypes or capturing the latent dynamics of time series without employing features, using echo state networks instead.

  5. New Approach to Analyzing Physics Problems: A Taxonomy of Introductory Physics Problems

    ERIC Educational Resources Information Center

    Teodorescu, Raluca E.; Bennhold, Cornelius; Feldman, Gerald; Medsker, Larry

    2013-01-01

    This paper describes research on a classification of physics problems in the context of introductory physics courses. This classification, called the Taxonomy of Introductory Physics Problems (TIPP), relates physics problems to the cognitive processes required to solve them. TIPP was created in order to design educational objectives, to develop…

  6. Fractal applications to complex crustal problems

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1989-01-01

    Complex scale-invariant problems obey fractal statistics. The basic definition of a fractal distribution is that the number of objects with a characteristic linear dimension greater than r satisfies the relation N ~ r^(-D), where D is the fractal dimension. Fragmentation often satisfies this relation, as does the distribution of earthquakes. The classic relationship between the length of a rocky coastline and the step length can be derived from this relation. Power-law relations for spectra can also be related to fractal dimensions; topography and gravity are examples. Spectral techniques can be used to obtain maps of fractal dimension and roughness amplitude, which provide a quantitative measure for texture analysis. It is argued that the distribution of stress and strength in a complex crustal region, such as the Alps, is fractal. Based on this assumption, the observed frequency-magnitude relation for the seismicity in the region can be derived.
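
    The defining relation N(>r) ~ r^(-D) can be checked numerically with a log-log fit. The sketch below uses synthetic power-law "fragment sizes" with a known exponent as an assumption; nothing here comes from the paper's data:

      import numpy as np

      # Synthetic sizes from a power law with D = 1.6 (illustrative only);
      # adding 1 to a Lomax sample gives P(r > t) = t^(-D) for t >= 1.
      D_true = 1.6
      rng = np.random.default_rng(2)
      r = rng.pareto(D_true, 10000) + 1.0

      # N(> r) ~ r^(-D): count objects above each threshold, fit in log-log space.
      thresholds = np.logspace(0.05, 1.0, 20)
      counts = np.array([(r > t).sum() for t in thresholds])
      slope, _ = np.polyfit(np.log(thresholds), np.log(counts), 1)
      print("estimated fractal dimension:", -slope)   # should be close to 1.6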

  7. Complex energies and the polyelectronic Stark problem

    NASA Astrophysics Data System (ADS)

    Themelis, Spyros I.; Nicolaides, Cleanthes A.

    2000-12-01

    The problem of computing the energy shifts and widths of ground or excited N-electron atomic states perturbed by weak or strong static electric fields is dealt with by formulating a state-specific complex eigenvalue Schrödinger equation (CESE), where the complex energy contains the field-induced shift and width. The CESE is solved to all orders nonperturbatively, by using separately optimized N-electron function spaces, composed of real and complex one-electron functions, the latter being functions of a complex coordinate. The use of such spaces is a salient characteristic of the theory, leading to economy and manageability of calculation in terms of a two-step computational procedure. The first step involves only Hermitian matrices. The second adds complex functions and the overall computation becomes non-Hermitian. Aspects of the formalism and of computational strategy are compared with those of the complex absorption potential (CAP) method, which was recently applied for the calculation of field-induced complex energies in H and Li. Also compared are the numerical results of the two methods, and the questions of accuracy and convergence that were posed by Sahoo and Ho (Sahoo S and Ho Y K 2000 J. Phys. B: At. Mol. Opt. Phys. 33 2195) are explored further. We draw attention to the fact that, because in the region where the field strength is weak the tunnelling rate (imaginary part of the complex eigenvalue) diminishes exponentially, it is possible for even large-scale nonperturbative complex eigenvalue calculations either to fail completely or to produce seemingly stable results which, however, are wrong. It is in this context that the discrepancy in the width of Li 1s22s 2S between results obtained by the CAP method and those obtained by the CESE method is interpreted. We suggest that the very-weak-field regime must be computed by the golden rule, provided the continuum is represented accurately. In this respect, existing one-particle semiclassical formulae seem

  8. MatOFF: A Tool For Analyzing Behaviorally-Complex Neurophysiological Experiments

    PubMed Central

    Genovesio, Aldo; Mitz, Andrew R.

    2007-01-01

    The simple operant conditioning originally used in behavioral neurophysiology 30 years ago has given way to complex and sophisticated behavioral paradigms; so much so that early general-purpose programs for analyzing neurophysiological data are ill-suited for complex experiments. The trend has been to develop custom software for each class of experiment, but custom software can have serious drawbacks. We describe here a general-purpose software tool for behavioral and electrophysiological studies, called MatOFF, that is especially suited for processing neurophysiological data gathered during the execution of complex behaviors. Written in the MATLAB programming language, MatOFF solves the problem of handling complex analysis requirements in a unique and powerful way. While other neurophysiological programs are either a loose collection of tools or append MATLAB as a post-processing step, MatOFF is an integrated environment that supports MATLAB scripting within the event search engine, safely isolated in a programming sandbox. The results from scripting are stored separately, but in parallel with the raw data, and are thus available to all subsequent MatOFF analysis and display processing. An example from a recently published experiment shows how all the features of MatOFF work together to analyze complex experiments and mine neurophysiological data in efficient ways. PMID:17604115

  9. System and method for modeling and analyzing complex scenarios

    DOEpatents

    Shevitz, Daniel Wolf

    2013-04-09

    An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of said nodes and tracks the state of at least one variable through a branch. When a variable of said branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.
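
    A toy reading of that description in code: walk the tree depth-first, update the tracked variable at each node, and abandon a branch the moment the variable leaves tolerance. The node format, tolerance window, and all values are illustrative assumptions, not the patented implementation:

      def solve(node, value, path=()):
          """Depth-first walk of a possibility tree; a branch is pruned the
          moment the tracked variable leaves the [0, 10] tolerance window."""
          value = value + node["effect"]       # execute the node's "programming"
          path = path + (node["name"],)
          if not (0 <= value <= 10):           # out of tolerance: invalid branch
              return []
          if not node.get("children"):
              return [path]                    # a valid tree solution
          valid = []
          for child in node["children"]:
              valid.extend(solve(child, value, path))
          return valid

      tree = {"name": "root", "effect": 5, "children": [
          {"name": "A", "effect": 7},                  # pruned: 12 > 10
          {"name": "B", "effect": -2, "children": [
              {"name": "B1", "effect": 4},             # valid: 7
              {"name": "B2", "effect": -9},            # pruned: -6 < 0
          ]},
      ]}
      print(solve(tree, 0))   # -> [('root', 'B', 'B1')]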

  10. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should be always conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  11. Network-Thinking: Graphs to Analyze Microbial Complexity and Evolution

    PubMed Central

    Corel, Eduardo; Lopez, Philippe; Méheust, Raphaël; Bapteste, Eric

    2016-01-01

    The tree model and tree-based methods have played a major, fruitful role in evolutionary studies. However, with the increasing realization of the quantitative and qualitative importance of reticulate evolutionary processes, affecting all levels of biological organization, complementary network-based models and methods are now flourishing, inviting evolutionary biology to experience a network-thinking era. We show how relatively recent comers in this field of study, that is, sequence-similarity networks, genome networks, and gene families–genomes bipartite graphs, already allow for a significantly enhanced usage of molecular datasets in comparative studies. Analyses of these networks provide tools for tackling a multitude of complex phenomena, including the evolution of gene transfer, composite genes and genomes, evolutionary transitions, and holobionts. PMID:26774999

  12. Analyzing complex networks through correlations in centrality measurements

    NASA Astrophysics Data System (ADS)

    Furlan Ronqui, José Ricardo; Travieso, Gonzalo

    2015-05-01

    Many real world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example we show also that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network.
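
    A sketch of computing such a centrality correlation profile on a model network. Spearman rank correlation is one reasonable choice of coefficient, and the set of centralities and the Barabasi-Albert test network are assumptions for illustration:

      from itertools import combinations

      import networkx as nx
      from scipy.stats import spearmanr

      G = nx.barabasi_albert_graph(500, 3, seed=42)   # scale-free model network
      centralities = {
          "degree": nx.degree_centrality(G),
          "betweenness": nx.betweenness_centrality(G),
          "closeness": nx.closeness_centrality(G),
          "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
      }

      # The profile is the vector of rank correlations over all centrality pairs.
      nodes = list(G)
      for a, b in combinations(centralities, 2):
          rho = spearmanr([centralities[a][v] for v in nodes],
                          [centralities[b][v] for v in nodes]).correlation
          print(f"{a} vs {b}: {rho:.3f}")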

  13. Analyzing complex gaze behavior in the natural world

    NASA Astrophysics Data System (ADS)

    Pelz, Jeff B.; Kinsman, Thomas B.; Evans, Karen M.

    2011-03-01

    The history of eye-movement research extends back at least to 1794, when Erasmus Darwin (Charles' grandfather) published Zoonomia, including descriptions of eye movements due to self-motion. But research on eye movements was restricted to the laboratory for 200 years, until Michael Land built the first wearable eyetracker at the University of Sussex and published the seminal paper "Where we look when we steer" [1]. In the intervening centuries, we learned a tremendous amount about the mechanics of the oculomotor system and how it responds to isolated stimuli, but virtually nothing about how we actually use our eyes to explore, gather information, navigate, and communicate in the real world. Inspired by Land's work, we have been working to extend knowledge in these areas by developing hardware, algorithms, and software that have allowed researchers to ask questions about how we actually use vision in the real world. Central to that effort are new methods for analyzing the volumes of data that come from the experiments made possible by the new systems. We describe a number of recent experiments and SemantiCode, a new program that supports assisted coding of eye-movement data collected in unrestricted environments.

  14. Analyzing Problems in Schools and School Systems: A Theoretical Approach. Topics in Educational Leadership.

    ERIC Educational Resources Information Center

    Gaynor, Alan Kibbe

    This book is directed toward students in organizational-theory and problem-analysis classes and their professors, as well as school administrators seeking to examine their problems and policies from new perspectives. It explains and illustrates methodology for describing, documenting, and analyzing organizational problems. Part I, "Methodology,"…

  15. Hybrid techniques for complex aerospace electromagnetics problems

    NASA Technical Reports Server (NTRS)

    Aberle, Jim

    1993-01-01

    Important aerospace electromagnetics problems include the evaluation of antenna performance on aircraft and the prediction and control of the aircraft's electromagnetic signature. Due to the ever increasing complexity and expense of aircraft design, aerospace engineers have become increasingly dependent on computer solutions. Traditionally, computational electromagnetics (CEM) has relied primarily on four disparate techniques: the method of moments (MoM), the finite-difference time-domain (FDTD) technique, the finite element method (FEM), and high frequency asymptotic techniques (HFAT) such as ray tracing. Each of these techniques has distinct advantages and disadvantages, and no single technique is capable of accurately solving all problems of interest on computers that are available now or will be available in the foreseeable future. As a result, new approaches that overcome the deficiencies of traditional techniques are beginning to attract a great deal of interest in the CEM community. Among these new approaches are hybrid methods which combine two or more of these techniques into a coherent model. During the ASEE Summer Faculty Fellowship Program a hybrid FEM/MoM computer code was developed and applied to a geometry containing features found on many modern aircraft.

  16. The Guarding Problem - Complexity and Approximation

    NASA Astrophysics Data System (ADS)

    Reddy, T. V. Thirumala; Krishna, D. Sai; Rangan, C. Pandu

    Let G = (V, E) be the given graph and let G_R = (V_R, E_R) and G_C = (V_C, E_C) be subgraphs of G such that V_R ∩ V_C = ∅ and V_R ∪ V_C = V. G_C is referred to as the cops' region and G_R as the robber's region. Initially a robber is placed at some vertex of V_R and the cops are placed at some vertices of V_C. The robber and the cops may move from their current vertices to one of their neighbours; while a cop can move only within the cops' region, the robber may move to any neighbour. The robber and cops move alternately. A vertex v ∈ V_C is said to be attacked if the current turn is the robber's, the robber is at a vertex u with u ∈ V_R and (u, v) ∈ E, and no cop is present at v. The guarding problem is to find the minimum number of cops required to guard G_C from the robber's attack. We first prove that the decision version of this problem when G_R is an arbitrary undirected graph is PSPACE-hard. We also prove that the decision version of the guarding problem when G_R is a wheel graph is NP-hard. We then present approximation algorithms for the cases where G_R is a star graph, a clique, or a wheel graph, with approximation ratios H(n_1), 2H(n_1), and H(n_1) + 3/2 respectively, where H(n_1) = 1 + 1/2 + ... + 1/n_1 and n_1 = |V_R|.

  17. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    NASA Technical Reports Server (NTRS)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of NASTRAN Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  18. Decomposing a complex design problem using CLIPS

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1990-01-01

    Many engineering systems are large and multidisciplinary. Before the design of such complex systems can begin, much time and money are invested in determining the possible couplings among the participating subsystems and their parts. For designs based on existing concepts, like commercial aircraft design, the subsystems and their couplings are usually well-established. However, for designs based on novel concepts, like large space platforms, the determination of the subsystems, couplings, and participating disciplines is an important task. Moreover, this task must be repeated as new information becomes available or as the design specifications change. Determining the subsystems is not an easy, straightforward process and often important couplings are overlooked. The design manager must know how to divide the design work among the design teams so that changes in one subsystem will have predictable effects on other subsystems. The resulting subsystems must be ordered into a hierarchical structure before the planning documents and milestones of the design project are set. The success of a design project often depends on the wise choice of design variables, constraints, objective functions, and the partitioning of these among the design teams. Very few tools are available to aid the design manager in determining the hierarchical structure of a design problem and assist in making these decisions.

  19. Problems and solutions in analyzing partial-reflection drift data by correlation techniques

    NASA Technical Reports Server (NTRS)

    Meek, C. E.

    1984-01-01

    Problems and solutions in analyzing partial-reflection drift data by correlation techniques are discussed. The problem of analyzing spaced antenna drift data breaks down into the general categories of raw data collection and storage, correlation calculation, interpretation of correlations, location of time lags for peak correlation, and velocity calculation.

  20. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science, and only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to obtain predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases, and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground-state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  1. Team-Based Complex Problem Solving: A Collective Cognition Perspective

    ERIC Educational Resources Information Center

    Hung, Woei

    2013-01-01

    Today, much problem solving is performed by teams, rather than individuals. The complexity of these problems has exceeded the cognitive capacity of any individual and requires a team of members to solve them. The success of solving these complex problems not only relies on individual team members who possess different but complementary expertise,…

  2. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    ERIC Educational Resources Information Center

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  3. Managing Complex Problems in Rangeland Ecosystems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Management of rangelands, and natural resources in general, has become increasingly complex. There is an atmosphere of increasing expectations for conservation efforts associated with a variety of issues from water quality to endangered species. We argue that many current issues are complex by their...

  4. Complex partial status epilepticus: a recurrent problem.

    PubMed Central

    Cockerell, O C; Walker, M C; Sander, J W; Shorvon, S D

    1994-01-01

    Twenty patients with complex partial status epilepticus were identified retrospectively from a specialist neurology hospital. Seventeen patients experienced recurrent episodes of complex partial status epilepticus, often occurring at regular intervals, usually over many years, and while being treated with effective anti-epileptic drugs. No unifying cause for the recurrences, and no common epilepsy aetiologies, were identified. In spite of the frequency of recurrence and length of history, none of the patients showed any marked evidence of cognitive or neurological deterioration. Complex partial status epilepticus is more common than is generally recognised, should be differentiated from other forms of non-convulsive status, and is often difficult to treat. PMID:8021671

  5. Organizational Structure and Complex Problem Solving

    ERIC Educational Resources Information Center

    Becker, Selwyn W.; Baloff, Nicholas

    1969-01-01

    The problem-solving efficiency of different organization structures is discussed in relation to task requirements and the appropriate organizational behavior, to group adaptation to a task over time, and to various group characteristics. (LN)

  6. Solving the Inverse-Square Problem with Complex Variables

    ERIC Educational Resources Information Center

    Gauthier, N.

    2005-01-01

    The equation of motion for a mass that moves under the influence of a central, inverse-square force is formulated and solved as a problem in complex variables. To find the solution, the constancy of angular momentum is first established using complex variables. Next, the complex position coordinate and complex velocity of the particle are assumed…
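
    The first two steps the abstract mentions, writing the equation of motion in complex form and establishing constancy of angular momentum, compress nicely into LaTeX. Signs and symbols below follow standard conventions and are a sketch, not the paper's own derivation:

      % Attractive inverse-square central force, position as complex coordinate:
      m\ddot{z} = -\frac{k\,z}{|z|^{3}}, \qquad z = r e^{i\theta}
      % Multiply by \bar{z} and take imaginary parts; \dot{\bar{z}}\dot{z} = |\dot{z}|^2 is real, so
      \frac{d}{dt}\,\mathrm{Im}(\bar{z}\dot{z})
        = \mathrm{Im}(\bar{z}\ddot{z})
        = -\frac{k}{|z|^{3}}\,\mathrm{Im}(\bar{z}z) = 0

    Hence L = m Im(z-bar z-dot) = m r^2 (d theta/dt) is a constant of the motion, the angular-momentum statement from which the complex position and velocity are then developed.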

  7. Building an information model (with the help of PSL/PSA). [Problem Statement Language/Problem Statement Analyzer

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Farny, A. M.

    1983-01-01

    Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications were once a one-step process in which product system information was immediately translated into PSL statements, but experience has shown that this can result in inconsistent representations. These shortcomings prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for mutual understanding of customer terminology and for the formal, conceptual representation of the product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.

  8. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…
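
    The modeling step described is standard logistic regression over per-student log features. A minimal sketch; the two features and all numbers are hypothetical, not drawn from the Andes logs:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Hypothetical per-problem log features: [hints requested, minutes on task].
      X = np.array([[0, 3], [1, 5], [4, 18], [0, 2], [3, 15], [5, 22], [1, 4], [2, 9]])
      y = np.array([1, 1, 0, 1, 0, 0, 1, 0])   # 1 = solved, 0 = failed

      model = LogisticRegression().fit(X, y)
      print(model.predict_proba([[2, 10]])[0, 1])   # estimated success probability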

  9. Analyzing the Responses of 7-8 Year Olds When Solving Partitioning Problems

    ERIC Educational Resources Information Center

    Badillo, Edelmira; Font, Vicenç; Edo, Mequè

    2015-01-01

    We analyze the mathematical solutions of 7- to 8-year-old pupils while individually solving an arithmetic problem. The analysis was based on the "configuration of objects," an instrument derived from the onto-semiotic approach to mathematical knowledge. Results are illustrated through a number of cases. From the analysis of mathematical…

  10. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    NASA Astrophysics Data System (ADS)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

    The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influences the risk that the failure of a single institution will trigger a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposures) in the derivatives market between 202 financial intermediaries based in the USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital), but it is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.
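
    The cascade mechanism studied here is easy to caricature: a bank fails when losses from failed counterparties exceed its capital buffer. Everything below (random topology, uniform exposures and capital) is an illustrative assumption, not the OTC dataset:

      import networkx as nx

      # Toy default cascade on a random directed exposure network.
      G = nx.gnp_random_graph(50, 0.08, seed=3, directed=True)
      capital = 0.25    # loss a bank can absorb before failing
      exposure = 0.3    # loss taken when a direct borrower fails

      failed = {0}      # a single seed default
      changed = True
      while changed:
          changed = False
          for bank in G:
              if bank in failed:
                  continue
              # Losses accumulate from failed borrowers (out-neighbours).
              loss = exposure * sum(1 for _, v in G.out_edges(bank) if v in failed)
              if loss > capital:
                  failed.add(bank)
                  changed = True
      print(f"{len(failed)} of {G.number_of_nodes()} banks fail")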

  11. Analyzing HIV/AIDS and Alcohol and Other Drug Use as a Social Problem

    PubMed Central

    PATTERSON, DAVID A.; Wolf (Adelv unegv Waya), Silver

    2012-01-01

    Most prevention and intervention activities directed toward HIV/AIDS and alcohol and other drug use separately, as well as the combination of the two (e.g., for those who have HIV/AIDS and are using alcohol and other drugs), come in the form of specific, individualized therapies without consideration of social influences that may have a greater impact on this population. Approaching this social problem from the narrowed view of individualized, micro solutions disregards the larger social conditions that affect, or perhaps even are at the root of, the problem. This paper analyzes the social problem of HIV/AIDS and alcohol and other drug abuse using three sociological perspectives—social construction theory, ethnomethodology, and conflict theory—informing the reader of the broader influences accompanying this problem. PMID:23264724

  12. Multigrid Methods for Aerodynamic Problems in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Caughey, David A.

    1995-01-01

    Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.

  13. Complex Mathematical Problem Solving by Individuals and Dyads.

    ERIC Educational Resources Information Center

    Vye, Nancy J.; Goldman, Susan R.; Voss, James F.; Hmelo, Cindy; Williams, Susan; Cognition and Technology Group at Vanderbilt University

    1997-01-01

    Describes two studies of mathematical problem solving using an episode from "The Adventures of Jasper Woodbury," a set of curriculum materials that afford complex problem-solving opportunities. Discussion focuses on characteristics of problems that make solutions difficult, kinds of reasoning that dyadic interactions support, and considerations of…

  14. Preparing for Complexity and Wicked Problems through Transformational Learning Approaches

    ERIC Educational Resources Information Center

    Yukawa, Joyce

    2015-01-01

    As the information environment becomes increasingly complex and challenging, Library and Information Studies (LIS) education is called upon to nurture innovative leaders capable of managing complex situations and "wicked problems." While disciplinary expertise remains essential, higher levels of mental complexity and adaptive…

  15. A New Approach to Analyzing the Cognitive Load in Physics Problems

    NASA Astrophysics Data System (ADS)

    Teodorescu, Raluca

    2010-02-01

    I will present a Taxonomy of Introductory Physics Problems (TIPP), which relates physics problems to the cognitive processes and the knowledge required to solve them. TIPP was created for designing and clarifying educational objectives, for developing assessments to evaluate components of the problem-solving process, and for guiding curriculum design in introductory physics courses. To construct TIPP, I considered processes that have been identified either by cognitive science and expert-novice research or by direct observation of students' behavior while solving physics problems. Based on Marzano and Kendall's taxonomy [1], I developed a procedure to classify physics problems according to the cognitive processes that they involve and the knowledge to which they refer. The procedure is applicable to any physics problem, and its validity and reliability have been confirmed. This algorithm was then used to build TIPP, which is a database that contains text-based and research-based physics problems and explains their relationship to cognitive processes and knowledge. TIPP has been used in the years 2006-2009 to reform the first semester of the introductory algebra-based physics course at The George Washington University. The reform targeted students' cognitive development and attitude improvement. The methodology employed in the course involves exposing students to certain types of problems in a variety of contexts with increasing complexity. To assess the effectiveness of our approach, rubrics were created to evaluate students' problem-solving abilities, and the Colorado Learning Attitudes about Science Survey (CLASS) was administered pre- and post-instruction to determine students' shift in dispositions towards learning physics. Our results show definitive gains in the areas targeted by our curricular reform. [1] R.J. Marzano and J.S. Kendall, The New Taxonomy of Educational Objectives, 2nd ed. (Corwin Press, Thousand Oaks, 2007).

  16. The ESTER particle and plasma analyzer complex for the phobos mission

    NASA Astrophysics Data System (ADS)

    Afonin, V. V.; McKenna-Lawlor, S.; Kiraly, P.; Marsden, R.; Richter, A.; Rusznyak, P.; Shutte, N. M.; Szabo, L.; Szalai, S.; Szucs, I. T.; Varhalmi, L.; Witte, M.

    1990-05-01

    The ESTER particle and plasma analyzer system for the Phobos Mission comprised a complex of three instruments (LET, SLED and HARP) serviced by a common Data Processing Unit. An account is provided of this complex, its objectives and excellent performance in space.

  17. Minimum structural controllability problems of complex networks

    NASA Astrophysics Data System (ADS)

    Yin, Hongli; Zhang, Siying

    2016-02-01

    Controllability of complex networks has been one of the most attractive research areas for both the network and control communities, and has yielded many promising and significant results on minimum inputs and minimum driver vertices. However, few studies have been devoted to the minimum controlled vertex set through which control over a network with arbitrary structure can be achieved. In this paper, we prove that the minimum driver vertices driven by different inputs are not sufficient to ensure full control of the network when the associated graph contains an inaccessible strongly connected component that has a perfect matching, and we propose an algorithm to identify a minimum controlled vertex set for a network with arbitrary structure using convenient graph-theoretic and mathematical tools. The simulation results show that the controllability of a network is correlated with the number of inaccessible strongly connected components that have perfect matchings; these results help us better understand the relationship between a network's structural characteristics and its control.
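    The matching-based notion of controllability used here can be illustrated with the standard maximum-matching construction for driver vertices (the classic structural-controllability result, not the authors' extension to controlled vertex sets). A minimal sketch, assuming networkx; names are illustrative:

    ```python
    # Sketch: driver vertices of a directed network via maximum bipartite matching.
    # Each node v is split into an "out" copy and an "in" copy; every edge (u, v)
    # becomes (u_out, v_in). Nodes whose "in" copy is unmatched in a maximum
    # matching must be driven directly by external inputs (driver vertices).
    import networkx as nx

    def driver_nodes(digraph):
        B = nx.Graph()
        out_side = {f"{v}_out" for v in digraph}
        B.add_nodes_from(out_side, bipartite=0)
        B.add_nodes_from((f"{v}_in" for v in digraph), bipartite=1)
        B.add_edges_from((f"{u}_out", f"{v}_in") for u, v in digraph.edges())
        matching = nx.bipartite.hopcroft_karp_matching(B, top_nodes=out_side)
        matched_in = {m for m in matching if str(m).endswith("_in")}
        return {v for v in digraph if f"{v}_in" not in matched_in}

    G = nx.gnp_random_graph(20, 0.1, directed=True, seed=1)
    print(sorted(driver_nodes(G)))
    ```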

  18. Completed Beltrami-Michell formulation for analyzing mixed boundary value problems in elasticity

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Kaljevic, Igor; Hopkins, Dale A.; Saigal, Sunil

    1995-01-01

    In elasticity, the method of forces, wherein stress parameters are considered as the primary unknowns, is known as the Beltrami-Michell formulation (BMF). The existing BMF can only solve stress boundary value problems; it cannot handle the more prevalent displacement or mixed boundary value problems of elasticity. Therefore, this formulation, which has restricted application, could not become a true alternative to Navier's displacement method, which can solve all three types of boundary value problems. The restrictions in the BMF have been alleviated by augmenting the classical formulation with a novel set of conditions identified as the boundary compatibility conditions. This new method, which completes the classical force formulation, has been termed the completed Beltrami-Michell formulation (CBMF). The CBMF can solve general elasticity problems with stress, displacement, and mixed boundary conditions in terms of stresses as the primary unknowns. The CBMF is derived from the stationary condition of the variational functional of the integrated force method. In the CBMF, stresses for kinematically stable structures can be obtained without any reference to the displacements either in the field or on the boundary. This paper presents the CBMF and its derivation from the variational functional of the integrated force method. Several examples are presented to demonstrate the applicability of the completed formulation for analyzing mixed boundary value problems under thermomechanical loads. Selected example problems include a cylindrical shell wherein membrane and bending responses are coupled, and a composite circular plate.
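    For reference, the classical Beltrami-Michell field equations that the completed formulation augments can be sketched as follows (zero body forces assumed; the notation is ours, not the paper's):

    ```latex
    % Beltrami-Michell compatibility equations in terms of stresses (zero body force):
    % sigma_ij = stress tensor, Theta = first stress invariant, nu = Poisson's ratio.
    \[
    \nabla^2 \sigma_{ij} + \frac{1}{1+\nu}\,\frac{\partial^2 \Theta}{\partial x_i \partial x_j} = 0,
    \qquad
    \Theta = \sigma_{11} + \sigma_{22} + \sigma_{33}.
    \]
    % The CBMF augments these field equations with boundary compatibility conditions,
    % allowing displacement and mixed boundary value problems to be posed in stresses.
    ```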

  19. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 1. [thermal analyzer manual

    NASA Technical Reports Server (NTRS)

    Lee, H. P.

    1977-01-01

    The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.

  20. The complex problem of sensitive skin.

    PubMed

    Marriott, Marie; Holmes, Jo; Peters, Lisa; Cooper, Karen; Rowson, Matthew; Basketter, David A

    2005-08-01

    There exist within the population subsets of individuals who display heightened skin reactivity to materials that the majority find tolerable. In a series of investigations, we have examined the interrelationships between many of the endpoints associated with the term 'sensitive skin'. In the most recent work, 58 volunteers were treated with 10% lactic acid, 50% ethanol, 0.5% menthol and 1.0% capsaicin on the nasolabial fold, unoccluded, with sensory reactions recorded at 2.5 min, 5 min and 8 min after application. Urticant susceptibility was evaluated with 1 M benzoic acid and 125 mM trans-cinnamic acid applied to the volar forearm for 20 min. A 2 x 23-h patch test was also conducted using 0.1% and 0.3% sodium dodecyl sulfate, 0.3% and 0.6% cocamidopropyl betaine and 0.1% and 0.2% benzalkonium chloride to determine irritant susceptibility. As found in previous studies, increased susceptibility to one endpoint was not predictive of sensitivity to another. In our experience, nasolabial stinging was a poor predictor of general skin sensitivity. Nevertheless, it may be possible to identify in the normal population individuals who, coincidentally, are more generally sensitive to a range of non-immunologic adverse skin reactions. Whether such individuals are those who experience problems with skin care products remains to be addressed. PMID:16033403

  1. On the Complexity of Rearrangement Problems under the Breakpoint Distance

    PubMed Central

    2014-01-01

    We study the complexity of rearrangement problems in the generalized breakpoint model of Tannier et al. and settle several open questions. We improve the algorithm for the median problem and show that it is equivalent to the problem of finding maximum cardinality nonbipartite matching (under linear reduction). On the other hand, we prove that the more general small phylogeny problem is NP-hard. Surprisingly, we show that it is already NP-hard (or even APX-hard) for a quartet phylogeny. We also show that in the unichromosomal and the multilinear breakpoint model the halving problem is NP-hard, refuting the conjecture of Tannier et al. Interestingly, this is the first problem that is harder in the breakpoint model than in the double cut and join or reversal models. PMID:24200391

  2. Semantic Annotation of Complex Text Structures in Problem Reports

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Throop, David R.; Fleming, Land D.

    2011-01-01

    Text analysis is important for effective information retrieval from databases where the critical information is embedded in text fields. Aerospace safety depends on effective retrieval of relevant and related problem reports for the purpose of trend analysis. The complex text syntax in problem descriptions has limited statistical text mining of problem reports. The presentation describes an intelligent tagging approach that applies syntactic and then semantic analysis to overcome this problem. The tags identify types of problems and equipment that are embedded in the text descriptions. The power of these tags is illustrated in a faceted searching and browsing interface for problem report trending that combines automatically generated tags with database code fields and temporal information.

  3. Particle swarm optimization for complex nonlinear optimization problems

    NASA Astrophysics Data System (ADS)

    Alexandridis, Alex; Famelis, Ioannis Th.; Tsitouras, Charalambos

    2016-06-01

    This work presents the application of a technique belonging to evolutionary computation, namely particle swarm optimization (PSO), to complex nonlinear optimization problems. To be more specific, a PSO optimizer is set up and applied to the derivation of Runge-Kutta pairs for the numerical solution of initial value problems. The effect of critical PSO operational parameters on the performance of the proposed scheme is thoroughly investigated.
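    As a rough illustration of such an optimizer, the sketch below is a generic global-best PSO minimizing a standard test function, not the paper's Runge-Kutta pair derivation; the inertia and acceleration coefficients are conventional defaults rather than the study's tuned values:

    ```python
    # Sketch: canonical global-best particle swarm optimization.
    import numpy as np

    def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n_particles, dim))      # particle positions
        v = np.zeros_like(x)                            # particle velocities
        pbest = x.copy()                                # personal bests
        pbest_val = np.apply_along_axis(f, 1, x)
        gbest = pbest[pbest_val.argmin()].copy()        # global best
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            val = np.apply_along_axis(f, 1, x)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], val[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    rosenbrock = lambda z: float(sum(100 * (z[1:] - z[:-1] ** 2) ** 2 + (1 - z[:-1]) ** 2))
    print(pso(rosenbrock, dim=3))
    ```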

  4. Theory of periodically specified problems: Complexity and approximability

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Rosenkrantz, D.J.

    1997-12-05

    We study the complexity and the efficient approximability of graph and satisfiability problems when specified using various kinds of periodic specifications. The general results obtained include the following: (1) We characterize the complexities of several basic generalized CNF satisfiability problems SAT(S) [Sc78] when instances are specified using various kinds of 1- and 2-dimensional periodic specifications. We outline how this characterization can be used to prove a number of new hardness results for the complexity classes DSPACE(n), NSPACE(n), DEXPTIME, NEXPTIME, EXPSPACE, etc. These results can be used to prove in a unified way the hardness of a number of combinatorial problems when instances are specified succinctly using the various succinct specifications considered in the literature. As one corollary, we show that a number of basic NP-hard problems become EXPSPACE-hard when inputs are represented using 1-dimensional infinite periodic wide specifications. This answers a long-standing open question posed by Orlin. (2) We outline a simple yet general technique to devise approximation algorithms with provable worst-case performance guarantees for a number of combinatorial problems specified periodically. Our efficient approximation algorithms and schemes are based on extensions of these ideas and represent the first non-trivial characterization of a class of periodically specified NEXPTIME-hard problems having an ε-approximation (or PTAS). Two properties of our results are: (i) for the first time, efficient approximation algorithms and schemes have been developed for natural NEXPTIME-complete problems; (ii) our results are the first polynomial-time approximation algorithms with good performance guarantees for hard problems specified using the various kinds of periodic specifications considered in this paper.

  5. Application of NASA management approach to solve complex problems on earth

    NASA Technical Reports Server (NTRS)

    Potate, J. S.

    1972-01-01

    The application of the NASA management approach to solving complex problems on Earth is discussed. The management of the Apollo program is presented as an example of effective management techniques. Four key elements of effective management are analyzed. Photographs of the Cape Kennedy launch sites and supporting equipment are included to support the discussion.

  6. Investigating the Effect of Complexity Factors in Gas Law Problems

    ERIC Educational Resources Information Center

    Schuttlefield, Jennifer D.; Kirk, John; Pienta, Norbert J.; Tang, Hui

    2012-01-01

    Undergraduate students were asked to complete gas law questions using a Web-based tool as a first step in our understanding of the role of cognitive load in chemistry word questions and in helping us assess student problem-solving. Each question contained five different complexity factors, which were randomly assigned by the tool so that a…

  7. What Do Employers Pay for Employees' Complex Problem Solving Skills?

    ERIC Educational Resources Information Center

    Ederer, Peer; Nedelkoska, Ljubica; Patt, Alexander; Castellazzi, Silvia

    2015-01-01

    We estimate the market value that employers assign to the complex problem solving (CPS) skills of their employees, using individual-level Mincer-style wage regressions. For the purpose of the study, we collected new and unique data using psychometric measures of CPS and an extensive background questionnaire on employees' personal and work history.…

  8. Emergent Science: Solving complex science problems via collaborations

    NASA Astrophysics Data System (ADS)

    Li, X.; Ramachandran, R.; Wilson, B. D.; Lynnes, C.; Conover, H.

    2009-12-01

    The recent advances in Cyberinfrastructure have democratized the use of computational and data resources. These resources, together with new social networking and collaboration technologies, present an unprecedented opportunity to impact the science process. These advances can move the science process from “circumspect science” -- where scientists publish only when the project is complete, publish only the final results, seldom publish things that did not work, and communicate results with each other using paper technology -- to “open science” -- where scientists can share and publish every element in their research, from the data used as input, to the workflows used to analyze these data sets, to possibly failed experiments, and the final results. Open science can foster novel ways of social collaboration in science. We are already seeing the impact of social collaboration in our daily lives. A simple example is the use of reviews posted online by other consumers when evaluating whether or not to buy a product. This phenomenon has been well documented and is referred to by many names, such as Smart Mobs, Wisdom of Crowds, Wikinomics, crowdsourcing, We-Think, and swarm collaboration. Similar social collaboration during the science process can lead to “emergent science”. We define “emergent science” as the way complex science problems can be solved and new research directions forged out of a multiplicity of relatively simple collaborative interactions. There are, however, barriers that prevent social collaboration within the science process. Some of these barriers are technical, such as the lack of science collaboration platforms, and others are social. The success of any collaborative platform has to take into account the incentives or motivation for the scientists to participate. This presentation will address obstacles facing emergent science and will suggest possible solutions required to build a critical mass.

  9. The Complex Route to Success: Complex Problem-Solving Skills in the Prediction of University Success

    ERIC Educational Resources Information Center

    Stadler, Matthias J.; Becker, Nicolas; Greiff, Samuel; Spinath, Frank M.

    2016-01-01

    Successful completion of a university degree is a complex matter. Based on considerations regarding the demands of acquiring a university degree, the aim of this paper was to investigate the utility of complex problem-solving (CPS) skills in the prediction of objective and subjective university success (SUS). The key finding of this study was that…

  10. Complexity and efficient approximability of two dimensional periodically specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.

    1996-09-01

    The authors consider two-dimensional periodic specifications: a method to succinctly specify objects with highly regular repetitive structure. These specifications arise naturally when processing engineering designs, including VLSI designs. These specifications can specify objects whose sizes are exponentially larger than the sizes of the specifications themselves. Consequently, solving a periodically specified problem by explicitly expanding the instance is prohibitively expensive in terms of computational resources. This leads one to investigate the complexity and efficient approximability of solving graph-theoretic and combinatorial problems when instances are specified using two-dimensional periodic specifications. They prove the following results: (1) several classical NP-hard optimization problems become NEXPTIME-hard when instances are specified using two-dimensional periodic specifications; (2) in contrast, several of these NEXPTIME-hard problems have polynomial-time approximation algorithms with guaranteed worst-case performance.

  11. Analyzing Pre-Service Primary Teachers' Fraction Knowledge Structures through Problem Posing

    ERIC Educational Resources Information Center

    Kilic, Cigdem

    2015-01-01

    This study aimed to determine pre-service primary teachers' knowledge structures of fractions through problem-posing activities. A total of 90 pre-service primary teachers participated in this study. A problem-posing test consisting of two questions was used, and the participants were asked to generate as many problems as possible based on the…

  12. A Generalized Topological Entropy for Analyzing the Complexity of DNA Sequences

    PubMed Central

    Jiang, Qinghua; Xu, Li; Peng, Jiajie; Wang, Yong; Wang, Yadong

    2014-01-01

    Topological entropy is one of the most difficult entropies to apply to DNA sequences, owing to finite-sample and high-dimensionality problems. In order to overcome these problems, a generalized topological entropy is introduced. The relationship between the topological entropy and the generalized topological entropy is examined, showing that the topological entropy is a special case of the generalized entropy. As an application, the generalized topological entropy was computed for introns, exons, and promoter regions. The results indicate that the entropy of introns is higher than that of exons, and the entropy of exons is higher than that of promoter regions, for each chromosome, suggesting that the DNA sequences of promoter regions are more regular than those of exons and introns. PMID:24533097
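    For orientation, here is a minimal sketch of the finite-sequence topological entropy that the paper generalizes (we assume Koslicki's finite-sequence definition as the baseline; the generalized entropy itself is not reproduced):

    ```python
    # Sketch: topological entropy of a finite DNA string. Pick the largest word
    # length n with 4**n + n - 1 <= len(seq); count the distinct n-mers in that
    # prefix; normalize so the value lies in [0, 1].
    from math import log

    def topological_entropy(seq):
        n = 1
        while 4 ** (n + 1) + n <= len(seq):
            n += 1
        prefix = seq[: 4 ** n + n - 1]
        n_mers = {prefix[i : i + n] for i in range(len(prefix) - n + 1)}
        return log(len(n_mers), 4) / n

    print(topological_entropy("ACGTACGTTGCAACGGTACCGGTTAACG" * 10))
    ```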

  13. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.

  14. Analyzing networks of phenotypes in complex diseases: methodology and applications in COPD

    PubMed Central

    2014-01-01

    Background: The investigation of complex disease heterogeneity has been challenging. Here, we introduce a network-based approach, using partial correlations, that analyzes the relationships among multiple disease-related phenotypes. Results: We applied this method to two large, well-characterized studies of chronic obstructive pulmonary disease (COPD). We also examined the associations between these COPD phenotypic networks and other factors, including case-control status, disease severity, and genetic variants. Using these phenotypic networks, we have detected novel relationships between phenotypes that would not have been observed using traditional epidemiological approaches. Conclusion: Phenotypic network analysis of complex diseases could provide novel insights into disease susceptibility, disease severity, and genetic mechanisms. PMID:24964944
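    A generic sketch of the partial-correlation step (not the authors' pipeline): partial correlations between phenotypes can be read off the precision (inverse covariance) matrix as rho_ij = -P_ij / sqrt(P_ii * P_jj), and thresholding |rho_ij| yields a phenotype network. The threshold and data below are illustrative:

    ```python
    # Sketch: phenotype network from partial correlations via the precision matrix.
    import numpy as np

    def partial_correlation_network(X, threshold=0.2):
        """X: (samples, phenotypes). Returns (partial correlations, adjacency)."""
        P = np.linalg.pinv(np.cov(X, rowvar=False))     # precision matrix
        d = np.sqrt(np.diag(P))
        pcorr = -P / np.outer(d, d)
        np.fill_diagonal(pcorr, 1.0)
        adjacency = (np.abs(pcorr) > threshold) & ~np.eye(X.shape[1], dtype=bool)
        return pcorr, adjacency

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))                       # toy phenotype data
    pcorr, A = partial_correlation_network(X)
    print(A.sum() // 2, "edges")
    ```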

  15. Analyzing the causation of a railway accident based on a complex network

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accidents based on complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from an overall perspective. As a case study, the “7.23” China-Yongwen railway accident is analyzed with this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model provides a theoretical basis for railway accident prediction and, hence, for reducing the occurrence of railway accidents.

  16. Analyzing complex patients' temporal histories: new frontiers in temporal data mining.

    PubMed

    Sacchi, Lucia; Dagliati, Arianna; Bellazzi, Riccardo

    2015-01-01

    In recent years, data coming from hospital information systems (HIS) and local healthcare organizations have started to be used intensively for research purposes. This rising amount of available data allows reconstructing the complete histories of patients, which have a strong temporal component. This chapter introduces the major challenges faced by temporal data mining researchers in an era when huge quantities of complex clinical temporal data are becoming available. The analysis focuses on the peculiar features of this kind of data and describes the methodological and technological aspects that allow managing such a complex framework. The chapter shows how heterogeneous data can be processed to derive a homogeneous representation. Starting from this representation, it illustrates different techniques for jointly analyzing such data. Finally, the technological strategies that allow creating a common data warehouse to gather data coming from different sources and in different formats are presented. PMID:25417081

  17. Sorting of Streptomyces Cell Pellets Using a Complex Object Parametric Analyzer and Sorter

    PubMed Central

    Petrus, Marloes L. C.; van Veluw, G. Jerre; Wösten, Han A. B.; Claessen, Dennis

    2014-01-01

    Streptomycetes are filamentous soil bacteria that are used in industry for the production of enzymes and antibiotics. When grown in bioreactors, these organisms form networks of interconnected hyphae, known as pellets, which are heterogeneous in size. Here we describe a method to analyze and sort mycelial pellets using a Complex Object Parametric Analyzer and Sorter (COPAS). Detailed instructions are given for the use of the instrument and the basic statistical analysis of the data. We furthermore describe how pellets can be sorted according to user-defined settings, which enables downstream processing such as the analysis of the RNA or protein content. Using this methodology the mechanism underlying heterogeneous growth can be tackled. This will be instrumental for improving streptomycetes as a cell factory, considering the fact that productivity correlates with pellet size. PMID:24561666

  18. Data Mining and Complex Problems: Case Study in Composite Materials

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  19. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    SciTech Connect

    Hunt, H. B.; Stearns, R. L.; Marathe, M. V.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of, and the existence of efficient approximation algorithms for, a number of variants of the problems SAT(S) and SAT_c(S), for many different D, C, and S. These problem variants include decision and optimization problems, for formulas, quantified formulas, and stochastically quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S), and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Let k ≥ 2, and let S be a finite set of finite-arity relations on Σ_k with the following condition on S: all finite-arity relations on Σ_k can be represented as finite existentially quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT_c(S) are both NQL-complete and ≤_{log n}^{bw}-complete for NP. (2) The problems Q-SAT(S) and Q-SAT_c(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT_c(S) are PSPACE-complete. (3) There exists ε > 0 for which approximating the problems MAX-Q-SAT(S) within ε times optimum is PSPACE-hard. Letting k = 2, there exists ε > 0 for which approximating the problems MAX-S-SAT(S) within ε times optimum is PSPACE-hard. (4) …

  20. Analyzing complex functional brain networks: Fusing statistics and network science to understand the brain

    PubMed Central

    Simpson, Sean L.; Bowman, F. DuBois; Laurienti, Paul J.

    2014-01-01

    Complex functional brain network analyses have exploded over the last decade, gaining traction due to their profound clinical implications. The application of network science (an interdisciplinary offshoot of graph theory) has facilitated these analyses and enabled examining the brain as an integrated system that produces complex behaviors. While the field of statistics has been integral in advancing activation analyses and some connectivity analyses in functional neuroimaging research, it has yet to play a commensurate role in complex network analyses. Fusing novel statistical methods with network-based functional neuroimage analysis will engender powerful analytical tools that will aid in our understanding of normal brain function as well as alterations due to various brain disorders. Here we survey widely used statistical and network science tools for analyzing fMRI network data and discuss the challenges faced in filling some of the remaining methodological gaps. When applied and interpreted correctly, the fusion of network scientific and statistical methods has a chance to revolutionize the understanding of brain function. PMID:25309643

  1. Increased complexity in carcinomas: Analyzing and modeling the interaction of human cancer cells with their microenvironment.

    PubMed

    Stadler, Mira; Walter, Stefanie; Walzl, Angelika; Kramer, Nina; Unger, Christine; Scherzer, Martin; Unterleuthner, Daniela; Hengstschläger, Markus; Krupitza, Georg; Dolznig, Helmut

    2015-12-01

    Solid cancers are not simple accumulations of malignant tumor cells but rather represent complex organ-like structures. Despite a more chaotic general appearance compared to the highly organized setup of healthy tissues, cancers still show highly differentiated structures and a close interaction with, and dependency on, the interwoven connective tissue. This complexity within cancers is not yet known in detail at the molecular level. The first part of this article briefly describes the technology and strategies used to quantify and dissect the heterogeneity in human solid cancers. Moreover, there is an urgent need to better understand human cancer biology, since the development of novel anti-cancer drugs is far from efficient, predominantly due to the scarcity of predictive preclinical models. Hence, in vivo and in vitro models were developed that better recapitulate the complexity of human cancers through their intrinsic three-dimensional nature and cellular heterogeneity, and that allow functional intervention for hypothesis testing. The second part therefore presents 3D in vitro cancer models that analyze and depict the heterogeneity in human cancers. Advantages and drawbacks of each model are highlighted, and their suitability for preclinical drug testing is discussed. PMID:26320002

  2. On the complexity of some quadratic Euclidean 2-clustering problems

    NASA Astrophysics Data System (ADS)

    Kel'manov, A. V.; Pyatkin, A. V.

    2016-03-01

    Some problems of partitioning a finite set of points of Euclidean space into two clusters are considered. In these problems, the following criteria are minimized: (1) the sum over both clusters of the sums of squared pairwise distances between the elements of the cluster and (2) the sum over both clusters of the sums of squared distances from the elements of the cluster to its geometric center, each multiplied by the cardinality of the cluster, where the geometric center (or centroid) of a cluster is defined as the mean value of the elements in that cluster. Additionally, another problem close to (2) is considered, where the desired center of one of the clusters is given as input, while the center of the other cluster is unknown (is the variable to be optimized) as in problem (2). Two variants of the problems are analyzed, in which the cardinalities of the clusters are (1) part of the input or (2) optimization variables. It is proved that all the considered problems are strongly NP-hard and that, in general, there is no fully polynomial-time approximation scheme for them (unless P = NP).
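    As we read the abstract, the two criteria can be written as follows (the notation is ours, not the authors'):

    ```latex
    % Criterion (1): sum over both clusters of within-cluster pairwise squared distances.
    % Criterion (2): cardinality-weighted sums of squared distances to the centroid.
    \[
    F_1 = \sum_{k=1}^{2} \sum_{y, z \in C_k} \lVert y - z \rVert^2,
    \qquad
    F_2 = \sum_{k=1}^{2} |C_k| \sum_{y \in C_k} \Bigl\lVert y - \frac{1}{|C_k|} \sum_{z \in C_k} z \Bigr\rVert^2 .
    \]
    ```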

  3. COMPLEXITY&APPROXIMABILITY OF QUANTIFIED&STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    Hunt, H. B.; Marathe, M. V.; Stearns, R. E.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of decision, counting, maximization, and approximate maximization problems, for unquantified, quantified, and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, and T. These versions/variants include decision, counting, maximization, and approximate maximization problems, for unquantified, quantified, and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend earlier results in [Pa85, LMP99, CF+93, CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97].

  4. Analyzing Student Modeling Cycles in the Context of a "Real World" Problem

    ERIC Educational Resources Information Center

    Schorr, Roberta Y.; Amit, Miriam

    2005-01-01

    Many students do not apply their real world intuitions and sense-making abilities when solving mathematics problems in school. In an effort to better understand how to help students draw upon these valued resources, we investigate the manner in which the solution to a particular problem activity is repeatedly re-interpreted by a student. This is…

  5. TOPAZ - the transient one-dimensional pipe flow analyzer: code validation and sample problems

    SciTech Connect

    Winters, W.S.

    1985-10-01

    TOPAZ is a "user-friendly" computer code for modeling the one-dimensional transient physics of multi-species gas transfer in arbitrary arrangements of pipes, valves, vessels, and flow branches. This document presents a series of sample problems designed to aid potential users in creating TOPAZ input files. To the extent possible, sample problems were selected for which analytical solutions currently exist. TOPAZ comparisons with such solutions are intended to provide a measure of code validation.

  6. Complexity and Approximation of a Geometric Local Robot Assignment Problem

    NASA Astrophysics Data System (ADS)

    Bonorden, Olaf; Degener, Bastian; Kempkes, Barbara; Pietrzyk, Peter

    We introduce a geometric multi-robot assignment problem. Robots positioned in a Euclidean space have to be assigned to treasures in such a way that their joint strength is sufficient to unearth a treasure with a given weight. The robots have a limited range and thus can only be assigned to treasures in their proximity. The objective is to unearth as many treasures as possible. We investigate the complexity of several variants of this problem and show whether they are in P or are NP-complete. Furthermore, we provide a distributed and local constant-factor approximation algorithm using constant-factor resource augmentation for the two-dimensional setting with O(log* n) communication rounds.

  7. Coordinating complex problem-solving among distributed intelligent agents

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1992-01-01

    A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high-level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.

  8. Analyzing Energy and Resource Problems: An Interdisciplinary Approach to Mathematical Modeling.

    ERIC Educational Resources Information Center

    Fishman, Joseph

    1993-01-01

    Suggests ways in which mathematical models can be presented and developed in the classroom to promote discussion, analysis, and understanding of issues related to energy consumption. Five problems deal with past trends and future projections of availability of a nonrenewable resource, natural gas. (Contains 13 references.) (MDH)

  9. Case Studies in Critical Ecoliteracy: A Curriculum for Analyzing the Social Foundations of Environmental Problems

    ERIC Educational Resources Information Center

    Turner, Rita; Donnelly, Ryan

    2013-01-01

    This article outlines the features and application of a set of model curriculum materials that utilize eco-democratic principles and humanities-based content to cultivate critical analysis of the cultural foundations of socio-environmental problems. We first describe the goals and components of the materials, then discuss results of their use in…

  10. INVESTIGATION OF ANALYZER PROBLEMS IN THE MEASUREMENT OF NOX FROM METHANOL VEHICLES

    EPA Science Inventory

    The study investigated the extent and source of irregularities related to the measurement of NOx emissions from methanol cars. Corrective measures also were explored. It was observed that NOx chemiluminescent analyzers respond to methanol and formaldehyde after being exposed to h...

  11. Analyzing and Attempting to Overcome Prospective Teachers' Difficulties during Problem-Solving Instruction

    ERIC Educational Resources Information Center

    Karp, Alexander

    2010-01-01

    This article analyzes the experiences of prospective secondary mathematics teachers during a teaching methods course, offered prior to their student teaching, but involving actual teaching and reflexive analysis of this teaching. The study focuses on the pedagogical difficulties that arose during their teaching, in which prospective teachers…

  12. The problem of motivating teaching staff in a complex amalgamation.

    PubMed

    Kenrick, M A

    1993-09-01

    This paper addresses some of the problems brought about by the merger of a number of schools of nursing into a new complex amalgamation. A very real concern in the new colleges of nursing and midwifery in the United Kingdom is the effect of amalgamation on management systems and staff morale. The main focus of this paper is the motivation of staff during this time of change. There is currently a lack of security amongst staff and in many instances the personal job satisfaction of nurse teachers and managers of nurse education has been reduced, which has made the task of motivating staff difficult. Hence, two major theories of motivation and the implications of these theories for managers of nurse education are discussed. The criteria used for the selection of managers within the new colleges, leadership styles and organizational structures are reviewed. The amalgamations have brought about affiliation with higher-education institutions. Some problems associated with these mergers and the effects on the motivation of staff both within the higher-education institutions and the nursing colleges are outlined. Strategies for overcoming some of the problems are proposed including job enlargement, job enrichment, potential achievement rewards and the use of individual performance reviews which may be useful for assessing the ability of all staff, including managers, in the new amalgamations. PMID:8258610

  13. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework), and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies authority problems that arise when receiving conflicting advice from human and automated systems.

  14. Leveraging Cultural Resources through Teacher Pedagogical Reasoning: Elementary Grade Teachers Analyze Second Language Learners' Science Problem Solving

    ERIC Educational Resources Information Center

    Buxton, Cory A.; Salinas, Alejandra; Mahotiere, Margarette; Lee, Okhee; Secada, Walter G.

    2013-01-01

    Grounded in teacher professional development addressing the intersection of student diversity and content area instruction, this study examined school teachers' pedagogical reasoning complexity as they reflected on their second language learners' science problem solving abilities using both home and school contexts. Teachers responded to interview…

  15. Quantum trajectories in complex space: one-dimensional stationary scattering problems.

    PubMed

    Chou, Chia-Chun; Wyatt, Robert E

    2008-04-21

    One-dimensional time-independent scattering problems are investigated in the framework of the quantum Hamilton-Jacobi formalism. The equation for the local approximate quantum trajectories near the stagnation point of the quantum momentum function is derived, and the first derivative of the quantum momentum function is related to the local structure of the quantum trajectories. Exact complex quantum trajectories are determined for two examples by numerically integrating the equations of motion. For the soft potential step, some particles penetrate into the nonclassical region and then turn back to the reflection region. For the barrier scattering problem, quantum trajectories may spiral into the attractors or out from the repellers in the barrier region. Although the classical potentials extended to complex space show different pole structures for each problem, the quantum potentials present the same second-order pole structure in the reflection region. This paper not only analyzes the complex quantum trajectories and the total potentials for these examples but also demonstrates general properties and similar structures of the complex quantum trajectories and the quantum potentials for one-dimensional time-independent scattering problems. PMID:18433189
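    For context, a standard form of the quantum stationary Hamilton-Jacobi equation behind such complex trajectories is sketched below (conventions vary by author; this rendering is ours, not necessarily the paper's):

    ```latex
    % Writing \psi(x) = e^{iS(x)/\hbar} and defining the quantum momentum function
    % p(x) = dS/dx turns the stationary Schroedinger equation into
    \[
    \frac{p^2(x)}{2m} + V(x) - E = \frac{i\hbar}{2m} \frac{dp}{dx},
    \]
    % and complex quantum trajectories follow from integrating dx/dt = p(x)/m
    % with x extended to the complex plane.
    ```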

  16. Understanding the determinants of problem-solving behavior in a complex environment

    NASA Technical Reports Server (NTRS)

    Casner, Stephen A.

    1994-01-01

    It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how the complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates of the probable effects of each environmental feature.

  17. Human opinion dynamics: An inspiration to solve complex optimization problems

    NASA Astrophysics Data System (ADS)

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P.; Kapur, Pawan

    2013-10-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of the formation and dynamics of opinions has been one of the most important areas in social physics. The opinion dynamics and the associated social structure lead to decision making, or so-called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence with the disintegrative effects of individualisation, and could therefore be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired by bird flocking behaviour, and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding collective decision making.
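    A loose sketch of an opinion-dynamics-style optimizer in the spirit of this abstract (not the authors' exact update rule; the rank weighting, step size, and noise level are illustrative choices):

    ```python
    # Sketch: each agent's "opinion" (candidate solution) drifts toward a
    # fitness-weighted consensus; noise plays the role of individualisation.
    import numpy as np

    def opinion_dynamics_opt(f, dim, n_agents=30, iters=200, step=0.5, noise=0.05, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n_agents, dim))
        for _ in range(iters):
            fit = np.apply_along_axis(f, 1, x)
            ranks = fit.argsort().argsort()              # 0 = fittest (minimization)
            w = (n_agents - ranks).astype(float)         # fitter agents weigh more
            consensus = (w[:, None] * x).sum(axis=0) / w.sum()
            x += step * (consensus - x) + noise * rng.normal(size=x.shape)
        fit = np.apply_along_axis(f, 1, x)
        return x[fit.argmin()], fit.min()

    sphere = lambda z: float((z ** 2).sum())
    print(opinion_dynamics_opt(sphere, dim=4))
    ```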

  18. Human opinion dynamics: An inspiration to solve complex optimization problems

    PubMed Central

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P.; Kapur, Pawan

    2013-01-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of the formation and dynamics of opinions has been one of the most important areas in social physics. The opinion dynamics and the associated social structure lead to decision making, or so-called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence with the disintegrative effects of individualisation, and could therefore be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired by bird flocking behaviour, and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding collective decision making. PMID:24141795

  19. Applied social and behavioral science to address complex health problems.

    PubMed

    Livingood, William C; Allegrante, John P; Airhihenbuwa, Collins O; Clark, Noreen M; Windsor, Richard C; Zimmerman, Marc A; Green, Lawrence W

    2011-11-01

    Complex and dynamic societal factors continue to challenge the capacity of the social and behavioral sciences in preventive medicine and public health to overcome the most seemingly intractable health problems. This paper proposes a fundamental shift from a research approach that presumes to identify (from highly controlled trials) universally applicable interventions expected to be implemented "with fidelity" by practitioners, to an applied social and behavioral science approach similar to that of engineering. Such a shift would build on and complement the recent recommendations of the NIH Office of Behavioral and Social Science Research and require reformulation of the research-practice dichotomy. It would also require disciplines now engaged in preventive medicine and public health practice to develop a better understanding of systems thinking and the science of application that is sensitive to the complexity, interactivity, and unique elements of community and practice settings. Also needed is a modification of health-related education to ensure that those entering the disciplines develop instincts and capacities as applied scientists. PMID:22011425

  20. Strategies in Forecasting Outcomes in Ethical Decision-making: Identifying and Analyzing the Causes of the Problem

    PubMed Central

    Beeler, Cheryl K.; Antes, Alison L.; Wang, Xiaoqian; Caughron, Jared J.; Thiel, Chase E.; Mumford, Michael D.

    2010-01-01

    This study examined the role of key causal analysis strategies in forecasting and ethical decision-making. Undergraduate participants took on the role of the key actor in several ethical problems and were asked to identify and analyze the causes, forecast potential outcomes, and make a decision about each problem. Time pressure and analytic mindset were manipulated while participants worked through these problems. The results indicated that forecast quality was associated with decision ethicality, and the identification of the critical causes of the problem was associated with both higher quality forecasts and higher ethicality of decisions. Neither time pressure nor analytic mindset impacted forecasts or ethicality of decisions. Theoretical and practical implications of these findings are discussed. PMID:20352056

  1. Effects of friction and heat conduction on sound propagation in ducts. [analyzing complex aerodynamic noise problems

    NASA Technical Reports Server (NTRS)

    Huerre, P.; Karamcheti, K.

    1976-01-01

    The theory of sound propagation is examined in a viscous, heat-conducting fluid, initially at rest and in a uniform state, and contained in a rigid, impermeable duct with isothermal walls. Topics covered include: (1) a theoretical formulation of the small-amplitude fluctuating motions of a viscous, heat-conducting, and compressible fluid; (2) sound propagation in a two-dimensional duct; and (3) a perturbation study of the in-plane modes.

  2. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    NASA Astrophysics Data System (ADS)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, a great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to the network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of…
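    A rough analogue of the supernode aggregation described here, using networkx's quotient_graph over a node partition (this is only a familiar approximation of the idea, not the deep-graph framework itself):

    ```python
    # Sketch: partition nodes into groups and collapse each group into a supernode;
    # quotient_graph stores per-supernode summaries (nnodes, nedges, density).
    import networkx as nx

    G = nx.karate_club_graph()
    groups = {}
    for v, data in G.nodes(data=True):
        groups.setdefault(data["club"], set()).add(v)    # partition by 'club'
    Q = nx.quotient_graph(G, list(groups.values()), relabel=False)
    for supernode, data in Q.nodes(data=True):
        print(len(supernode), "members,", data["nedges"], "internal edges")
    ```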

  3. Deep graphs-A general framework to represent and analyze heterogeneous complex systems across scales.

    PubMed

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, a great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to the network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of…

  4. Complex Problem Exercises in Developing Engineering Students' Conceptual and Procedural Knowledge of Electromagnetics

    ERIC Educational Resources Information Center

    Leppavirta, J.; Kettunen, H.; Sihvola, A.

    2011-01-01

    Complex multistep problem exercises are one way to enhance engineering students' learning of electromagnetics (EM). This study investigates whether exposure to complex problem exercises during an introductory EM course improves students' conceptual and procedural knowledge. The performance in complex problem exercises is compared to prior success…

  5. Inverse Problems in Complex Models and Applications to Earth Sciences

    NASA Astrophysics Data System (ADS)

    Bosch, M. E.

    2015-12-01

    The inference of the subsurface earth structure and properties requires the integration of different types of data, information, and knowledge, by combined processes of analysis and synthesis. To support the process of integrating information, the regular concept of data inversion is evolving to expand its application to models with multiple inner components (properties, scales, structural parameters) that explain multiple data (geophysical survey data, well-logs, core data). Probabilistic inference methods provide the natural framework for the formulation of these problems, considering a posterior probability density function (PDF) that combines the information from a prior information PDF and the new sets of observations. To formulate the posterior PDF in the context of multiple datasets, the data likelihood functions are factorized, assuming independence of uncertainties for data originating across different surveys. A realistic description of the earth medium requires modeling several properties and structural parameters, which relate to each other according to dependency and independency notions. Thus, conditional probabilities across model components also factorize. A common setting proceeds by structuring the model parameter space in hierarchical layers. A primary layer (e.g., lithology) conditions a secondary layer (e.g., physical medium properties), which conditions a third layer (e.g., geophysical data). In general, less structured relations within model components and data emerge from the analysis of other inverse problems. They can be described with flexibility via directed acyclic graphs, which map dependency relations between the model components. Examples of inverse problems in complex models can be shown at various scales. At the local scale, for example, the distribution of gas saturation is inferred from pre-stack seismic data and a calibrated rock-physics model. At the regional scale, joint inversion of gravity and magnetic data is applied…
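    The factorizations described here can be sketched as follows (the symbols are illustrative, not the abstract's own notation):

    ```latex
    % Posterior over model m given n independent surveys d_1..d_n,
    % with a hierarchical prior over lithology and physical-property layers.
    \[
    \sigma(m \mid d_1, \ldots, d_n) \propto \rho(m) \prod_{k=1}^{n} L_k(d_k \mid m),
    \qquad
    \rho(m) = \rho(m_{\mathrm{phys}} \mid m_{\mathrm{lith}})\, \rho(m_{\mathrm{lith}}).
    \]
    ```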

  6. Can SNOMED CT fulfill the vision of a compositional terminology? Analyzing the use case for Problem List

    PubMed Central

    Campbell, James R.; Xu, Junchuan; Fung, Kin Wah

    2011-01-01

    We analyzed 598 of 63,952 terms employed in problem list entries from seven major healthcare institutions that were not mapped with UMLS to SNOMED CT when preparing the NLM UMLS-CORE problem list subset. We intended to determine whether published or post-coordinated SNOMED concepts could accurately capture the problems as stated by the clinician and to characterize the workload for the local terminology manager. From the terms we analyzed, we estimate that 7.5% of the total terms represent ambiguous statements that require clarification. Of those terms which were unambiguous, we estimate that 38.1% could be encoded using the SNOMED CT January 2011 pre-coordinated (published core) content. 60.4% of unambiguous terms required post-coordination to capture the term meaning within the SNOMED model. Approximately 28.5% of post-coordinated content could not be fully defined and required primitive forms. This left 1.5% of unambiguous terms which were expressed with meaning which could not be represented in SNOMED CT. We estimate from our study that 98.5% of clinical terms unambiguously suggested for the problem list can be equated to published concepts or can be modeled with SNOMED CT but that roughly one in four SNOMED modeled expressions fail to represent the full meaning of the term. Implications for the business model of the local terminology manager and the development of SNOMED CT are discussed. PMID:22195069
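
    The reported breakdown admits a quick arithmetic restatement (a sketch using only the figures quoted above):

      # Breakdown of the unambiguous terms, as reported in the abstract.
      precoordinated    = 0.381   # covered by published (pre-coordinated) content
      postcoordinated   = 0.604   # require post-coordination under the SNOMED model
      not_representable = 0.015   # meaning not representable in SNOMED CT
      assert abs(precoordinated + postcoordinated + not_representable - 1.0) < 1e-9

      # "98.5% ... can be equated to published concepts or modeled with SNOMED CT":
      print(precoordinated + postcoordinated)     # -> 0.985

      # 28.5% of the post-coordinated content stayed primitive (not fully defined),
      # i.e. about 17% of all unambiguous terms:
      print(round(0.285 * postcoordinated, 3))    # -> 0.172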

  7. A novel approach to analyze membrane proteins by laser mass spectrometry: from protein subunits to the integral complex.

    PubMed

    Morgner, Nina; Kleinschroth, Thomas; Barth, Hans-Dieter; Ludwig, Bernd; Brutschy, Bernhard

    2007-08-01

    A novel laser-based mass spectrometry method termed LILBID (laser-induced liquid bead ion desorption) is applied to analyze large integral membrane protein complexes and their subunits. In this method the ions are IR-laser desorbed from aqueous microdroplets containing the hydrophobic protein complexes solubilized by detergent. The method is highly sensitive, very efficient in sample handling, relatively tolerant to various buffers, and detects the ions in narrow, mainly low-charge-state distributions. The crucial experimental parameter determining whether the integral complex or its subunits are observed is the laser intensity: at a very low intensity level, corresponding to ultrasoft desorption, the intact complexes, together with a few detergent molecules, are transferred into vacuum. Under these conditions the oligomerization state of the complex (i.e., its quaternary structure) may be analyzed. At higher laser intensity, complexes are thermolyzed into subunits, with any residual detergent being stripped off to yield the true mass of the polypeptides. The model complexes studied are derived from the respiratory chain of the soil bacterium Paracoccus denitrificans and include complexes III (cytochrome bc(1) complex) and IV (cytochrome c oxidase). These are well-characterized multi-subunit membrane proteins, with the individual hydrophobic subunits being composed of up to 12 transmembrane helices. PMID:17544294

  8. Eye-Tracking Study of Complexity in Gas Law Problems

    ERIC Educational Resources Information Center

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  9. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184
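
    A minimal sketch of the analysis idea (our illustration; the data layout, threshold, and all names are assumptions, not the authors' code): average the tendency to fixate the target over trials, then read off the decision moment where that curve first exceeds a criterion:

      import numpy as np

      def fixation_curve(fix_target: np.ndarray) -> np.ndarray:
          """fix_target: trials x time_bins boolean array, True = gaze on target."""
          return fix_target.mean(axis=0)          # P(fixating target) per time bin

      def decision_moment(curve: np.ndarray, dt_ms: float, threshold: float = 0.8) -> float:
          """First time (ms) the average target-fixation tendency exceeds threshold."""
          above = np.nonzero(curve >= threshold)[0]
          return float(above[0] * dt_ms) if above.size else float("nan")

      # Toy data: 20 trials, 100 bins of 10 ms; gaze shifts to the target ~400 ms
      # after sentence onset (sigmoidal rise, purely hypothetical).
      rng = np.random.default_rng(0)
      t = np.arange(100)
      p = 0.5 + 0.45 / (1 + np.exp(-(t - 40) / 5))
      trials = rng.random((20, 100)) < p
      print(decision_moment(fixation_curve(trials), dt_ms=10.0))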

  10. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  11. Using New Models to Analyze Complex Regularities of the World: Commentary on Musso et al. (2013)

    ERIC Educational Resources Information Center

    Nokelainen, Petri; Silander, Tomi

    2014-01-01

    This commentary on the recent article by Musso et al. (2013) discusses issues related to model fitting, comparison of the classification accuracy of generative and discriminative models, and two (or more) cultures of data modeling. We start by questioning the extremely high classification accuracy with empirical data from a complex domain. There is…

  12. The Influence of Prior Experience and Process Utilization in Solving Complex Problems.

    ERIC Educational Resources Information Center

    Sterner, Paula; Wedman, John

    By using ill-structured problems and examining problem-solving processes, this study was conducted to explore the nature of solving complex, multistep problems, focusing on how prior knowledge, problem-solving process utilization, and analogical problem solving are related to success. Twenty-four college students qualified to participate by…

  13. Complex Problem Solving in Educational Contexts--Something beyond "g": Concept, Assessment, Measurement Invariance, and Construct Validity

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Molnar, Gyongyver; Fischer, Andreas; Funke, Joachim; Csapo, Beno

    2013-01-01

    Innovative assessments of cross-curricular competencies such as complex problem solving (CPS) have recently received considerable attention in large-scale educational studies. This study investigated the nature of CPS by applying a state-of-the-art approach to assess CPS in high school. We analyzed whether two processes derived from cognitive…

  14. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  15. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we…

  16. A note on the Dirichlet problem for model complex partial differential equations

    NASA Astrophysics Data System (ADS)

    Ashyralyev, Allaberen; Karaca, Bahriye

    2016-08-01

    Complex model partial differential equations of arbitrary order are considered. The uniqueness of the Dirichlet problem is studied. It is proved that the Dirichlet problem for higher-order complex partial differential equations in one complex variable has infinitely many solutions.

  17. A complexity analysis of space-bounded learning algorithms for the constraint satisfaction problem

    SciTech Connect

    Bayardo, R.J. Jr.; Miranker, D.P.

    1996-12-31

    Learning during backtrack search is a space-intensive process that records information (such as additional constraints) in order to avoid redundant work. In this paper, we analyze the effects of polynomial-space-bounded learning on the runtime complexity of backtrack search. One space-bounded learning scheme records only those constraints of limited size, and another records arbitrarily large constraints but deletes those that become irrelevant to the portion of the search space being explored. We find that relevance-bounded learning allows better runtime bounds than size-bounded learning on structurally restricted constraint satisfaction problems. Even when restricted to linear space, our relevance-bounded learning algorithm has runtime complexity near that of unrestricted (exponential-space-consuming) learning schemes.
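
    The two bookkeeping policies can be sketched outside any full solver (a hedged illustration; the names are ours, and a nogood here is a set of variable-value assignments known to be jointly inconsistent):

      def record_size_bounded(store, nogood, k):
          """Size-bounded learning: keep only nogoods with at most k assignments."""
          if len(nogood) <= k:
              store.append(frozenset(nogood))

      def record_relevance_bounded(store, nogood, assignment, k):
          """Relevance-bounded learning: record arbitrarily large nogoods, but
          purge any stored nogood that differs from the current partial
          assignment in more than k pairs (i.e., has become irrelevant)."""
          store.append(frozenset(nogood))
          current = set(assignment.items())
          store[:] = [ng for ng in store if len(ng - current) <= k]

      # Example: under assignment {x: 1, y: 2}, the nogood {(x,1),(z,3)} differs
      # in one pair, so it survives a relevance bound of k = 1.
      store = []
      record_relevance_bounded(store, {("x", 1), ("z", 3)}, {"x": 1, "y": 2}, k=1)
      print(store)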

  18. The Interfacial Interaction Problem in Complex Multiple Porosity Fractured Reservoirs

    NASA Astrophysics Data System (ADS)

    Suarez-Arriaga, Mario-Cesar

    2003-04-01

    Many productive reservoirs (oil, gas, water, geothermal) are associated with natural fracturing. Fault zones and fractures act as open networks for fluid and energy flow from depth. Their petrophysical parameters are heterogeneous and randomly distributed, forming extremely complex natural systems. Here, the simultaneous heat and mass flows are coupled to the deformation of thermoporoelastic rocks. The system's volume is divided into N interacting continua, each one occupying a region of space V_n wrapped by a surface S_n (n = 1, ..., N). The mass flow is represented by:

      \frac{\partial}{\partial t}\int_{V_n} \rho_f\,\phi\, dV + \int_{S_n} \vec{F}_M \cdot \vec{n}\, dS = \int_{V_n} q_M\, dV \qquad (3)

    Taking into account a non-isothermal process, the coupled energy equation is:

      \frac{\partial}{\partial t}\int_{V_n} \left[\phi\,\rho_f h_f + (1-\phi)\,\rho_r h_r\right] dV + \int_{S_n} \vec{F}_E \cdot \vec{n}\, dS = \int_{V_n} q_E\, dV \qquad (4)

    where t is time, \phi is porosity, \rho_f and \rho_r are the fluid and rock densities, \vec{F}_M and \vec{F}_E are the total mass and energy flows, q_M and q_E are the volumetric mass and energy extracted from or injected into V_n, and h_f and h_r are the specific enthalpies of fluid and rock, respectively. Rock deformation is coupled through the equation:

      \vec{\nabla} \cdot \left(\frac{\rho_f}{\mu}\, K \cdot \vec{\nabla} p_\phi\right)_{V_n} = \phi\left(D_t\rho_f + \frac{\rho_f}{V_\phi}\,\frac{dV_\phi}{dt}\right)_{V_n} \qquad (5)

    where K is the absolute permeability tensor, \mu is the dynamic fluid viscosity, D_t is a total derivative, p_\phi is the pore pressure, and V_\phi is the volume of pores in V_n. The N media interact with each other; each has its own parameters and its own interporosity flow. Modelling these coupled phenomena requires averaging highly contrasting physical properties, independently of the method used to solve the equations. A lot of attention has been devoted to developing realistic numerical models describing flows in reservoirs under exploitation, but to the best of our knowledge very little attention has been focused on the problem of interfacial interaction and of averaging petrophysical parameters in multiple porosity reservoirs.

  19. Detrended Partial-Cross-Correlation Analysis: A New Method for Analyzing Correlations in Complex System

    PubMed Central

    Yuan, Naiming; Fu, Zuntao; Zhang, Huan; Piao, Lin; Xoplaki, Elena; Luterbacher, Juerg

    2015-01-01

    In this paper, a new method, detrended partial-cross-correlation analysis (DPCCA), is proposed. Based on detrended cross-correlation analysis (DCCA), the method is extended with the partial-correlation technique, so that it can quantify the relations of two non-stationary signals (with the influences of other signals removed) on different time scales. We illustrate the advantages of this method by performing two numerical tests. Test I shows the advantages of DPCCA in handling non-stationary signals, while Test II reveals the "intrinsic" relations between two considered time series with the potential influences of other, unconsidered signals removed. To further show the utility of DPCCA in natural complex systems, we provide new evidence on the winter-time Pacific Decadal Oscillation (PDO) and the winter-time Nino3 Sea Surface Temperature Anomaly (Nino3-SSTA) affecting the Summer Rainfall over the middle-lower reaches of the Yangtze River (SRYR). By applying DPCCA, clearer significant correlations between SRYR and Nino3-SSTA on time scales of 6-8 years are found over the period 1951-2012, while significant correlations between SRYR and PDO arise on time scales of around 35 years. With these physically explainable results, we have confidence that DPCCA is a useful method for addressing complex systems. PMID:25634341
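
    Assuming DPCCA follows the standard partial-correlation construction (a hedged sketch, not the authors' code), the coefficient between two signals with the others' influence removed can be obtained from the inverse of the matrix of DCCA coefficients at a given scale:

      import numpy as np

      def dpcca_from_dcca(R: np.ndarray) -> np.ndarray:
          """R: symmetric matrix of DCCA cross-correlation coefficients among all
          signals at one time scale; returns the partial-correlation matrix."""
          C = np.linalg.inv(R)
          d = np.sqrt(np.outer(np.diag(C), np.diag(C)))
          P = -C / d
          np.fill_diagonal(P, 1.0)
          return P

      # Toy example: signals 1 and 2, both driven by signal 0, appear correlated
      # (R[1,2] = 0.5), but their partial correlation is much weaker.
      R = np.array([[1.0, 0.7, 0.7],
                    [0.7, 1.0, 0.5],
                    [0.7, 0.5, 1.0]])
      print(dpcca_from_dcca(R)[1, 2])   # ~0.02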

  20. Mass spectrometric methods to analyze the structural organization of macromolecular complexes.

    PubMed

    Rajabi, Khadijeh; Ashcroft, Alison E; Radford, Sheena E

    2015-11-01

    With the development of soft ionization techniques such as electrospray ionization (ESI), mass spectrometry (MS) has found widespread application in structural biology. The ability to transfer large biomolecular complexes intact into the gas-phase, combined with the low sample consumption and high sensitivity of MS, has made ESI-MS a method of choice for the characterization of macromolecules. This paper describes the application of MS to study large non-covalent complexes. We categorize the available techniques in two groups. First, solution-based techniques in which the biomolecules are labeled in solution and subsequently characterized by MS. Three MS-based techniques are discussed, namely hydroxyl radical footprinting, cross-linking and hydrogen/deuterium exchange (HDX) MS. In the second group, MS-based techniques to probe intact biomolecules in the gas-phase, e.g. side-chain microsolvation, HDX and ion mobility spectrometry are discussed. Together, the approaches place MS as a powerful methodology for an ever growing plethora of structural applications. PMID:25782628

  1. Taking advantage of local structure descriptors to analyze interresidue contacts in protein structures and protein complexes.

    PubMed

    Martin, Juliette; Regad, Leslie; Etchebest, Catherine; Camproux, Anne-Claude

    2008-11-15

    Interresidue contacts in protein structures and at protein-protein interfaces are classically described by the amino acid types of the interacting residues; the local structural context of the contact, if any, is described using secondary structures. In this study, we present an alternative analysis of interresidue contacts using local structures defined by the structural alphabet introduced by Camproux et al. This structural alphabet allows a 3D structure to be described as a sequence of prototype fragments called structural letters, of 27 different types. Each residue can then be assigned to a particular local structure, even in loop regions. The analysis of interresidue contacts within protein structures, defined using Voronoï tessellations, reveals that pairwise contact specificity is greater in terms of structural letters than of amino acids. Using a simple heuristic based on specificity-score comparison, we find that 74% of the long-range contacts within protein structures are better described using structural letters than amino acid types. The investigation is extended to a set of protein-protein complexes, showing that similar global rules apply as for intraprotein contacts, with 64% of the interprotein contacts best described by local structures. We then present an evaluation of pairing functions integrating structural letters for decoy scoring and show that some complexes could benefit from the use of structural-letter-based pairing functions. PMID:18491388
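
    One plausible form of such a pairwise specificity score (our hedged reading, not necessarily the authors' exact definition) is the log-odds of the observed contact frequency of a pair of types against the frequency expected from the background composition:

      import math
      from collections import Counter

      def specificity_scores(contacts, background):
          """contacts: list of (type_a, type_b) contact pairs.
          background: Counter of how often each type occurs overall."""
          n_contacts = len(contacts)
          n_bg = sum(background.values())
          pair_counts = Counter(frozenset(p) for p in contacts)
          scores = {}
          for pair, obs in pair_counts.items():
              a, b = (tuple(pair) * 2)[:2]    # handles homotypic pairs (a == b)
              p_a, p_b = background[a] / n_bg, background[b] / n_bg
              expected = (p_a * p_b if a == b else 2 * p_a * p_b) * n_contacts
              scores[pair] = math.log(obs / expected)
          return scores

      # Toy data; types could be amino acids or structural letters alike.
      contacts = [("A", "B"), ("A", "B"), ("A", "A"), ("B", "C")]
      background = Counter({"A": 10, "B": 10, "C": 10})
      print(specificity_scores(contacts, background))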

  2. Technologically Mediated Complex Problem-Solving on a Statistics Task

    ERIC Educational Resources Information Center

    Scanlon, Eileen; Blake, Canan; Joiner, Richard; O'Shea, Tim

    2005-01-01

    Simulations on computers can allow many experiments to be conducted quickly to help students develop an understanding of statistical topics. We used a simulation of a challenging problem in statistics as the focus of an exploration of situations where members of a problem-solving group are physically separated then reconnected via combinations of…

  3. THE ROLE OF PROBLEM SOLVING IN COMPLEX INTRAVERBAL REPERTOIRES

    PubMed Central

    Sautter, Rachael A; LeBlanc, Linda A; Jay, Allison A; Goldsmith, Tina R; Carr, James E

    2011-01-01

    We examined whether typically developing preschoolers could learn to use a problem-solving strategy that involved self-prompting with intraverbal chains to provide multiple responses to intraverbal categorization questions. Teaching the children to use the problem-solving strategy did not produce significant increases in target responses until problem solving was modeled and prompted. Following the model and prompts, all participants showed immediate significant increases in intraverbal categorization, and all prompts were quickly eliminated. Use of audible self-prompts was evident initially for all participants, but declined over time for 3 of the 4 children. Within-session response patterns remained consistent with use of the problem-solving strategy even when self-prompts were not audible. These findings suggest that teaching and prompting a problem-solving strategy can be an effective way to produce intraverbal categorization responses. PMID:21709781

  4. Teaching Problem Solving; the Effect of Algorithmic and Heuristic Problem Solving Training in Relation to Task Complexity and Relevant Aptitudes.

    ERIC Educational Resources Information Center

    de Leeuw, L.

    Sixty-four fifth and sixth-grade pupils were taught number series extrapolation by either an algorithm, fully prescribed problem-solving method or a heuristic, less prescribed method. The trained problems were within categories of two degrees of complexity. There were 16 subjects in each cell of the 2 by 2 design used. Aptitude Treatment…

  5. Nuclear three-body problem in the complex energy plane: Complex-scaling Slater method

    NASA Astrophysics Data System (ADS)

    Kruppa, A. T.; Papadimitriou, G.; Nazarewicz, W.; Michel, N.

    2014-01-01

    Background: The physics of open quantum systems is an interdisciplinary area of research. The nuclear "openness" manifests itself through the presence of the many-body continuum representing various decay, scattering, and reaction channels. As radioactive nuclear beam experimentation extends the known nuclear landscape toward the particle drip lines, the coupling to the continuum space becomes exceedingly more important. Of particular interest are weakly bound and unbound nuclear states appearing around particle thresholds. Theories of such nuclei must take into account their open quantum nature. Purpose: To describe open quantum systems, we introduce a complex-scaling (CS) approach in the Slater basis. We benchmark it with the complex-energy Gamow shell model (GSM) by studying energies and wave functions of the bound and unbound states of the two-neutron halo nucleus 6He, viewed as an α+n+n cluster system. Methods: Both CS and GSM approaches are applied to a translationally invariant Hamiltonian with the two-body interaction approximated by the finite-range central Minnesota force. In the CS approach, we use the Slater basis, which exhibits the correct asymptotic behavior at large distances. To extract particle densities from the back-rotated CS solutions, we apply the Tikhonov regularization procedure, which minimizes the ultraviolet numerical noise. Results: We show that the CS-Slater method is both accurate and efficient. Its equivalence to the GSM approach has been demonstrated numerically for both energies and wave functions of 6He. One important technical aspect of our calculation was to fully retrieve the correct asymptotic behavior of a resonance state from the complex-scaled (square-integrable) wave function. While standard applications of the inverse complex transformation to the complex-rotated solution provide unstable results, the stabilization method fully reproduces the GSM benchmark. We also propose a method to determine the smoothing
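
    For reference, the complex-scaling transformation underlying the CS approach can be sketched in LaTeX (standard textbook material, not results from this paper):

      % Radial coordinates are rotated into the complex plane by an angle theta:
      r \;\longrightarrow\; r\,e^{i\theta}, \qquad H_\theta = U(\theta)\, H\, U(\theta)^{-1}
      % Bound-state energies stay fixed, continuum cuts rotate by -2\theta about
      % their thresholds, and resonances emerge as square-integrable eigenstates
      % with complex eigenvalues E = E_r - i\,\Gamma/2.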

  6. Complex Causal Process Diagrams for Analyzing the Health Impacts of Policy Interventions

    PubMed Central

    Joffe, Michael; Mindell, Jennifer

    2006-01-01

    Causal diagrams are rigorous tools for controlling confounding. They also can be used to describe complex causal systems, which is done routinely in communicable disease epidemiology. The use of change diagrams has advantages over static diagrams, because change diagrams are more tractable, relate better to interventions, and have clearer interpretations. Causal diagrams are a useful basis for modeling. They make assumptions explicit, provide a framework for analysis, generate testable predictions, explore the effects of interventions, and identify data gaps. Causal diagrams can be used to integrate different types of information and to facilitate communication both among public health experts and between public health experts and experts in other fields. Causal diagrams allow the use of instrumental variables, which can help control confounding and reverse causation. PMID:16449586

  7. Problems in processing multizonal video information at specialized complexes

    NASA Technical Reports Server (NTRS)

    Shamis, V. A.

    1979-01-01

    Architectural requirements of a minicomputer-based specialized complex for automated digital analysis of multizonal video data are examined. The logic structure of multizonal video data and the complex mathematical provision required for the analysis of such data are described. The composition of the specialized complex, its operating system, and the required set of peripheral devices are discussed. It is noted that although much of the analysis can be automated, the operator-computer dialog mode is essential for certain stages of the analysis.

  8. The Fallacy of Univariate Solutions to Complex Systems Problems.

    PubMed

    Lessov-Schlaggar, Christina N; Rubin, Joshua B; Schlaggar, Bradley L

    2016-01-01

    Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems-univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument. PMID:27375425

  9. The Fallacy of Univariate Solutions to Complex Systems Problems

    PubMed Central

    Lessov-Schlaggar, Christina N.; Rubin, Joshua B.; Schlaggar, Bradley L.

    2016-01-01

    Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems—univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument. PMID:27375425

  10. Asbestos quantification in track ballast, a complex analytical problem

    NASA Astrophysics Data System (ADS)

    Cavallo, Alessandro

    2016-04-01

    Track ballast forms the trackbed upon which railroad ties are laid. It is used to bear the load of the railroad ties, to facilitate water drainage, and to keep down vegetation. It is typically made of angular crushed stone, with a grain size between 30 and 60 mm and good mechanical properties (high compressive strength, freeze-thaw resistance, resistance to fragmentation). The most common rock types are basalts, porphyries, orthogneisses, some carbonatic rocks, and "green stones" (serpentinites, prasinites, amphibolites, metagabbros). "Green stones" in particular may contain traces, and sometimes appreciable amounts, of asbestiform minerals (chrysotile and/or fibrous amphiboles, generally tremolite-actinolite). In Italy, the chrysotile asbestos mine in Balangero (Turin) produced over 5 Mt of railroad ballast (crushed serpentinites), which was used for the railways in northern and central Italy from 1930 up to 1990. In addition to Balangero, several other serpentinite and prasinite quarries (e.g., in Emilia Romagna) provided railway ballast up to the year 2000. The legal threshold for asbestos content in track ballast is set at 1000 ppm: if the value is below this threshold, the material can be reused; otherwise it must be disposed of as hazardous waste, at very high cost. The quantitative determination of asbestos in rocks is a very complex analytical issue: although techniques like TEM-SAED and micro-Raman are very effective in the identification of asbestos minerals, a quantitative determination on bulk materials is almost impossible, or very expensive and time consuming. Another problem is the discrimination of asbestiform minerals (e.g., chrysotile, asbestiform amphiboles) from the common acicular or pseudo-fibrous varieties (lamellar serpentine minerals, prismatic/acicular amphiboles). In this work, more than 200 samples from the main Italian rail yards were characterized by a combined use of XRD and a special SEM

  11. Problem analysis of geotechnical well drilling in complex environment

    NASA Astrophysics Data System (ADS)

    Kasenov, A. K.; Biletskiy, M. T.; Ratov, B. T.; Korotchenko, T. V.

    2015-02-01

    The article examines the primary causes of problems occurring during the drilling of geotechnical wells (injection, production, and monitoring wells) for in-situ leaching to extract uranium in South Kazakhstan. Hole caving, a drilling problem caused by various chemical and physical factors (hydraulic, mechanical, etc.), has been thoroughly investigated. The analysis of packing causes reveals that this problem usually occurs because of an insufficient amount of drilling mud, associated with a small-cross-section downward flow and a relatively large-cross-section upward flow. This is explained by the fact that when spear bores are used to drill clay rocks, the cuttings are usually rather large and the clay particles risk coagulating.

  12. [Problems of formal organizational structure of industrial health care complexes].

    PubMed

    Włodarczyk, C

    1978-01-01

    The author formulates the thesis that describing the organizational structure of an industrial health care complex calls for isolating the following aspects: structure of territorial links; system of organizational units and divisions; organization of basic functions; structure of management; structure of supervision of middle- and lower-level personnel; composition of the health care complex council; and system of accessibility ranges. Each of the above aspects has been considered on the basis of operative rules of law, using organizational analysis methods. PMID:745544

  13. Client-Centered Problem-Solving Networks in Complex Organizations.

    ERIC Educational Resources Information Center

    Tucker, Charles; Hanna, Michael

    Employees in different kinds of organizations were surveyed for their perceptions of their companies' client and operational problem-solving networks. The individuals came from a manufacturing firm, a community college, a telephone company, a farmers' cooperative, and a hospital. Interviews were conducted with those people reporting numerous…

  14. The Teaching-Upbringing Complex: Experience, Problems, Prospects.

    ERIC Educational Resources Information Center

    Vul'fov, B. Z.; And Others

    1990-01-01

    Describes the teaching-upbringing complex (UVK), a new type of Soviet school that attempts to deal with raising and educating children in an integrated manner. Stresses combining required subjects with students' special interests to encourage student achievement and teacher involvement. Concentrates on the development of self-expression and…

  15. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems.

    PubMed

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-03-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  16. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems

    PubMed Central

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-01-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  17. On the Complexity of the Asymmetric VPN Problem

    NASA Astrophysics Data System (ADS)

    Rothvoß, Thomas; Sanità, Laura

    We give the first constant-factor approximation algorithm for the asymmetric Virtual Private Network (VPN) problem with arbitrary concave costs. We show the stronger result that there is always a tree solution of cost at most 2·OPT, and that a tree solution of (expected) cost at most 49.84·OPT can be determined in polynomial time.

  18. Problem-oriented stereo vision quality evaluation complex

    NASA Astrophysics Data System (ADS)

    Sidorchuk, D.; Gusamutdinova, N.; Konovalenko, I.; Ershov, E.

    2015-12-01

    We describe an original low-cost hardware setup for efficient testing of stereo vision algorithms. The method combines a special hardware setup with a mathematical model; it is easy to construct and precise in the applications of interest. For a known scene we derive an analytical representation, called the virtual scene. Using a four-point correspondence between the real scene and the virtual one, we compute the extrinsic camera parameters and project the virtual scene onto the image plane, which yields the ground truth for the depth map. Another result presented in this paper is a new depth-map quality metric. Its main purpose is to tune stereo algorithms for a particular problem, e.g. obstacle avoidance.
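
    The pose-from-correspondences step can be sketched with OpenCV (a hedged illustration; the intrinsics, points, and names are assumed, not the authors' code):

      import numpy as np
      import cv2

      K = np.array([[800.0, 0.0, 320.0],      # assumed camera intrinsics
                    [0.0, 800.0, 240.0],
                    [0.0, 0.0, 1.0]])
      # Four coplanar reference points of the virtual scene (illustrative units).
      obj_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
      # Synthesize their image positions from a known pose, standing in for the
      # manually measured four-point correspondence.
      rvec_true = np.array([0.1, -0.2, 0.05])
      tvec_true = np.array([0.0, 0.0, 3.0])
      img_pts, _ = cv2.projectPoints(obj_pts, rvec_true, tvec_true, K, None)

      # Extrinsic parameters from the four-point correspondence.
      ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, None)

      # Project any virtual-scene point; its camera-frame z is the ground-truth
      # depth at the projected pixel location.
      R, _ = cv2.Rodrigues(rvec)
      pt = np.array([0.5, 0.5, 0.25])
      depth = (R @ pt + tvec.ravel())[2]
      uv, _ = cv2.projectPoints(pt.reshape(1, 3), rvec, tvec, K, None)
      print(depth, uv.ravel())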

  19. On the problem of constructing a modern, economic radiotelescope complex

    NASA Technical Reports Server (NTRS)

    Bogomolov, A. F.; Sokolov, A. G.; Poperechenko, B. A.; Polyak, V. S.

    1977-01-01

    Criteria for comparing and planning the technical and economic characteristics of large parabolic reflector antenna systems and other types used in radioastronomy and deep space communications are discussed. The experience gained in making and optimizing a series of highly efficient parabolic antennas in the USSR is reviewed. Several ways are indicated for further improving the complex characteristics of antennas similar to the original TNA-1500 64m radio telescope. The suggestions can be applied in planning the characteristics of radiotelescopes which are now being built, in particular, the TNA-8000 with a diameter of 128 m.

  20. Assessing Complex Problem-Solving Skills and Knowledge Assembly Using Web-Based Hypermedia Design.

    ERIC Educational Resources Information Center

    Dabbagh, Nada

    This research project studied the effects of hierarchical versus heterarchical hypermedia structures of Web-based case representations on complex problem-solving skills and knowledge assembly in problem-centered learning environments in order to develop a system or model that informs the design of Web-based cases for ill-structured problems across…

  1. Complex multipole beam approach to electromagnetic scattering problems

    NASA Astrophysics Data System (ADS)

    Mittra, Raj; Boag, Amir

    1994-03-01

    A novel approach to reducing the matrix size associated with the Method of Moments (MoM) solution of the problem of electromagnetic scattering from arbitrarily shaped closed bodies is presented in this paper. The key step in this approach is to represent the scattered field in terms of a series of beams produced by multipole sources that resemble the Gabor basis functions. By utilizing the properties of the Gabor series, guidelines for selecting the orders as well as the locations of the multipole sources are developed. It is shown that the present approach not only reduces the number of unknowns, but also generates a generalized impedance matrix with a banded structure and a low condition number. The accuracy of the proposed method is verified by comparing the numerical results with those derived using the method of moments.

  2. How Students Circumvent Problem-Solving Strategies that Require Greater Cognitive Complexity.

    ERIC Educational Resources Information Center

    Niaz, Mansoor

    1996-01-01

    Analyzes the great diversity in problem-solving strategies used by students in solving a chemistry problem and discusses the relationship between these strategies and different cognitive variables. Concludes that students try to circumvent certain problem-solving strategies by adapting flexible and stylistic innovations that render the cognitive…

  3. Dusty (complex) plasmas: recent developments, advances, and unsolved problems

    NASA Astrophysics Data System (ADS)

    Popel, Sergey

    The area of dusty (complex) plasma research is a vibrant subfield of plasma physics that belongs to frontier research in the physical sciences. This area is intrinsically interdisciplinary and encompasses astrophysics, planetary science, atmospheric science, magnetic fusion energy science, and various applied technologies. Research in dusty plasmas started after two major discoveries in very different areas: (1) the discovery by the Voyager 2 spacecraft in 1980 of the radial spokes in Saturn's B ring, and (2) the discovery in the early 1980s of the growth of contaminating dust particles in plasma processing. Dusty plasmas are ubiquitous in the universe; examples are proto-planetary and solar nebulae, molecular clouds, supernova explosions, the interplanetary medium, circumsolar rings, and asteroids. Within the solar system, we have planetary rings (e.g., Saturn and Jupiter), the Martian atmosphere, cometary tails and comae, dust clouds on the Moon, etc. Close to the Earth, there are noctilucent clouds and polar mesospheric summer echoes, which are clouds of tiny (charged) ice particles that are formed in the summer polar mesosphere at altitudes of about 82-95 km. Dust and dusty plasmas are also found in the vicinity of artificial satellites and space stations. Dust also turns out to be common in laboratory plasmas, such as in the processing of semiconductors and in tokamaks. In processing plasmas, dust particles are actually grown in the discharge from the reactive gases used to form the plasmas. An example of the relevance of industrial dusty plasmas is the growth of silicon microcrystals for improved solar cells in the future. In fact, nanostructured polymorphous silicon films provide solar cells with high and time-stable efficiency. These nanomaterials can also be used for the fabrication of ultra-large-scale integration circuits, display devices, single-electron devices, light-emitting diodes, laser diodes, and others. In microelectronic industries, dust has to be

  4. Measurements of student understanding on complex scientific reasoning problems

    NASA Astrophysics Data System (ADS)

    Izumi, Alisa Sau-Lin

    While there has been much discussion of the cognitive processes underlying effective scientific teaching, less is known about the response nature of assessments targeting processes of scientific reasoning specific to biology content. This study used multiple-choice (m-c) and short-answer essay student responses to evaluate progress in higher-order reasoning skills. In a pilot investigation of student responses on a non-content-based test of scientific thinking, it was found that some students showed a pre-post gain on the m-c test version while showing no gain on a short-answer essay version of the same questions. This result led to a subsequent research project focused on differences between alternate versions of tests of scientific reasoning. Using m-c and written responses from biology tests targeting the skills of (1) reasoning with a model and (2) designing controlled experiments, test-score frequencies, factor analyses, and regression models were used to explore test format differences. Understanding the format differences in tests is important for the development of practical ways to identify student gains in scientific reasoning. The overall results suggested test format differences. Factor analysis revealed three interpretable factors: m-c format, genetics content, and model-based reasoning. Frequency distributions on the m-c and open-explanation portions of the hybrid items revealed that many students answered the m-c portion of an item correctly but gave inadequate explanations. In other instances students answered the m-c portion incorrectly yet demonstrated sufficient explanation, or answered the m-c correctly and also provided poor explanations. When trying to fit test-score predictors for non-associated student measures (VSAT, MSAT, high school grade point average, or final course grade), the test scores accounted for close to zero percent of the variance. Overall, these results point to the importance of using multiple methods of testing and of

  5. Sleep, Cognition, and Behavioral Problems in School-Age Children: A Century of Research Meta-Analyzed

    ERIC Educational Resources Information Center

    Astill, Rebecca G.; Van der Heijden, Kristiaan B.; Van IJzendoorn, Marinus H.; Van Someren, Eus J. W.

    2012-01-01

    Clear associations of sleep, cognitive performance, and behavioral problems have been demonstrated in meta-analyses of studies in adults. This meta-analysis is the first to systematically summarize all relevant studies reporting on sleep, cognition, and behavioral problems in healthy school-age children (5-12 years old) and incorporates 86 studies…

  6. One Problem, Many Solutions: Simple Statistical Approaches Help Unravel the Complexity of the Immune System in an Ecological Context

    PubMed Central

    Matson, Kevin D.; Tieleman, B. Irene

    2011-01-01

    The immune system is a complex collection of interrelated and overlapping solutions to the problem of disease. To deal with this complexity, researchers have devised multiple ways to measure immune function and to analyze the resulting data. In this way both organisms and researchers employ many tactics to solve a complex problem. One challenge facing ecological immunologists is the question of how these many dimensions of immune function can be synthesized to facilitate meaningful interpretations and conclusions. We tackle this challenge by employing and comparing several statistical methods, which we used to test assumptions about how multiple aspects of immune function are related at different organizational levels. We analyzed three distinct datasets that characterized 1) species, 2) subspecies, and 3) among- and within-individual level differences in the relationships among multiple immune indices. Specifically, we used common principal components analysis (CPCA) and two simpler approaches, pair-wise correlations and correlation circles. We also provide a simple example of how these techniques could be used to analyze data from multiple studies. Our findings lead to several general conclusions. First, relationships among indices of immune function may be consistent among some organizational groups (e.g. months over the annual cycle) but not others (e.g. species); therefore any assumption of consistency requires testing before further analyses. Second, simple statistical techniques used in conjunction with more complex multivariate methods give a clearer and more robust picture of immune function than using complex statistics alone. Moreover, these simpler approaches have potential for analyzing comparable data from multiple studies, especially as the field of ecological immunology moves towards greater methodological standardization. PMID:21526186
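
    One of the simpler techniques named above, pairwise correlations computed per organizational group, can be sketched as follows (our illustration; the column and group names are hypothetical):

      import numpy as np
      import pandas as pd

      def groupwise_correlations(df: pd.DataFrame, group_col: str, indices: list) -> dict:
          """One correlation matrix of the immune indices per group."""
          return {g: sub[indices].corr(method="spearman")
                  for g, sub in df.groupby(group_col)}

      rng = np.random.default_rng(1)
      df = pd.DataFrame({
          "species":       np.repeat(["stonechat", "skylark"], 50),
          "haptoglobin":   rng.normal(size=100),
          "lysis":         rng.normal(size=100),
          "agglutination": rng.normal(size=100),
      })
      mats = groupwise_correlations(df, "species",
                                    ["haptoglobin", "lysis", "agglutination"])
      # Large disagreement between groups warns against assuming one shared
      # correlation structure before pooling (the consistency test above).
      print((mats["stonechat"] - mats["skylark"]).abs().max().max())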

  7. Mass analyzed threshold ionization of phenolṡCO: Intermolecular binding energies of a hydrogen-bonded complex

    NASA Astrophysics Data System (ADS)

    Haines, Stephen R.; Dessent, Caroline E. H.; Müller-Dethlefs, Klaus

    1999-08-01

    [Phenol·CO]+ was studied using a combination of two-color resonant zero kinetic energy (ZEKE) spectroscopy and mass analyzed threshold ionization (MATI) spectroscopy to investigate the interaction of the CO ligand with a hydrogen-bonding cation. Vibrational progressions were observed in three intermolecular modes, the in-plane bend (42 cm-1), stretch (130 cm-1), and in-plane wag (160 cm-1), and are consistent with a planar hydrogen-bonded structure where the CO bonds through the carbon atom to the phenol OH group. Dissociation energies for the S0, S1, and D0 states were determined as 659±20, 849±20, and 2425±10 cm-1, respectively. The cationic and neutral dissociation energies of the phenol·CO complex are considerably stronger than those of phenol·N2, demonstrating the extent to which the larger quadrupole of CO affects the strength of binding.

  8. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…
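
    The design-based option can be sketched with statsmodels (a hedged illustration; the model and variable names are ours): fit a single-level regression and replace the naive standard errors with cluster-robust sandwich estimates:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n_schools, n_students = 30, 20
      school = np.repeat(np.arange(n_schools), n_students)
      u = rng.normal(size=n_schools)[school]        # shared school-level effect
      x = rng.normal(size=school.size)
      y = 1.0 + 0.5 * x + u + rng.normal(size=school.size)

      X = sm.add_constant(pd.DataFrame({"x": x}))
      naive = sm.OLS(y, X).fit()                    # ignores the clustering
      robust = sm.OLS(y, X).fit(cov_type="cluster", cov_kwds={"groups": school})

      # The sandwich correction inflates the standard errors, most visibly for
      # terms that are constant within clusters (here, the intercept).
      print(naive.bse)
      print(robust.bse)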

  9. Communities of Practice: A New Approach to Solving Complex Educational Problems

    ERIC Educational Resources Information Center

    Cashman, J.; Linehan, P.; Rosser, M.

    2007-01-01

    Communities of Practice offer state agency personnel a promising approach for engaging stakeholder groups in collaboratively solving complex and, often, persistent problems in special education. Communities of Practice can help state agency personnel drive strategy, solve problems, promote the spread of best practices, develop members'…

  10. An Exploratory Framework for Handling the Complexity of Mathematical Problem Posing in Small Groups

    ERIC Educational Resources Information Center

    Kontorovich, Igor; Koichu, Boris; Leikin, Roza; Berman, Avi

    2012-01-01

    The paper introduces an exploratory framework for handling the complexity of students' mathematical problem posing in small groups. The framework integrates four facets known from past research: task organization, students' knowledge base, problem-posing heuristics and schemes, and group dynamics and interactions. In addition, it contains a new…

  11. The Ethnology of Traditional and Complex Societies. Test Edition. AAAS Study Guides on Contemporary Problems.

    ERIC Educational Resources Information Center

    Simic, Andrei

    This is one of several study guides on contemporary problems produced by the American Association for the Advancement of Science with support of the National Science Foundation. This guide focuses on the ethnology of traditional and complex societies. Part I, Simple and Complex Societies, includes three sections: (1) Introduction: Anthropologists…

  12. Development and operation of an integrated sampling probe and gas analyzer for turbulent mixing studies in complex supersonic flows

    NASA Astrophysics Data System (ADS)

    Wiswall, John D.

    …the spatio-temporal characteristic scales of the flow on the resulting time-area-averaged concentration measurements. Two series of experiments were performed to verify the probe's design; the first used Schlieren photography and verified that the probe sampled isokinetically from the supersonic flowfield. The second series involved traversing the probe across a free mixing layer of air and helium to obtain both mean-concentration and high-frequency measurements. The high-frequency data were statistically analyzed, and inspection of the Probability Density Function (PDF) of the hot-film response was instrumental in interpreting how well the resulting average mixing measurements represent these types of complex flows. The probe is minimally intrusive, has accuracy comparable to its predecessors, has an improved frequency response for mean concentration measurements, and samples from a very small area in the flowfield.

  13. Complex Cervical Aortic Arch With Hypoplasia: A Simple Solution to a Complex Problem.

    PubMed

    Rajbanshi, Bijoy G; Gautam, Navin C; Pradhan, Sidhartha; Sharma, Apurb; Ghimire, Ram K; Joyce, Lyle D

    2016-07-01

    We report a rare case of a 6-year-old boy with a complex right-sided cervical aortic arch, a retroesophageal hypoplastic transverse arch, a left subclavian artery arising from the Kommerell diverticulum of the descending aorta, and a vascular ring formed by the ductus ligament. An extra-anatomic ascending-to-descending aorta bypass was performed through a median sternotomy along with division of the ductus ligament, without complications and with good results. PMID:27343523

  14. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    PubMed

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of the patterns and intricacies of drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system. PMID:26790689

  15. Conceptual and procedural knowledge community college students use when solving a complex science problem

    NASA Astrophysics Data System (ADS)

    Steen-Eibensteiner, Janice Lee

    2006-07-01

    A strong science knowledge base and problem-solving skills have always been highly valued for employment in the science industry. Skills currently needed for employment include being able to problem solve (Overtoom, 2000). Academia also recognizes the need to effectively teach students to apply problem-solving skills in clinical settings. This thesis investigates how students solve complex science problems in an academic setting in order to inform the development of problem-solving skills for the workplace. Students' use of problem-solving skills, in the form of learned concepts and procedural knowledge, was studied as students completed a problem that might come up in real life. Students were taking a community college sophomore biology course, Human Anatomy & Physiology II. The problem topic was negative feedback inhibition of the thyroid and parathyroid glands. The research questions answered were: (1) How well do community college students use a complex of conceptual knowledge when solving a complex science problem? (2) What conceptual knowledge are community college students using correctly, incorrectly, or not using when solving a complex science problem? (3) What problem-solving procedural knowledge are community college students using successfully, unsuccessfully, or not using when solving a complex science problem? From the whole class, the high academic level participants performed at a mean of 72% correct on chapter test questions, a low average to fair grade of C-. The middle and low academic participants both failed (F) the test questions (37% and 30%, respectively); 29% (9/31) of the students showed only a fair performance, while 71% (22/31) failed. From the subset sample population of 2 students each from the high, middle, and low academic levels selected from the whole class, 35% (8/23) of the concepts were used effectively, 22% (5/23) marginally, and 43% (10/23) poorly. Only 1 concept was used incorrectly by 3/6 of the students and identified as

  16. Thinking Problems of the Present Collision Warning Work by Analyzing the Intersection Between Cosmos 2251 and Iridium 33

    NASA Astrophysics Data System (ADS)

    Wang, R. L.; Liu, W.; Yan, R. D.; Gong, J. C.

    2013-08-01

    After the Cosmos 2251 and Iridium 33 collision breakup event, institutions at home and abroad began collision warning analyses of the event. This paper compares the results from the different research units, discusses the problems of current collision warning work, and then gives suggestions for further study.

  17. Temporality Matters: Advancing a Method for Analyzing Problem-Solving Processes in a Computer-Supported Collaborative Environment

    ERIC Educational Resources Information Center

    Kapur, Manu

    2011-01-01

    This paper argues for a need to develop methods for examining temporal patterns in computer-supported collaborative learning (CSCL) groups. It advances one such quantitative method--Lag-sequential Analysis (LsA)--and instantiates it in a study of problem-solving interactions of collaborative groups in an online, synchronous environment. LsA…
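
    The core LsA computation can be sketched as follows (a hedged illustration; the event codes are hypothetical): tabulate lag-1 transitions between coded events and compare observed counts with those expected under independence:

      from collections import Counter

      def lag1_transitions(sequence):
          """Observed lag-1 transition counts and independence-based expectations."""
          pairs = list(zip(sequence, sequence[1:]))
          obs = Counter(pairs)
          base = Counter(sequence[:-1])       # "given" events
          follow = Counter(sequence[1:])      # "target" events
          n = len(pairs)
          exp = {(a, b): base[a] * follow[b] / n for a in base for b in follow}
          return obs, exp

      # Toy coded interaction sequence from a collaborative session.
      seq = ["question", "answer", "question", "answer", "elaborate",
             "question", "answer", "elaborate", "elaborate", "question"]
      obs, exp = lag1_transitions(seq)
      for pair in sorted(obs):
          print(pair, obs[pair], round(exp[pair], 2))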

  18. Methods and Challenges of Analyzing Spatial Data for Social Work Problems: The Case of Examining Child Maltreatment Geographically

    ERIC Educational Resources Information Center

    Freisthler, Bridget; Lery, Bridgette; Gruenewald, Paul J.; Chow, Julian

    2006-01-01

    Increasingly, social work researchers are interested in examining how "place" and "location" contribute to social problems. Yet, often these researchers do not use the specialized spatial statistical techniques developed to handle the analytic issues faced when conducting ecological analyses. This article explains the importance of these…

  19. Analyzing Multiple Informant Data on Child and Adolescent Behavior Problems: Predictive Validity and Comparison of Aggregation Procedures

    ERIC Educational Resources Information Center

    van Dulmen, Manfred H. M.; Egeland, Byron

    2011-01-01

    We compared the predictive validity of five aggregation methods for multiple informant data on child and adolescent behavior problems. In addition, we compared the predictive validity of these aggregation methods with single informant scores. Data were derived from the Minnesota Longitudinal Study of Parents and Children (N = 175). Maternal and…

  20. Exhaustive expansion: A novel technique for analyzing complex data generated by higher-order polychromatic flow cytometry experiments

    PubMed Central

    2010-01-01

    Background: The complex data sets generated by higher-order polychromatic flow cytometry experiments are a challenge to analyze. Here we describe Exhaustive Expansion, a data analysis approach for deriving hundreds to thousands of cell phenotypes from raw data, and for interrogating these phenotypes to identify populations of biological interest given the experimental context.

    Methods: We apply this approach to two studies, illustrating its broad applicability. The first examines the longitudinal changes in circulating human memory T cell populations within individual patients in response to a melanoma peptide (gp100209-2M) cancer vaccine, using 5 monoclonal antibodies (mAbs) to delineate subpopulations of viable, gp100-specific, CD8+ T cells. The second study measures the mobilization of stem cells in porcine bone marrow that may be associated with wound healing, and uses 5 different staining panels consisting of 8 mAbs each.

    Results: In the first study, our analysis suggests that the cell surface markers CD45RA, CD27 and CD28, commonly used in historical lower-order (2-4 color) flow cytometry analysis to distinguish memory from naïve and effector T cells, may not be obligate parameters in defining central memory T cells (TCM). In the second study, we identify novel phenotypes such as CD29+CD31+CD56+CXCR4+CD90+Sca1-CD44+, which may characterize progenitor cells that are significantly increased in wounded animals as compared to controls.

    Conclusions: Taken together, these results demonstrate that Exhaustive Expansion supports thorough interrogation of complex higher-order flow cytometry data sets and aids in the identification of potentially clinically relevant findings. PMID:21034498
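
    The expansion idea can be sketched as follows (our illustration, not the authors' software; marker names are hypothetical, and gating to boolean +/- calls is assumed to have happened upstream):

      from itertools import combinations, product
      import numpy as np
      import pandas as pd

      def exhaustive_expansion(cells: pd.DataFrame) -> pd.Series:
          """cells: boolean DataFrame, one row per cell, one column per marker.
          Returns cell counts for every +/- combination of every marker subset."""
          counts = {}
          markers = list(cells.columns)
          for r in range(1, len(markers) + 1):
              for subset in combinations(markers, r):
                  for states in product([True, False], repeat=r):
                      mask = np.ones(len(cells), dtype=bool)
                      for m, s in zip(subset, states):
                          mask &= (cells[m].to_numpy() == s)
                      name = "".join(f"{m}{'+' if s else '-'}"
                                     for m, s in zip(subset, states))
                      counts[name] = int(mask.sum())
          return pd.Series(counts)

      # Toy gated data: 1000 cells, 3 markers -> 26 derived phenotypes.
      rng = np.random.default_rng(3)
      cells = pd.DataFrame(rng.random((1000, 3)) < 0.4,
                           columns=["CD8", "CD27", "CD28"])
      print(exhaustive_expansion(cells).sort_values(ascending=False).head())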

  1. Analyzing the Effects of a Mathematics Problem-Solving Program, Exemplars, on Mathematics Problem-Solving Scores with Deaf and Hard-of-Hearing Students

    ERIC Educational Resources Information Center

    Chilvers, Amanda Leigh

    2013-01-01

    Researchers have noted that mathematics achievement for deaf and hard-of-hearing (d/hh) students has been a concern for many years, including the ability to problem solve. This quasi-experimental study investigates the use of the Exemplars mathematics program with students in grades 2-8 in a school for the deaf that utilizes American Sign Language…

  2. Processing and Correcting MASTER Images to Analyze and Map Metamorphic Core Complexes in the Southern Basin and Range Province

    NASA Astrophysics Data System (ADS)

    Sanchez, S. O.

    2004-12-01

    Metamorphic core complexes (MCCs) have been of great interest to geologists and geophysicists, and our goal is to facilitate integrated studies of these intriguing features. Our specific targets are the exposed Whipple Mountains in southeastern California and the spectrally similar Mohave Mountains in western Arizona. These two ranges were selected because of their close proximity to each other in the imagery and were imaged with the MODIS/ASTER airborne sensor, also known as MASTER; NASA/JPL acquired the data for us. This sensor was chosen because it has good spatial resolution (15 m) and 50 different bands ranging from the visible to the thermal infrared. However, because it is flown on a light aircraft, its flight-line patterns and photogrammetric distortions make it hard to georeference and mosaic with images from adjacent flight lines; the distortions become misalignments of images during mosaicking. This project involved two efforts: 1) developing a method for correcting and processing MASTER multispectral images; and 2) using those images to analyze and map MCCs in the southern Basin and Range Province. Standard image processing techniques available within the ENVI software package were applied to this imagery to geometrically correct, mosaic, and spectrally process it in order to locate defining characteristics of MCCs that are mappable with the imagery. These techniques include warping, histogram matching, mosaicking, classification, Principal Component Analysis, decorrelation stretching, Minimum Noise Fraction transformation, the Pixel Purity Index, and endmember analysis.
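
    Among the listed techniques, Principal Component Analysis is compact enough to sketch. A minimal band-space PCA on a toy image cube (random data standing in for MASTER imagery; ENVI performs the equivalent operation internally):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    cube = rng.normal(size=(100, 100, 5))       # rows x cols x spectral bands
    pixels = cube.reshape(-1, cube.shape[-1])   # one row per pixel

    pixels = pixels - pixels.mean(axis=0)       # band-wise mean removal
    cov = np.cov(pixels, rowvar=False)          # 5 x 5 band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
    order = eigvals.argsort()[::-1]             # sort PCs by explained variance
    pc_bands = (pixels @ eigvecs[:, order]).reshape(cube.shape)
    print(pc_bands.shape)                       # (100, 100, 5), PC1 first
    ```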

  3. Performance of isotope ratio infrared spectroscopy (IRIS) for analyzing waters containing organic contaminants: Problems and solutions (Invited)

    NASA Astrophysics Data System (ADS)

    West, A. G.; Goldsmith, G. R.; Dawson, T. E.

    2010-12-01

    The development of isotope ratio infrared spectroscopy (IRIS) for simultaneous δ2H and δ18O analysis of liquid water samples shows much potential for affordable, simple and potentially portable isotopic analyses. IRIS has been shown to be comparable in precision and accuracy to isotope ratio mass spectrometry (IRMS) when analyzing pure water samples. However, recent studies have shown that organic contaminants in analyzed water samples may interfere with the spectroscopy, leading to errors of considerable magnitude in the reported stable isotope data. Many environmental, biological and forensic studies require analyses of water containing organic contaminants in some form, yet our current methods of removing organic contaminants prior to analysis appear inadequate for IRIS. Treated plant water extracts analyzed by IRIS showed deviations as large as 35‰ (δ2H) and 11.8‰ (δ18O) from the IRMS value, indicating that trace amounts of contaminants were sufficient to disrupt IRIS analyses. However, not all organic contaminants negatively influence IRIS. For such samples, IRIS presents a labour-saving method relative to IRMS. Prior to widespread use in the environmental, biological and forensic sciences, a means of obtaining reliable data from IRIS needs to be demonstrated. One approach is to use instrument-based software to flag potentially problematic spectra and output a corrected isotope value based on analysis of the spectra. We evaluate this approach on two IRIS systems and discuss the way forward for ensuring accurate stable isotope data using IRIS.

  4. Medicines counterfeiting is a complex problem: a review of key challenges across the supply chain.

    PubMed

    Tremblay, Michael

    2013-02-01

    The paper begins by asking why there is a market for counterfeit medicines, which in effect creates the problem of counterfeiting itself. Contributing factors include supply chain complexity and the lack of whole-systems thinking. These two underpin the author's view that counterfeiting is a complex (i.e. wicked) problem, and that corporate, public policy and regulatory actions need to be mindful of how their actions may be causal. The paper offers a problem-based review of key components of this complexity, viz., the knowledge end-users/consumers have of medicines; whether restrictive information policies may hamper information provision to patients; the internet's direct access to consumers; internet-enabled distribution of unsafe and counterfeit medicines; whether the internet is a parallel and competitive supply chain to legitimate routes; organised crime as an emerging medicines manufacturer and supplier; and whether substandard medicines are really the bigger problem. The solutions offered respect the perceived complexity of the supply chain challenges. The paper identifies the need to avoid technologically-driven solutions, calling for 'technological agnosticism'. Both regulation and public policy need to reflect the dynamic nature of the problem and avoid creating perverse incentives; it may be, for instance, that medicines pricing and reimbursement policies, which affect consumer/patient access, may act as market signals to counterfeiters, since this creates a cash market in cheaper drugs. PMID:23656447

  5. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving. PMID:22815065

  6. On the Critical Behaviour, Crossover Point and Complexity of the Exact Cover Problem

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Smelyanskiy, Vadim N.; Shumow, Daniel; Koga, Dennis (Technical Monitor)

    2003-01-01

    Research into quantum algorithms for NP-complete problems has rekindled interest in the detailed study of a broad class of combinatorial problems. A recent paper applied the quantum adiabatic evolution algorithm to the Exact Cover problem for 3-sets (EC3), and provided empirical evidence that the algorithm was polynomial. In this paper we provide a detailed study of the characteristics of the exact cover problem. We present the annealing approximation applied to EC3, which gives an over-estimate of the phase transition point. We also identify the phase transition point empirically. We then study the complexity of two classical algorithms on this problem: Davis-Putnam and Simulated Annealing. For these algorithms, EC3 is significantly easier than 3-SAT.
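
    Since the abstract leans on the definition, it may help to spell it out: an exact cover is a subcollection of the given sets that is pairwise disjoint and covers the universe exactly once, and EC3 restricts every set to exactly three elements. A brute-force sketch under those definitions (illustrative only; unrelated to the Davis-Putnam and annealing solvers benchmarked in the paper):

    ```python
    from itertools import combinations

    def exact_covers(universe, sets):
        """Yield subcollections whose members are pairwise disjoint
        and whose union is the whole universe (exponential brute force,
        suitable only for tiny instances like this sketch)."""
        for r in range(1, len(sets) + 1):
            for combo in combinations(sets, r):
                chosen = [x for s in combo for x in s]
                if (len(chosen) == len(set(chosen)) == len(universe)
                        and set(chosen) == set(universe)):
                    yield combo

    # EC3 instance: every set has exactly three elements.
    U = {1, 2, 3, 4, 5, 6}
    S = [frozenset({1, 2, 3}), frozenset({4, 5, 6}),
         frozenset({1, 4, 5}), frozenset({2, 3, 6})]
    print(list(exact_covers(U, S)))
    # -> both ({1,2,3},{4,5,6}) and ({1,4,5},{2,3,6}) are exact covers
    ```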

  7. Learning about Complex Multi-Stakeholder Issues: Assessing the Visual Problem Appraisal

    ERIC Educational Resources Information Center

    Witteveen, Loes; Put, Marcel; Leeuwis, Cees

    2010-01-01

    This paper presents an evaluation of the visual problem appraisal (VPA) learning environment in higher education. The VPA has been designed for the training of competences that are required in complex stakeholder settings in relation to sustainability issues. The design of VPA incorporates a diversity of instruction strategies to accommodate the…

  8. Ecosystem services and cooperative fisheries research to address a complex fishery problem

    EPA Science Inventory

    The St. Louis River represents a complex fishery management problem. Current fishery management goals have to be developed taking into account bi-state commercial, subsistence and recreational fisheries which are valued for different characteristics by a wide range of anglers, as...

  9. Calculating Probabilistic Distance to Solution in a Complex Problem Solving Domain

    ERIC Educational Resources Information Center

    Sudol, Leigh Ann; Rivers, Kelly; Harris, Thomas K.

    2012-01-01

    In complex problem solving domains, correct solutions are often comprised of a combination of individual components. Students usually go through several attempts, each attempt reflecting an individual solution state that can be observed during practice. Classic metrics to measure student performance over time rely on counting the number of…

  10. To Live With Complexity: A Problem for Students--And for the Rest of Us.

    ERIC Educational Resources Information Center

    Ford, Franklin L.

    1968-01-01

    In articles on student unrest, there is a great tendency to oversimplify the issues and to assume that the components and stakes are the same from Minnesota to Czechoslovakia. To understand this complex phenomenon, the following questions should be answered: How many different problems, of what orders of magnitude and intensity, need to be…

  11. Computer-Based Assessment of Complex Problem Solving: Concept, Implementation, and Application

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Holt, Daniel V.; Goldhammer, Frank; Funke, Joachim

    2013-01-01

    Complex Problem Solving (CPS) skills are essential to successfully deal with environments that change dynamically and involve a large number of interconnected and partially unknown causal influences. The increasing importance of such skills in the 21st century requires appropriate assessment and intervention methods, which in turn rely on adequate…

  12. Differential Relations between Facets of Complex Problem Solving and Students' Immigration Background

    ERIC Educational Resources Information Center

    Sonnleitner, Philipp; Brunner, Martin; Keller, Ulrich; Martin, Romain

    2014-01-01

    Whereas the assessment of complex problem solving (CPS) has received increasing attention in the context of international large-scale assessments, its fairness in regard to students' cultural background has gone largely unexplored. On the basis of a student sample of 9th-graders (N = 299), including a representative number of immigrant students (N…

  13. Small-Group Problem-Based Learning as a Complex Adaptive System

    ERIC Educational Resources Information Center

    Mennin, Stewart

    2007-01-01

    Small-group problem-based learning (PBL) is widely embraced as a method of study in health professions schools and at many different levels of education. Complexity science provides a different lens with which to view and understand the application of this method. It presents new concepts and vocabulary that may be unfamiliar to practitioners of…

  14. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    ERIC Educational Resources Information Center

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  15. The Development of Complex Problem Solving in Adolescence: A Latent Growth Curve Analysis

    ERIC Educational Resources Information Center

    Frischkorn, Gidon T.; Greiff, Samuel; Wüstenberg, Sascha

    2014-01-01

    Complex problem solving (CPS) as a cross-curricular competence has recently attracted more attention in educational psychology as indicated by its implementation in international educational large-scale assessments such as the Programme for International Student Assessment. However, research on the development of CPS is scarce, and the few…

  16. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  17. An HPLC chromatographic framework to analyze the β-cyclodextrin/solute complexation mechanism using a carbon nanotube stationary phase.

    PubMed

    Aljhni, Rania; Andre, Claire; Lethier, Lydie; Guillaume, Yves Claude

    2015-11-01

    A carbon nanotube (CNT) stationary phase was used for the first time to study the β-cyclodextrin (β-CD)/solute complexation mechanism using high performance liquid chromatography (HPLC). For this, β-CD was added at various concentrations to the mobile phase, and the effect of column temperature on both the retention of a series of aniline and benzoic acid derivatives on the CNT stationary phase and their complexation with β-CD was studied. A decrease in the solute retention factor was observed for all the studied molecules without change in the retention order. The apparent formation constant KF of the inclusion complex β-CD/solute was determined at various temperatures. Our results showed that the interaction of β-CD with both the mobile phase and the stationary phase interfered in the complex formation. The enthalpy and entropy of the complex formation (ΔHF and ΔSF) between the solute molecule and CD were determined using a thermodynamic approach. Negative enthalpies and entropies indicated that the inclusion process of the studied molecule in the CD cavity was enthalpically driven and that the hydrogen bonds between carboxylic or aniline groups and the functional groups on the β-CD rim played an important role in the complex formation. PMID:26452814
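
    The "thermodynamic approach" is not spelled out in the abstract; presumably it is the standard van't Hoff analysis, under which the enthalpy and entropy follow from the temperature dependence of the formation constant:

    ```latex
    % van't Hoff relation (assumed form of the paper's thermodynamic approach):
    \ln K_F \;=\; -\frac{\Delta H_F}{R}\,\frac{1}{T} \;+\; \frac{\Delta S_F}{R}
    % Regressing ln K_F on 1/T gives -\Delta H_F / R as the slope and
    % \Delta S_F / R as the intercept; the reported negative values of both
    % then mark an enthalpy-driven inclusion process.
    ```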

  18. Mixing Bandt-Pompe and Lempel-Ziv approaches: another way to analyze the complexity of continuous-state sequences

    NASA Astrophysics Data System (ADS)

    Zozor, S.; Mateos, D.; Lamberti, P. W.

    2014-05-01

    In this paper, we propose to mix the approach underlying Bandt-Pompe permutation entropy with Lempel-Ziv complexity, to design what we call Lempel-Ziv permutation complexity. The principle consists of two steps: (i) transformation of a continuous-state series that is intrinsically multivariate or arises from embedding into a sequence of permutation vectors, where the components are the positions of the components of the initial vector when re-arranged; (ii) computing the Lempel-Ziv complexity of this series of `symbols', drawn from a discrete finite-size alphabet. On the one hand, the permutation entropy of Bandt-Pompe aims at the study of the entropy of such a sequence; i.e., the entropy of patterns in a sequence (e.g., local increases or decreases). On the other hand, the Lempel-Ziv complexity of a discrete-state sequence aims at the study of the temporal organization of the symbols (i.e., the rate of compressibility of the sequence). Thus, the Lempel-Ziv permutation complexity aims to take advantage of both of these methods. The potential of such a combined approach - a permutation procedure followed by a complexity analysis - is evaluated on both simulated and real data. In both cases, we compare the individual approaches and the combined approach.
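
    A minimal sketch of the two steps as stated: (i) ordinal (Bandt-Pompe) symbolization, then (ii) a Lempel-Ziv phrase count. For brevity the parse below is LZ78-style, standing in for the LZ76 production count the authors use; the qualitative contrast between regular and irregular series survives the substitution:

    ```python
    import numpy as np

    def ordinal_symbols(x, d=3):
        """Step (i): map each length-d window of a continuous series to the
        permutation that sorts it (its Bandt-Pompe ordinal pattern)."""
        return [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]

    def lz_complexity(symbols):
        """Step (ii): count phrases in an LZ78-style incremental parse
        (a simple stand-in for the LZ76 production count)."""
        dictionary, phrase, count = set(), (), 0
        for s in symbols:
            phrase += (s,)
            if phrase not in dictionary:
                dictionary.add(phrase)
                count += 1
                phrase = ()
        return count + (1 if phrase else 0)

    rng = np.random.default_rng(0)
    noise = rng.normal(size=500)   # irregular series -> many phrases
    ramp = np.arange(500.0)        # monotone series -> few phrases
    print(lz_complexity(ordinal_symbols(noise)))
    print(lz_complexity(ordinal_symbols(ramp)))
    ```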

  19. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem

    PubMed Central

    Williams, Patricia AH; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  20. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem.

    PubMed

    Williams, Patricia AH; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  1. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving

    PubMed Central

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-01-01

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires fewer inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load. PMID:25652466

  2. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.

    PubMed

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-03-01

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires fewer inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load. PMID:25652466

  3. Solving the three-body Coulomb breakup problem using exterior complex scaling

    SciTech Connect

    McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.

    2004-05-17

    Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum have made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a "reduction to computation" of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.
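
    The scaling transformation itself, which the abstract references but does not display, has a standard form (generic notation: R_0 the scaling radius, θ the rotation angle):

    ```latex
    % Standard exterior complex scaling of a radial coordinate r beyond R_0:
    r \;\longmapsto\;
    \begin{cases}
    r, & r < R_0, \\
    R_0 + (r - R_0)\, e^{i\theta}, & r \ge R_0,
    \end{cases}
    % which leaves the interior region untouched while turning outgoing waves
    % e^{ikr} into exponentially decaying functions beyond R_0, so outgoing-wave
    % boundary conditions become trivial on a finite volume.
    ```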

  4. The Role of Prior Knowledge and Problem Contexts in Students' Explanations of Complex System

    NASA Astrophysics Data System (ADS)

    Barth-Cohen, Lauren April

    The purpose of this dissertation is to study students' competencies in generating scientific explanations within the domain of complex systems, an interdisciplinary area in which students tend to have difficulties. While considering students' developing explanations of how complex systems work, I investigate the role of prior knowledge and how students' explanations systematically vary across seven problem contexts (e.g. the movement of sand dunes, the formation of traffic jams, and diffusion in water). Using the Knowledge in Pieces epistemological perspective, I build a mini-theory of how students construct explanations about the behavior of complex systems. The mini-theory shows how advanced, "decentralized" explanations evolve from a variety of prior knowledge resources, which depend on specific features of the problem. A general emphasis on students' competencies is exhibited through three strands of analysis: (1) a focus on moment-to-moment shifts in individuals' explanations in the direction of a normative understanding; (2) a comparison of explanations across the seven problem contexts in order to highlight variation in kinds of prior knowledge that are used; and (3) a concentration on the diversity within explanations that can all be considered examples of emergent thinking. First, I document cases of students' shifting explanations as they become less prototypically centralized (a more naive causality) and then become more prototypically decentralized over short time periods. The analysis illustrates the lines of continuity between these two ways of understanding and how change can occur during the process of students generating a progression of increasingly sophisticated transitional explanations. Second, I find a variety of students' understandings across the problem contexts, expressing both variation in their prior knowledge and how the nature of a specific domain influences reasoning. Certain problem contexts are easier or harder for students

  5. The Markov-Dubins problem with free terminal direction in a nonpositively curved cube complex

    NASA Astrophysics Data System (ADS)

    La Corte, Jason Thomson

    State complexes are nonpositively curved cube complexes that model the state spaces of reconfigurable systems. The problem of determining a strategy for reconfiguring the system from a given initial state to a given goal state is equivalent to that of finding a path between two points in the state complex. The additional requirement that allowable paths must have a prescribed initial direction and minimal turning radius determines a Markov-Dubins problem with free terminal direction (MDPFTD). Given a nonpositively curved, locally finite cube complex X, we consider the set of unit-speed paths which satisfy a certain smoothness condition in addition to the boundary conditions and curvature constraint that define a MDPFTD. We show that this set either contains a path of minimal length, or is empty. We then focus on the case that X is a surface with a nonpositively curved cubical structure. We show that any solution to a MDPFTD in X must consist of finitely many geodesic segments and arcs of constant curvature, and we give an algorithm for determining those solutions to the MDPFTD in X which are CL paths, that is, made up of an arc of constant curvature followed by a geodesic segment. Finally, under the assumption that the 1-skeleton of X is d-regular, we give sufficient conditions for a topological ray in X of constant curvature to be a rose curve or a proper ray.

  6. Analysis and formulation of a class of complex dynamic optimization problems

    NASA Astrophysics Data System (ADS)

    Kameswaran, Shivakumar

    The Direct Transcription approach, also known as the direct simultaneous approach, is a widely used solution strategy for dynamic optimization problems involving differential-algebraic equations (DAEs). Direct transcription refers to the procedure of approximating the infinite dimensional problem by a finite dimensional one, which is then solved using a nonlinear programming (NLP) solver tailored to large-scale problems. Systems governed by partial differential equations (PDEs) can also be handled by spatially discretizing the PDEs to convert them to a system of DAEs. The objective of this thesis is firstly to ensure that direct transcription using Radau collocation is provably correct, and secondly to widen the applicability of the direct simultaneous approach to a larger class of dynamic optimization and optimal control problems (OCPs). This thesis aims at addressing these issues using rigorous theoretical tools and/or characteristic examples, and at using the results to solve large-scale industrial applications so as to realize the benefits. The first part of this work deals with the analysis of convergence rates for direct transcription of unconstrained and final-time equality constrained optimal control problems. The problems are discretized using collocation at Radau points. Convergence is analyzed from an NLP/matrix-algebra perspective, which enables the prediction of the conditioning of the direct transcription NLP as the mesh size becomes finer. Several convergence results are presented along with tests on numerous example problems. These convergence results lead to an adjoint estimation procedure given the Lagrange multipliers for the large-scale NLP. The work also reveals the role of process control concepts such as controllability in the convergence analysis, and provides a very important link between control and optimization inside the framework of dynamic optimization. As an effort to extend the applicability of the direct

  7. Beyond roots alone: Novel methodologies for analyzing complex soil and minirhizotron imagery using image processing and GIS tools

    NASA Astrophysics Data System (ADS)

    Silva, Justina A.

    Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches together numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework to expand a more comprehensive understanding of belowground dynamics.

  8. You Need to Know: There Is a Causal Relationship between Structural Knowledge and Control Performance in Complex Problem Solving Tasks

    ERIC Educational Resources Information Center

    Goode, Natassia; Beckmann, Jens F.

    2010-01-01

    This study investigates the relationships between structural knowledge, control performance and fluid intelligence in a complex problem solving (CPS) task. 75 participants received either complete, partial or no information regarding the underlying structure of a complex problem solving task, and controlled the task to reach specific goals.…

  9. How to solve complex problems in foundry plants - future of casting simulation -

    NASA Astrophysics Data System (ADS)

    Ohnaka, I.

    2015-06-01

    Although the computer simulation of casting has progressed dramatically over the last decades, there are still many challenges and problems. This paper discusses how to solve complex engineering problems in foundry plants and what we should do in the future, in particular for casting simulation. First, problem-solving procedures, including the application of computer simulation, are demonstrated and various difficulties are pointed out, using porosity defects in sand castings of spheroidal graphite cast iron as the main example. Next, looking back at conventional scientific and engineering research aimed at understanding casting phenomena, challenges and problems are discussed from a problem-solving viewpoint, followed by discussion of the issues we should address, such as how to integrate the huge amount of dispersed knowledge in various disciplines, differentiation of science-oriented and engineering-oriented models, professional ethics, how to handle fluctuating materials, initial and boundary conditions, error accumulation, and simulation codes as black boxes. Finally, some suggestions are made on how to tackle these issues, such as promoting research on simulation based on the science-oriented model and publishing reliable data on casting phenomena in complicated-shaped castings, including reconsideration of the evaluation system.

  10. Numerical calculation of thermo-mechanical problems at large strains based on complex step derivative approximation of tangent stiffness matrices

    NASA Astrophysics Data System (ADS)

    Balzani, Daniel; Gandhi, Ashutosh; Tanaka, Masato; Schröder, Jörg

    2015-05-01

    In this paper, a robust approximation scheme for the numerical calculation of tangent stiffness matrices is presented in the context of nonlinear thermo-mechanical finite element problems, and its performance is analyzed. The scheme extends the approach proposed in Kim et al. (Comput Methods Appl Mech Eng 200:403-413, 2011) and Tanaka et al. (Comput Methods Appl Mech Eng 269:454-470, 2014) and is based on applying the complex-step-derivative approximation to the linearizations of the weak forms of the balance of linear momentum and the balance of energy. By incorporating consistent perturbations along the imaginary axis to the displacement as well as thermal degrees of freedom, we demonstrate that numerical tangent stiffness matrices can be obtained with accuracy up to computer precision, leading to quadratically converging schemes. The main advantage of this approach is that, contrary to the classical forward difference scheme, no round-off errors due to floating-point arithmetic exist within the calculation of the tangent stiffness. This enables arbitrarily small perturbation values and therefore leads to robust schemes even when choosing small values. An efficient algorithmic treatment is presented which enables a straightforward implementation of the method in any standard finite-element program. By means of thermo-elastic and thermo-elastoplastic boundary value problems at finite strains, the performance of the proposed approach is analyzed.
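
    The complex-step derivative at the heart of this scheme has a simple scalar prototype, f'(x) ≈ Im f(x + ih)/h, which involves no subtraction of nearly equal numbers. A minimal sketch of that property on a toy function (the paper applies the same perturbation to displacement and thermal degrees of freedom in the weak forms):

    ```python
    import numpy as np

    def complex_step(f, x, h=1e-30):
        """f'(x) ~ Im f(x + ih) / h: no subtractive cancellation,
        so h may be made arbitrarily small without round-off."""
        return np.imag(f(x + 1j * h)) / h

    def forward_diff(f, x, h):
        """Classical forward difference; accuracy degrades as h -> 0."""
        return (f(x + h) - f(x)) / h

    f = lambda x: np.exp(x) * np.sin(x)
    exact = np.exp(1.0) * (np.sin(1.0) + np.cos(1.0))

    print(abs(complex_step(f, 1.0) - exact))         # ~1e-16, machine precision
    print(abs(forward_diff(f, 1.0, 1e-12) - exact))  # dominated by round-off
    ```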

  11. Low complexity interference alignment algorithms for desired signal power maximization problem of MIMO channels

    NASA Astrophysics Data System (ADS)

    Sun, Cong; Yang, Yunchuan; Yuan, Yaxiang

    2012-12-01

    In this article, we investigate the interference alignment (IA) solution for a K-user MIMO interference channel. Proper users' precoders and decoders are designed through a desired signal power maximization model with IA conditions as constraints, which forms a complex matrix optimization problem. We propose two low complexity algorithms, both of which apply the Courant penalty function technique to combine the leakage interference and the desired signal power together as the new objective function. The first proposed algorithm is the modified alternating minimization algorithm (MAMA), where each subproblem has a closed-form solution via an eigenvalue decomposition. To further reduce algorithm complexity, we propose a hybrid algorithm which consists of two parts. In the first part, the algorithm iterates with Householder transformations to preserve the orthogonality of precoders and decoders. In each iteration, the matrix optimization problem is considered in a sequence of 2D subspaces, which leads to one-dimensional optimization subproblems. From any initial point, this algorithm obtains precoders and decoders with low leakage interference in a short time. In the second part, to exploit the advantage of MAMA, it continues to iterate to perfectly align the interference from the output point of the first part. Analysis shows that, per iteration, both proposed algorithms generally have lower computational complexity than the existing maximum signal power (MSP) algorithm, and the hybrid algorithm enjoys lower complexity than MAMA. Simulations reveal that both proposed algorithms achieve performance similar to the MSP algorithm with less execution time, and show better performance than the existing alternating minimization algorithm in terms of sum rate. Besides, regarding convergence rate, simulation results show that MAMA reaches a given sum-rate value fastest, while the hybrid algorithm converges fastest in eliminating interference.
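
    For reference, the IA conditions imposed as constraints take the standard form for a K-user channel (generic notation, not the paper's own: H_{kj} the channel from transmitter j to receiver k, V_j precoders, U_k decoders, d_k desired stream dimensions):

    ```latex
    % Interference alignment feasibility conditions (standard form):
    U_k^{H} H_{kj} V_j = 0 \quad \text{for all } j \neq k,
    \qquad
    \operatorname{rank}\left( U_k^{H} H_{kk} V_k \right) = d_k .
    % All interference is forced into the nullspace of each receiver's decoder
    % while the desired link retains its full d_k signal dimensions.
    ```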

  12. Complexity of analysis and verification problems for communicating automata and discrete dynamical systems.

    SciTech Connect

    Hunt, H. B.; Rosenkrantz, D. J.; Barrett, C. L.; Marathe, M. V.; Ravi, S. S.

    2001-01-01

    We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems Π that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (1) reductions of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (2) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (3) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for

  13. COMPLEXITY OF ANALYSIS & VERIFICATION PROBLEMS FOR COMMUNICATING AUTOMATA & DISCRETE DYNAMICAL SYSTEMS

    SciTech Connect

    H. B. HUNT; D. J. ROSENKRANTZ; ET AL

    2001-03-01

    We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems Π that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (i) reductions of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (ii) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (iii) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly-specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for

  14. Thresholds of Knowledge Development in Complex Problem Solving: A Multiple-Case Study of Advanced Learners' Cognitive Processes

    ERIC Educational Resources Information Center

    Bogard, Treavor; Liu, Min; Chiang, Yueh-hui Vanessa

    2013-01-01

    This multiple-case study examined how advanced learners solved a complex problem, focusing on how their frequency and application of cognitive processes contributed to differences in performance outcomes, and developing a mental model of a problem. Fifteen graduate students with backgrounds related to the problem context participated in the study.…

  15. Analyzing the tradeoff between electrical complexity and accuracy in patient-specific computational models of deep brain stimulation

    NASA Astrophysics Data System (ADS)

    Howell, Bryan; McIntyre, Cameron C.

    2016-06-01

    Objective. Deep brain stimulation (DBS) is an adjunctive therapy that is effective in treating movement disorders and shows promise for treating psychiatric disorders. Computational models of DBS have begun to be utilized as tools to optimize the therapy. Despite advancements in the anatomical accuracy of these models, there is still uncertainty as to what level of electrical complexity is adequate for modeling the electric field in the brain and the subsequent neural response to the stimulation. Approach. We used magnetic resonance images to create an image-based computational model of subthalamic DBS. The complexity of the volume conductor model was increased by incrementally including heterogeneity, anisotropy, and dielectric dispersion in the electrical properties of the brain. We quantified changes in the load of the electrode, the electric potential distribution, and stimulation thresholds of descending corticofugal (DCF) axon models. Main results. Incorporation of heterogeneity altered the electric potentials and subsequent stimulation thresholds, but to a lesser degree than incorporation of anisotropy. Additionally, the results were sensitive to the choice of method for defining anisotropy, with stimulation thresholds of DCF axons changing by as much as 190%. Typical approaches for defining anisotropy underestimate the expected load of the stimulation electrode, which led to underestimation of the extent of stimulation. More accurate predictions of the electrode load were achieved with alternative approaches for defining anisotropy. The effects of dielectric dispersion were small compared to the effects of heterogeneity and anisotropy. Significance. The results of this study help delineate the level of detail that is required to accurately model electric fields generated by DBS electrodes.

  16. Direct, inverse, and combined problems in complex engineered system modeling by artificial neural networks

    NASA Astrophysics Data System (ADS)

    Terekhoff, Serge A.

    1997-04-01

    This paper summarizes theoretical findings and applications of artificial neural networks to modeling of complex engineered system response in abnormal environments. The thermal impact of fire on an industrial container for waste and fissile materials was investigated using model and experimental data. Solutions for the direct problem show that the generalization properties of the neural-network-based model are significantly better than those of standard interpolation methods. The minimal amount of data required for good prediction of system response is estimated in computer experiments with an MLP network. It is shown that Kohonen's self-organizing map with counterpropagation may also estimate the local accuracy of the regularized solution for inverse and combined problems. Feature space regions in which the inverse model is partially correct can be automatically extracted using adaptive clustering. Practical findings include time-strategy recommendations for fire-safety services when industrial or transport accidents occur.

  17. Using Brain Imaging to Track Problem Solving in a Complex State Space

    PubMed Central

    Anderson, John R.; Fincham, Jon M.; Schneider, Darryl W.; Yang, Jian

    2011-01-01

    This paper describes how behavioral and imaging data can be combined with a Hidden Markov Model (HMM) to track participants’ trajectories through a complex state space. Participants completed a problem-solving variant of a memory game that involved 625 distinct states, 24 operators, and an astronomical number of paths through the state space. Three sources of information were used for classification purposes. First, an Imperfect Memory Model was used to estimate transition probabilities for the HMM. Second, behavioral data provided information about the timing of different events. Third, multivoxel pattern analysis of the imaging data was used to identify features of the operators. By combining the three sources of information, an HMM algorithm was able to efficiently identify the most probable path that participants took through the state space, achieving over 80% accuracy. These results support the approach as a general methodology for tracking mental states that occur during individual problem-solving episodes. PMID:22209783
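
    Decoding the most probable path through a state space is standard HMM machinery (Viterbi). A generic sketch on a toy three-state model with made-up numbers; in the study itself the HMM had 625 states, transition probabilities from the Imperfect Memory Model, and emission scores built from timing and multivoxel features:

    ```python
    import numpy as np

    def viterbi(log_trans, log_emit, log_init):
        """Most probable hidden-state path given log transition probabilities,
        per-step log emission scores, and log initial probabilities."""
        T, S = log_emit.shape
        score = log_init + log_emit[0]
        back = np.zeros((T, S), dtype=int)
        for t in range(1, T):
            cand = score[:, None] + log_trans          # (prev, next) scores
            back[t] = np.argmax(cand, axis=0)          # best predecessor
            score = cand[back[t], np.arange(S)] + log_emit[t]
        path = [int(np.argmax(score))]
        for t in range(T - 1, 0, -1):                  # backtrack
            path.append(int(back[t][path[-1]]))
        return path[::-1]

    # Toy 3-state, 6-step example with arbitrary numbers:
    rng = np.random.default_rng(1)
    log_trans = np.log(np.full((3, 3), 1 / 3))
    log_emit = np.log(rng.dirichlet(np.ones(3), size=6))
    log_init = np.log(np.ones(3) / 3)
    print(viterbi(log_trans, log_emit, log_init))
    ```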

  18. A boundary collocation meshfree method for the treatment of Poisson problems with complex morphologies

    NASA Astrophysics Data System (ADS)

    Soghrati, Soheil; Mai, Weijie; Liang, Bowen; Buchheit, Rudolph G.

    2015-01-01

    A new meshfree method based on a discrete transformation of Green's basis functions is introduced to simulate Poisson problems with complex morphologies. The proposed Green's Discrete Transformation Method (GDTM) uses source points that are located along a virtual boundary outside the problem domain to construct the basis functions needed to approximate the field. The optimal number of Green's function source points and their relative distances with respect to the problem boundaries are evaluated to obtain the best approximation of the partition of unity condition. A discrete transformation technique together with the boundary point collocation method is employed to evaluate the unknown coefficients of the solution series by satisfying the problem boundary conditions. A comprehensive convergence study is presented to investigate the accuracy and convergence rate of the GDTM. We will also demonstrate the application of this meshfree method for simulating the conductive heat transfer in a heterogeneous materials system and the dissolved aluminum ion concentration in the electrolyte solution formed near a passive corrosion pit.
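
    The construction, Green's functions anchored at source points on a virtual boundary outside the domain with coefficients fixed by boundary-point collocation, is reminiscent of the classical method of fundamental solutions. A toy collocation solve for the 2D Laplace equation in that spirit (illustrative only; it omits the GDTM's discrete transformation and source-point optimization):

    ```python
    import numpy as np

    n = 40
    t = np.linspace(0, 2 * np.pi, n, endpoint=False)
    bnd = np.c_[np.cos(t), np.sin(t)]         # collocation points, unit circle
    src = 1.5 * np.c_[np.cos(t), np.sin(t)]   # sources on a larger virtual circle

    def G(p, q):
        """Free-space Green's function of the 2D Laplacian."""
        return -np.log(np.linalg.norm(p - q)) / (2 * np.pi)

    A = np.array([[G(p, q) for q in src] for p in bnd])
    g = bnd[:, 0] * bnd[:, 1]                 # boundary data u = xy (harmonic)
    coef = np.linalg.solve(A, g)              # satisfy boundary conditions

    x = np.array([0.3, 0.4])                  # interior evaluation point
    u = sum(c * G(x, q) for c, q in zip(coef, src))
    print(u, x[0] * x[1])                     # should agree closely
    ```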

  19. The anatomical problem posed by brain complexity and size: a potential solution

    PubMed Central

    DeFelipe, Javier

    2015-01-01

    Over the years the field of neuroanatomy has evolved considerably, but unraveling the extraordinary structural and functional complexity of the brain seems to be an unattainable goal, partly due to the fact that it is only possible to obtain an imprecise connection matrix of the brain. The reasons why reaching such a goal currently appears almost impossible are discussed here, together with suggestions of how we could overcome this anatomical problem by establishing new methodologies to study the brain and by promoting interdisciplinary collaboration. Generating a realistic computational model seems to be the solution rather than attempting to fully reconstruct the whole brain or a particular brain region. PMID:26347617

  20. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    SciTech Connect

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-12-10

    One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities by a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems, in terms of a suggested framework model based on discrete event simulation.

  1. SVD-GFD scheme to simulate complex moving body problems in 3D space

    NASA Astrophysics Data System (ADS)

    Wang, X. Y.; Yu, P.; Yeo, K. S.; Khoo, B. C.

    2010-03-01

    This paper presents a hybrid meshfree-and-Cartesian grid method for simulating moving body incompressible viscous flow problems in 3D space. The method combines the merits of cost-efficient and accurate conventional finite difference approximations on Cartesian grids with the geometric freedom of generalized finite difference (GFD) approximations on meshfree grids. Error minimization in GFD is carried out by singular value decomposition (SVD). The Arbitrary Lagrangian-Eulerian (ALE) form of the Navier-Stokes equations on convecting nodes is integrated by a fractional-step projection method. The present hybrid grid method employs a relatively simple mode of nodal administration. Nevertheless, it has the geometrical flexibility of unstructured mesh-based finite-volume and finite element methods. Boundary conditions are precisely implemented on boundary nodes without interpolation. The present scheme is validated by a moving patch consistency test as well as against published results for 3D moving body problems. Finally, the method is applied to low-Reynolds number flapping wing applications, where large boundary motions are involved. The present study demonstrates the potential of the present hybrid meshfree-and-Cartesian grid scheme for solving complex moving body problems in 3D.
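
    One building block, generalized finite differences with SVD-based error minimization, can be shown in isolation: fit a local Taylor expansion to scattered neighbour nodes by least squares (solved via SVD) and read the derivatives from the coefficients. A toy 2D sketch with invented node offsets; the paper embeds this in an ALE fractional-step solver:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    f = lambda x, y: np.sin(x) * np.cos(y)

    x0, y0 = 0.3, 0.2
    dx = rng.uniform(-0.05, 0.05, size=12)    # scattered neighbour offsets
    dy = rng.uniform(-0.05, 0.05, size=12)

    # Columns are Taylor terms: dx, dy, dx^2/2, dx*dy, dy^2/2
    V = np.c_[dx, dy, dx**2 / 2, dx * dy, dy**2 / 2]
    rhs = f(x0 + dx, y0 + dy) - f(x0, y0)
    coef, *_ = np.linalg.lstsq(V, rhs, rcond=None)   # SVD-based least squares

    print(coef[0], np.cos(x0) * np.cos(y0))   # df/dx estimate vs exact
    print(coef[1], -np.sin(x0) * np.sin(y0))  # df/dy estimate vs exact
    ```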

  2. Induction of mutation spectra by complex mixtures: approaches, problems, and possibilities.

    PubMed Central

    DeMarini, D M

    1994-01-01

    More complex environmental mixtures have been evaluated for mutagenic activity at the hisD3052 allele of Salmonella, primarily in strain TA98, than in any other target or mutation assay. Using colony probe hybridization to detect a common hot spot deletion, followed by polymerase chain reaction and DNA sequencing, we have generated 10 mutation spectra from three classes of mixtures (i.e., urban air, cigarette smoke condensate, and municipal waste incinerator emissions). The mutation spectra are distinctly different among the three classes of mixtures; however, the spectra for samples within the same class of mixture are similar. In addition to the hot spot mutation, the mixtures induce complex mutations, which consist of a small deletion and a base substitution. These mutations suggest a mechanism involving misinsertion of a base opposite a DNA adduct followed by a slippage and mismatch. A role for DNA secondary structure also may be the basis for the mutational site specificity exhibited by the various mixtures. The results suggest that unique mutation spectra can be generated by different classes of complex mixtures and that such spectra are a consequence of the dominance of a particular chemical class or classes within the mixture. The problems associated with this type of research are discussed along with the potential value of mutation spectra as a tool for exposure and risk assessment. PMID:7821286

  3. Exploring the complexity of inquiry learning in an open-ended problem space

    NASA Astrophysics Data System (ADS)

    Clarke, Jody

    Data-gathering and problem identification are key components of scientific inquiry. However, few researchers have studied how students learn these skills because historically this required a time-consuming, complicated method of capturing the details of learners' data-gathering processes. Nor are classroom settings authentic contexts in which students could exhibit problem identification skills parallel to those involved in deconstructing complex real world situations. In this study of middle school students, because of my access to an innovative technology, I simulated a disease outbreak in a virtual community as a complicated, authentic problem. As students worked through the curriculum in the virtual world, their time-stamped actions were stored by the computer in event-logs. Using these records, I tracked in detail how the student scientists made sense of the complexity they faced and how they identified and investigated the problem using science-inquiry skills. To describe the degree to which students' data collection narrowed and focused on a specific disease over time, I developed a rubric and automated the coding of records in the event-logs. I measured the ongoing development of the students' "systematicity" in investigating the disease outbreak. I demonstrated that coding event-logs is an effective yet non-intrusive way of collecting and parsing detailed information about students' behaviors in real time in an authentic setting. My principal research question was "Do students who are more thoughtful about their inquiry prior to entry into the curriculum demonstrate increased systematicity in their inquiry behavior during the experience, by narrowing the focus of their data-gathering more rapidly than students who enter with lower levels of thoughtfulness about inquiry?" My sample consisted of 403 middle-school students from public schools in the US who volunteered to participate in the River City Project in spring 2008. Contrary to my hypothesis, I found

  4. Communication: Overcoming the root search problem in complex quantum trajectory calculations

    SciTech Connect

    Zamstein, Noa; Tannor, David J.

    2014-01-28

    Three new developments are presented regarding the semiclassical coherent state propagator. First, we present a conceptually different derivation of Huber and Heller's method for identifying complex root trajectories and their equations of motion [D. Huber and E. J. Heller, J. Chem. Phys. 87, 5302 (1987)]. Our method proceeds directly from the time-dependent Schrödinger equation and therefore allows various generalizations of the formalism. Second, we obtain an analytic expression for the semiclassical coherent state propagator. We show that the prefactor can be expressed in a form that requires solving significantly fewer equations of motion than in alternative expressions. Third, the semiclassical coherent state propagator is used to formulate a final value representation of the time-dependent wavefunction that avoids the root search, eliminates problems with caustics and automatically includes interference. We present numerical results for the 1D Morse oscillator showing that the method may become an attractive alternative to existing semiclassical approaches.

  5. Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?

    PubMed

    McDonald, Ruth

    2014-10-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving 'leadership'. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts. PMID:25337595

  6. Seeing around a Ball: Complex, Technology-Based Problems in Calculus with Applications in Science and Engineering-Redux

    ERIC Educational Resources Information Center

    Winkel, Brian

    2008-01-01

    A complex technology-based problem in visualization and computation for students in calculus is presented. Strategies are shown for its solution and the opportunities for students to put together sequences of concepts and skills to build for success are highlighted. The problem itself involves placing an object under water in order to actually see…

  7. Validation Study of a Method for Assessing Complex Ill-Structured Problem Solving by Using Causal Representations

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ifenthaler, Dirk; Ge, Xun

    2013-01-01

    The important but little understood problem that motivated this study was the lack of research on valid assessment methods to determine progress in higher-order learning in situations involving complex and ill-structured problems. Without a valid assessment method, little progress can occur in instructional design research with regard to designing…

  8. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  9. An Investigation of the Interrelationships between Motivation, Engagement, and Complex Problem Solving in Game-Based Learning

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Law, Victor; Ifenthaler, Dirk; Ge, Xun; Miller, Raymond

    2014-01-01

    Digital game-based learning, especially massively multiplayer online games, has been touted for its potential to promote student motivation and complex problem-solving competency development. However, current evidence is limited to anecdotal studies. The purpose of this empirical investigation is to examine the complex interplay between…

  10. An immersed boundary computational model for acoustic scattering problems with complex geometries.

    PubMed

    Sun, Xiaofeng; Jiang, Yongsong; Liang, An; Jing, Xiaodong

    2012-11-01

    An immersed boundary computational model is presented for acoustic scattering problems involving complex geometries, in which the wall boundary condition is treated as a direct body force determined by satisfying the non-penetrating boundary condition. Two distinct grids are used to discretize the fluid domain and the immersed boundary, respectively. The immersed boundaries are represented by Lagrangian points, and the direct body force determined on these points is applied on the neighboring Eulerian points. The coupling between the Lagrangian and Eulerian points is effected through a discrete delta function. The linearized Euler equations are spatially discretized with a fourth-order dispersion-relation-preserving scheme and temporally integrated with a low-dissipation and low-dispersion Runge-Kutta scheme. A perfectly matched layer technique is applied to absorb out-going waves and in-going waves in the immersed bodies. Several benchmark problems for computational aeroacoustic solvers are performed to validate the present method. PMID:23145603
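
    The coupling step (spreading the direct body force from Lagrangian points to neighboring Eulerian points through a discrete delta function) can be sketched in one dimension. This is a minimal illustration using Peskin's widely used 4-point kernel, not the authors' solver; grid spacing, point locations, and force values are made up:

      import numpy as np

      def delta_4pt(r):
          """Peskin's 4-point discrete delta kernel, argument r = distance/h."""
          r = abs(r)
          if r < 1.0:
              return (3 - 2 * r + np.sqrt(1 + 4 * r - 4 * r * r)) / 8
          if r < 2.0:
              return (5 - 2 * r - np.sqrt(-7 + 12 * r - 4 * r * r)) / 8
          return 0.0

      h = 0.1                              # Eulerian grid spacing
      x_grid = np.arange(0.0, 2.0, h)      # Eulerian points
      x_lag = np.array([0.43, 0.71])       # Lagrangian boundary points
      f_lag = np.array([1.0, -0.5])        # body forces at the Lagrangian points

      # Spread each Lagrangian force onto the neighboring Eulerian points.
      f_grid = np.zeros_like(x_grid)
      for xk, fk in zip(x_lag, f_lag):
          for j, xj in enumerate(x_grid):
              f_grid[j] += fk * delta_4pt((xj - xk) / h) / h

      print(f_grid.round(3))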

  11. Enhancements of evolutionary algorithm for the complex requirements of a nurse scheduling problem

    NASA Astrophysics Data System (ADS)

    Tein, Lim Huai; Ramli, Razamin

    2014-12-01

    Nurse scheduling is a long-standing problem aggravated by the global nurse turnover crisis: the more dissatisfied nurses are with their working environment, the more likely they are to leave. Current undesirable work schedules are partly responsible for that working condition. Basically, there is a mismatch between the head nurse's obligations and the nurses' needs. In particular, given highly diverse nurse preferences, the central challenge of nurse scheduling is to balance the interests of both parties during shift assignment in real working scenarios. Flexibility in shift assignment is hard to achieve while satisfying nurses' diverse requests and upholding the required ward coverage. Hence, an Evolutionary Algorithm (EA) is proposed to cater for this complexity in a nurse scheduling problem (NSP). The restrictions of the EA are discussed, and enhancements of the EA operators are suggested so that the EA has the characteristics of a flexible search. This paper considers three types of constraints (hard, semi-hard and soft) that are handled by the EA with enhanced parent selection and specialized mutation operators. These operators, and the EA as a whole, contribute to the efficiency of constraint handling and fitness computation as well as flexibility in the search, which correspond to the employment of exploration and exploitation principles.
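
    As a hedged sketch of the constraint handling described above (shift codes, rules, and penalty weights are hypothetical, and this is not the authors' EA), a fitness function can weight the three constraint classes so that hard violations dominate the search, semi-hard violations are strongly discouraged, and soft preferences fine-tune:

      import random

      SHIFTS = ["D", "E", "N", "O"]           # day, evening, night, off (hypothetical)

      def fitness(roster, prefs):
          """Lower is better. roster[n][d] = shift of nurse n on day d."""
          hard = semi = soft = 0
          days = len(roster[0])
          for d in range(days):
              cover = sum(1 for n in roster if n[d] != "O")
              hard += max(0, 3 - cover)       # hard: at least 3 nurses on duty per day
          for n in roster:
              for d in range(days - 1):
                  if n[d] == "N" and n[d + 1] == "D":
                      semi += 1               # semi-hard: no day shift right after a night
          for n, p in zip(roster, prefs):
              soft += sum(1 for d in range(days) if p.get(d) and n[d] != p[d])
          return 1000 * hard + 50 * semi + 1 * soft   # weighted penalty sum

      # Tiny example: 4 nurses, 3 days; nurse 0 prefers day 2 off.
      random.seed(0)
      roster = [[random.choice(SHIFTS) for _ in range(3)] for _ in range(4)]
      prefs = [{2: "O"}, {}, {}, {}]
      print(fitness(roster, prefs))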

  12. Fibromyalgia and disability adjudication: No simple solutions to a complex problem

    PubMed Central

    Harth, Manfred; Nielson, Warren R

    2014-01-01

    BACKGROUND: Adjudication of disability claims related to fibromyalgia (FM) syndrome can be a challenging and complex process. A commentary published in the current issue of Pain Research & Management makes suggestions for improvement. The authors of the commentary contend that: previously and currently used criteria for the diagnosis of FM are irrelevant to clinical practice; the opinions of family physicians should supersede those of experts; there is little evidence that trauma can cause FM; no formal instruments are necessary to assess disability; and many FM patients on or applying for disability are exaggerating or malingering, and tests of symptom validity should be used to identify malingerers. OBJECTIVES: To assess the assertions made by Fitzcharles et al. METHODS: A narrative review of the available research literature was performed. RESULTS: Available diagnostic criteria should be used in a medicolegal context; family physicians are frequently uncertain about FM and/or biased; there is considerable evidence that trauma can be a cause of FM; it is essential to use validated instruments to assess functional impairment; and the available tests of physical effort and symptom validity are of uncertain value in identifying malingering in FM. CONCLUSIONS: The available evidence does not support many of the suggestions presented in the commentary. Caution is advised in adopting simple solutions for disability adjudication in FM because they are generally incompatible with the inherently complex nature of the problem. PMID:25479149

  13. Decision Analysis for Environmental Problems

    EPA Science Inventory

    Environmental management problems are often complex and uncertain. A formal process with proper guidance is needed to understand the issues, identify sources of disagreement, and analyze the major uncertainties in environmental problems. This course will present a process that fo...

  14. Exploring Corn-Ethanol As A Complex Problem To Teach Sustainability Concepts Across The Science-Business-Liberal Arts Curriculum

    NASA Astrophysics Data System (ADS)

    Oches, E. A.; Szymanski, D. W.; Snyder, B.; Gulati, G. J.; Davis, P. T.

    2012-12-01

    The highly interdisciplinary nature of sustainability presents pedagogic challenges when sustainability concepts are incorporated into traditional disciplinary courses. At Bentley University, where over 90 percent of students major in business disciplines, we have created a multidisciplinary course module centered on corn ethanol that explores a complex social, environmental, and economic problem and develops basic data analysis and analytical thinking skills in several courses spanning the natural, physical, and social sciences within the business curriculum. Through an NSF-CCLI grant, Bentley faculty from several disciplines participated in a summer workshop to define learning objectives, create course modules, and develop an assessment plan to enhance interdisciplinary sustainability teaching. The core instructional outcome was a data-rich exercise for all participating courses in which students plot and analyze multiple parameters of corn planted and harvested for various purposes including food (human), feed (animal), ethanol production, and commodities exchanged for the years 1960 to present. Students then evaluate patterns and trends in the data and hypothesize relationships among the plotted data and environmental, social, and economic drivers, responses, and unintended consequences. After the central data analysis activity, students explore corn ethanol production as it relates to core disciplinary concepts in their individual classes. For example, students in Environmental Chemistry produce ethanol using corn and sugar as feedstocks and compare the efficiency of each process, while learning about enzymes, fermentation, distillation, and other chemical principles. Principles of Geology students examine the effects of agricultural runoff on surface water quality associated with extracting greater agricultural yield from mid-continent croplands. The American Government course examines the role of political institutions, the political process, and various

  15. The Problem with Word Problems: Solving Word Problems in Math Requires a Complex Web of Skills. But There's No Reason Why it Can't Be Fun

    ERIC Educational Resources Information Center

    Forsten, Char

    2004-01-01

    Children need to combine reading, thinking, and computational skills to solve math word problems. The author provides some strategies that principals can share with their teachers to help students become proficient and advanced problem-solvers. They include creating a conducive classroom environment, providing daily mental math activities, making…

  16. Subspace Iteration Method for Complex Eigenvalue Problems with Nonsymmetric Matrices in Aeroelastic System

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Lung, Shu

    2009-01-01

    Modern airplane design is a multidisciplinary task which combines several disciplines such as structures, aerodynamics, flight controls, and sometimes heat transfer. Historically, analytical and experimental investigations concerning the interaction of the elastic airframe with aerodynamic and inertia loads have been conducted during the design phase to determine the existence of aeroelastic instabilities, so-called flutter. With the advent and increased usage of flight control systems, there is also a likelihood of instabilities caused by the interaction of the flight control system and the aeroelastic response of the airplane, known as aeroservoelastic instabilities. An in-house code MPASES (Ref. 1), modified from PASES (Ref. 2), is a general purpose digital computer program for the analysis of the closed-loop stability problem. This program used subroutines given in the International Mathematical and Statistical Library (IMSL) (Ref. 3) to compute all of the real and/or complex conjugate pairs of eigenvalues of the Hessenberg matrix. For high-fidelity configurations, these aeroelastic system matrices are large, and computing all eigenvalues is time consuming. A subspace iteration method (Ref. 4) for complex eigenvalue problems with nonsymmetric matrices has been formulated and incorporated into the modified program for aeroservoelastic stability (MPASES code). The subspace iteration method solves for only the lowest p eigenvalues and corresponding eigenvectors for aeroelastic and aeroservoelastic analysis. In general, p ranges from 10 for a wing flutter analysis to 50 for an entire aircraft flutter analysis. The application of this newly incorporated code is an experiment known as the Aerostructures Test Wing (ATW), which was designed by the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center, Edwards, California to research aeroelastic instabilities. Specifically, this experiment was used to study an instability
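
    A minimal sketch of the general subspace-iteration idea (orthogonal iteration on the inverse so the subspace converges to the lowest-magnitude modes, followed by a Rayleigh-Ritz projection), not the MPASES implementation; the test matrix is invented:

      import numpy as np

      def subspace_iteration(A, p, iters=200):
          """Approximate the p smallest-magnitude eigenvalues of a
          nonsymmetric A by orthogonal iteration on A^{-1}, then a
          Rayleigh-Ritz step on A restricted to the converged subspace.
          p should not split a complex-conjugate pair."""
          n = A.shape[0]
          Ainv = np.linalg.inv(A)            # fine for a small demo matrix
          rng = np.random.default_rng(0)
          Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
          for _ in range(iters):
              Q, _ = np.linalg.qr(Ainv @ Q)  # iterate and re-orthonormalize
          return np.linalg.eigvals(Q.T @ A @ Q)

      # Nonsymmetric test matrix with well-separated eigenvalues near 1..50.
      rng = np.random.default_rng(1)
      A = np.diag(np.arange(1.0, 51.0)) + 0.02 * rng.standard_normal((50, 50))
      print(np.sort_complex(subspace_iteration(A, p=4)))
      ev = np.linalg.eigvals(A)
      print(np.sort_complex(ev[np.argsort(np.abs(ev))[:4]]))  # reference values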

  17. Can fuzzy logic bring complex problems into focus? Modeling imprecise factors in environmental policy

    SciTech Connect

    McKone, Thomas E.; Deshpande, Ashok W.

    2004-06-14

    In modeling complex environmental problems, we often fail to make precise statements about inputs and outcomes. In this case the fuzzy logic method native to the human mind provides a useful way to get at these problems. Fuzzy logic represents a significant change in both the approach to and outcome of environmental evaluations. Risk assessment is currently based on the implicit premise that probability theory provides the necessary and sufficient tools for dealing with uncertainty and variability. The key advantage of fuzzy methods is the way they reflect the human mind in its remarkable ability to store and process information which is consistently imprecise, uncertain, and resistant to classification. Our case study illustrates the ability of fuzzy logic to integrate statistical measurements with imprecise health goals. But we submit that fuzzy logic and probability theory are complementary and not competitive. In the world of soft computing, fuzzy logic has been widely used and has often been the "smart" behind smart machines. But it will require more effort and case studies to establish its niche in risk assessment or other types of impact assessment. Although we often hear complaints about "bright lines," could we adapt to a system that relaxes these lines to fuzzy gradations? Would decision makers and the public accept expressions of water or air quality goals in linguistic terms with computed degrees of certainty? Resistance is likely. In many regions, such as the US and European Union, it is likely that both decision makers and members of the public are more comfortable with our current system in which government agencies avoid confronting uncertainties by setting guidelines that are crisp and often fail to communicate uncertainty. But some day perhaps a more comprehensive approach that includes exposure surveys, toxicological data, epidemiological studies coupled with fuzzy modeling will go a long way in resolving some of the conflict, divisiveness
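
    A minimal sketch of how a crisp "bright line" relaxes into a fuzzy gradation: a piecewise-linear membership function maps a measurement to a degree of compliance with a linguistic goal. The breakpoints and units are invented for illustration:

      def compliance_degree(conc, good=8.0, bad=12.0):
          """Degree (0..1) to which a measured concentration meets the goal
          'concentration is acceptable'. Below `good` it is fully acceptable,
          above `bad` fully unacceptable, with a linear gradation between."""
          if conc <= good:
              return 1.0
          if conc >= bad:
              return 0.0
          return (bad - conc) / (bad - good)

      for c in (7.5, 9.0, 10.5, 12.5):          # hypothetical measurements, mg/L
          print(f"{c:5.1f} mg/L -> compliance {compliance_degree(c):.2f}")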

  18. Computational issues in complex water-energy optimization problems: Time scales, parameterizations, objectives and algorithms

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step, and solve each local sub-problem through very fast linear network programming algorithms, and (c) the substantial
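
    Objective (b), linearizing each time step's combined water-energy allocation so it can be solved by a fast linear program, can be illustrated with a toy single-step problem; all quantities are hypothetical, and scipy's general-purpose linprog stands in for the specialized network-programming solvers the abstract mentions:

      from scipy.optimize import linprog

      # Toy one-step allocation: a reservoir serves a water demand node and a
      # hydropower turbine. Variables (hm3): x0 = supply to demand, x1 = turbine
      # release, x2 = spill, plus x3 = unmet-energy slack (GWh).
      inflow, storage0 = 120.0, 300.0
      water_demand = 60.0            # hm3, must be met exactly
      energy_coeff = 0.4             # GWh per hm3 (linearized production curve)
      energy_target = 20.0           # GWh

      c = [0.0, 0.0, 1.0, 100.0]     # minimize spill + heavily penalized energy slack
      A_eq = [[1.0, 0.0, 0.0, 0.0]]
      b_eq = [water_demand]
      A_ub = [[1.0, 1.0, 1.0, 0.0],              # releases limited by available water
              [0.0, -energy_coeff, 0.0, -1.0]]   # energy_coeff*x1 + slack >= target
      b_ub = [storage0 + inflow, -energy_target]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                    bounds=[(0, None)] * 4, method="highs")
      print("allocation:", res.x.round(2), " objective:", round(res.fun, 2))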

  19. Accurate gradient approximation for complex interface problems in 3D by an improved coupling interface method

    SciTech Connect

    Shu, Yu-Chen; Chern, I-Liang; Chang, Chien C.

    2014-10-15

    Most elliptic interface solvers become complicated for complex interface problems at those “exceptional points” where there are not enough neighboring interior points for high order interpolation. Such complication increases especially in three dimensions. Usually, the solvers are thus reduced to low order accuracy. In this paper, we classify these exceptional points and propose two recipes to maintain order of accuracy there, aiming at improving the previous coupling interface method [26]. Yet the idea is also applicable to other interface solvers. The main idea is to have at least first order approximations for second order derivatives at those exceptional points. Recipe 1 is to use the finite difference approximation for the second order derivatives at a nearby interior grid point, whenever this is possible. Recipe 2 is to flip domain signatures and introduce a ghost state so that a second-order method can be applied. This ghost state is a smooth extension of the solution at the exceptional point from the other side of the interface. The original state is recovered by a post-processing using nearby states and jump conditions. The choice of recipes is determined by a classification scheme of the exceptional points. The method renders the solution and its gradient uniformly second-order accurate in the entire computed domain. Numerical examples are provided to illustrate the second order accuracy of the presently proposed method in approximating the gradients of the original states for some complex interfaces which we had tested previously in two and three dimensions, and a real molecule (1D63), which is double-helix shaped and composed of hundreds of atoms.
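
    Recipe 1's fallback idea (when the usual stencil lacks neighbors at an exceptional point, take the second-derivative approximation at the nearest interior point, accepting first-order accuracy there) can be sketched in one dimension; this illustrates only the fallback, not the paper's 3D classification scheme:

      import numpy as np

      h = 0.1
      x = np.arange(0.0, 1.0 + h / 2, h)
      u = np.sin(x)                       # smooth test function, u'' = -sin(x)

      def second_derivative(u, i, h):
          """Central second-order stencil where both neighbors exist; otherwise
          fall back to the stencil at the nearest interior point, which is
          first-order accurate at the exceptional/boundary point."""
          if 0 < i < len(u) - 1:
              return (u[i - 1] - 2 * u[i] + u[i + 1]) / h**2
          j = 1 if i == 0 else len(u) - 2      # nearest interior point
          return (u[j - 1] - 2 * u[j] + u[j + 1]) / h**2

      for i in (0, 5):                    # boundary (fallback) vs interior point
          print(i, second_derivative(u, i, h), -np.sin(x[i]))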

  20. Speed and Complexity Characterize Attention Problems in Children with Localization-Related Epilepsy

    PubMed Central

    Berl, Madison; Terwilliger, Virginia; Scheller, Alexandra; Sepeta, Leigh; Walkowiak, Jenifer; Gaillard, William D.

    2015-01-01

    Objective: Children with epilepsy (EPI) have a higher rate of ADHD (28–70%) than typically developing (TD) children (5–10%); however, attention is multidimensional. Thus, we aimed to characterize the profile of attention difficulties in children with epilepsy. Methods: Seventy-five children with localization-related epilepsy ages 6–16 and 75 age-matched controls were evaluated using multimodal, multidimensional measures of attention including direct performance and parent ratings of attention as well as intelligence testing. We assessed group differences across attention measures, determined if parent rating predicted performance on attention measures, and examined if epilepsy characteristics were associated with attention skills. Results: The EPI group performed worse than the TD group on timed and complex aspects of attention (p<.05), while performance on simple visual and simple auditory attention tasks was comparable. Children with EPI were 12 times as likely as TD children to have clinically elevated symptoms of inattention as rated by parents, but ratings were a weak predictor of attention performance. Earlier age of onset was associated with slower motor speed (p<.01), but no other epilepsy-related clinical characteristics were associated with attention skills. Significance: This study clarifies the nature of the attention problems in pediatric epilepsy, which may be under-recognized. Children with EPI had difficulty with complex attention and rapid response, not simple attention. As such, they may not exhibit difficulty until later in primary school when demands increase. Parent report with standard ADHD screening tools may underdetect these higher-order attention difficulties. Thus, monitoring through direct neuropsychological performance is recommended. PMID:25940056

  1. Accurate gradient approximation for complex interface problems in 3D by an improved coupling interface method

    NASA Astrophysics Data System (ADS)

    Shu, Yu-Chen; Chern, I.-Liang; Chang, Chien C.

    2014-10-01

    Most elliptic interface solvers become complicated for complex interface problems at those “exceptional points” where there are not enough neighboring interior points for high order interpolation. Such complication increases especially in three dimensions. Usually, the solvers are thus reduced to low order accuracy. In this paper, we classify these exceptional points and propose two recipes to maintain order of accuracy there, aiming at improving the previous coupling interface method [26]. Yet the idea is also applicable to other interface solvers. The main idea is to have at least first order approximations for second order derivatives at those exceptional points. Recipe 1 is to use the finite difference approximation for the second order derivatives at a nearby interior grid point, whenever this is possible. Recipe 2 is to flip domain signatures and introduce a ghost state so that a second-order method can be applied. This ghost state is a smooth extension of the solution at the exceptional point from the other side of the interface. The original state is recovered by a post-processing using nearby states and jump conditions. The choice of recipes is determined by a classification scheme of the exceptional points. The method renders the solution and its gradient uniformly second-order accurate in the entire computed domain. Numerical examples are provided to illustrate the second order accuracy of the presently proposed method in approximating the gradients of the original states for some complex interfaces which we had tested previously in two and three dimensions, and a real molecule (1D63), which is double-helix shaped and composed of hundreds of atoms.

  2. Managing the Complexity of Design Problems through Studio-Based Learning

    ERIC Educational Resources Information Center

    Cennamo, Katherine; Brandt, Carol; Scott, Brigitte; Douglas, Sarah; McGrath, Margarita; Reimer, Yolanda; Vernon, Mitzi

    2011-01-01

    The ill-structured nature of design problems makes them particularly challenging for problem-based learning. Studio-based learning (SBL), however, has much in common with problem-based learning and indeed has a long history of use in teaching students to solve design problems. The purpose of this ethnographic study of an industrial design class,…

  3. Leadership and leadership development in healthcare settings – a simplistic solution to complex problems?

    PubMed Central

    McDonald, Ruth

    2014-01-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving ‘leadership’. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts. PMID:25337595

  4. Case study method and problem-based learning: utilizing the pedagogical model of progressive complexity in nursing education.

    PubMed

    McMahon, Michelle A; Christopher, Kimberly A

    2011-01-01

    As the complexity of health care delivery continues to increase, educators are challenged to determine educational best practices to prepare BSN students for the ambiguous clinical practice setting. Integrative, active, and student-centered curricular methods are encouraged to foster student ability to use clinical judgment for problem solving and informed clinical decision making. The proposed pedagogical model of progressive complexity in nursing education suggests gradually introducing students to complex and multi-contextual clinical scenarios through the utilization of case studies and problem-based learning activities, with the intention to transition nursing students into autonomous learners and well-prepared practitioners at the culmination of a nursing program. Exemplar curricular activities are suggested to potentiate student development of a transferable problem solving skill set and a flexible knowledge base to better prepare students for practice in future novel clinical experiences, which is a mutual goal for both educators and students. PMID:22718667

  5. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported that astronauts "were awakened again," as they had been the previous day. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  6. A framework to approach problems of forensic anthropology using complex networks

    NASA Astrophysics Data System (ADS)

    Caridi, Inés; Dorso, Claudio O.; Gallo, Pablo; Somigliana, Carlos

    2011-05-01

    We have developed a method to analyze and interpret emerging structures in a set of data which lacks some information. It has been conceived to be applied to the problem of getting information about people who disappeared in the Argentine province of Tucumán from 1974 to 1981. Although the military dictatorship in Argentina formally began in 1976 and lasted until 1983, the disappearance and assassination of people began some months earlier. During this period several circuits of Illegal Detention Centres (IDC) were set up in different locations all over the country. In these secret centres, disappeared people were illegally kept without any sort of constitutional guarantees, and later assassinated. Even today, the final destination of most of the disappeared people's remains is still unknown. The fundamental hypothesis in this work is that a group of people with the same political affiliation whose disappearances were closely related in time and space shared the same place of captivity (the same IDC or circuit of IDCs). This hypothesis makes sense when applied to the systematic method of repression and disappearances which was actually launched in Tucumán, Argentina (2007) [11]. In this work, the missing individuals are identified as nodes on a network and connections are established among them based on the individuals' attributes while they were alive, using rules to link them. In order to determine which rules are the most effective in defining the network, we use other kinds of knowledge available in this problem: previous results from the anthropological point of view (based on other sources of information, both oral and written, historical and anthropological data, etc.), and information about the place (one or more IDCs) where some people were kept during their captivity. For the best rules, a prediction about these people's possible destination is assigned (one or more IDCs where they could have been kept), and the success of the
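
    A hedged sketch of the rule-based linking described above (names, attributes, and thresholds are invented): individuals become nodes, an edge is added when two disappearances share a political affiliation and fall within a time-and-place window, and connected components then suggest candidate groups that may have shared an IDC:

      import networkx as nx
      from datetime import date

      # Hypothetical records: (id, affiliation, disappearance date, location)
      people = [
          ("A", "union", date(1975, 3, 2), "San Miguel"),
          ("B", "union", date(1975, 3, 9), "San Miguel"),
          ("C", "student", date(1975, 3, 10), "Famailla"),
          ("D", "union", date(1976, 1, 5), "San Miguel"),
      ]

      def linked(p, q, max_days=30):
          """Rule: same affiliation and location, disappearances within 30 days."""
          return (p[1] == q[1] and p[3] == q[3]
                  and abs((p[2] - q[2]).days) <= max_days)

      G = nx.Graph()
      G.add_nodes_from(p[0] for p in people)
      for i, p in enumerate(people):
          for q in people[i + 1:]:
              if linked(p, q):
                  G.add_edge(p[0], q[0])

      # Each component is a candidate group that may have shared an IDC.
      print([sorted(c) for c in nx.connected_components(G)])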

  7. Untangling the Complex Needs of People Experiencing Gambling Problems and Homelessness

    ERIC Educational Resources Information Center

    Holdsworth, Louise; Tiyce, Margaret

    2013-01-01

    People with gambling problems are now recognised among those at increased risk of homelessness, and the link between housing and gambling problems has been identified as an area requiring further research. This paper discusses the findings of a qualitative study that explored the relationship between gambling problems and homelessness. Interviews…

  8. Generalist solutions to complex problems: generating practice-based evidence - the example of managing multi-morbidity

    PubMed Central

    2013-01-01

    Background A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of ≥2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Discussion Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a ‘complex intervention’ (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Summary Answers to the complex problem of multi-morbidity won’t come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work by outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity. PMID:23919296

  9. New approach to the complex-action problem and its application to a nonperturbative study of superstring theory

    NASA Astrophysics Data System (ADS)

    Anagnostopoulos, K. N.; Nishimura, J.

    2002-11-01

    Monte Carlo simulations of a system whose action has an imaginary part are considered to be extremely difficult. We propose a new approach to this "complex-action problem," which utilizes a factorization property of distribution functions. The basic idea is quite general, and it removes the so-called overlap problem completely. Here we apply the method to a nonperturbative study of superstring theory using its matrix formulation. In this particular example, the distribution function turns out to be positive definite, which allows us to reduce the problem even further. Our numerical results suggest an intuitive explanation for the dynamical generation of 4D space-time.
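
    To make the complex-action problem concrete, the toy below uses naive phase reweighting, the standard approach whose overlap problem the paper's factorization method is designed to remove; the model S(x) = x²/2 + iλx is invented for illustration. One samples the phase-quenched weight and reweights observables by the phase; the shrinking denominator is the sign problem:

      import numpy as np

      rng = np.random.default_rng(0)
      lam = 1.0                      # strength of the imaginary part of the action

      # Toy model: S(x) = x**2/2 + 1j*lam*x. Sample the phase-quenched weight
      # exp(-x**2/2) (a plain Gaussian) and reweight by the phase exp(-1j*lam*x).
      x = rng.standard_normal(200_000)
      phase = np.exp(-1j * lam * x)

      num = np.mean(x * phase)       # <x e^{-i Gamma}>_0
      den = np.mean(phase)           # <e^{-i Gamma}>_0, shrinks as lam grows
      print("estimate <x> =", num / den)
      print("exact    <x> =", -1j * lam)      # from completing the square
      print("|<phase>| =", abs(den))          # small value signals a severe overlap problem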

  10. Complex Networks Approach for Analyzing the Correlation of Traditional Chinese Medicine Syndrome Evolvement and Cardiovascular Events in Patients with Stable Coronary Heart Disease

    PubMed Central

    Gao, Zhuye; Li, Siwei; Jiao, Yang; Zhou, Xuezhong; Fu, Changgeng; Shi, Dazhuo; Chen, Keji

    2015-01-01

    This is a multicenter prospective cohort study to analyze the correlation of traditional Chinese medicine (TCM) syndrome evolvement and cardiovascular events in patients with stable coronary heart disease (CHD). The impact of syndrome evolvement on cardiovascular events during the 6-month and 12-month follow-up was analyzed using complex networks approach. Results of verification using Chi-square test showed that the occurrence of cardiovascular events was positively correlated with syndrome evolvement when it evolved from toxic syndrome to Qi deficiency, blood stasis, or sustained toxic syndrome, when it evolved from Qi deficiency to blood stasis, toxic syndrome, or sustained Qi deficiency, and when it evolved from blood stasis to Qi deficiency. Blood stasis, Qi deficiency, and toxic syndrome are important syndrome factors for stable CHD. There are positive correlations between cardiovascular events and syndrome evolution from toxic syndrome to Qi deficiency or blood stasis, from Qi deficiency to blood stasis, or toxic syndrome and from blood stasis to Qi deficiency. These results indicate that stable CHD patients with pathogenesis of toxin consuming Qi, toxin leading to blood stasis, and mutual transformation of Qi deficiency and blood stasis are prone to recurrent cardiovascular events. PMID:25821500

  11. DIFFERENTIAL ANALYZER

    DOEpatents

    Sorensen, E.G.; Gordon, C.M.

    1959-02-10

    Improvements in analog computing machines of the class capable of evaluating differential equations, commonly termed differential analyzers, are described. In general form, the analyzer embodies a plurality of basic computer mechanisms for performing integration, multiplication, and addition, and means for directing the result of any one operation to another computer mechanism performing a further operation. In the device, numerical quantities are represented by the rotation of shafts, or the electrical equivalent of shafts.

  12. Learning by Preparing to Teach: Fostering Self-Regulatory Processes and Achievement during Complex Mathematics Problem Solving

    ERIC Educational Resources Information Center

    Muis, Krista R.; Psaradellis, Cynthia; Chevrier, Marianne; Di Leo, Ivana; Lajoie, Susanne P.

    2016-01-01

    We developed an intervention based on the learning by teaching paradigm to foster self-regulatory processes and better learning outcomes during complex mathematics problem solving in a technology-rich learning environment. Seventy-eight elementary students were randomly assigned to 1 of 2 conditions: learning by preparing to teach, or learning for…

  13. Does Visualization Enhance Complex Problem Solving? The Effect of Causal Mapping on Performance in the Computer-Based Microworld Tailorshop

    ERIC Educational Resources Information Center

    Öllinger, Michael; Hammon, Stephanie; von Grundherr, Michael; Funke, Joachim

    2015-01-01

    Causal mapping is often recognized as a technique to support strategic decisions and actions in complex problem situations. Such drawing of causal structures is supposed to particularly foster the understanding of the interaction of the various system elements and to further encourage holistic thinking. It builds on the idea that humans make use…

  14. Validity of the MicroDYN Approach: Complex Problem Solving Predicts School Grades beyond Working Memory Capacity

    ERIC Educational Resources Information Center

    Schweizer, Fabian; Wustenberg, Sascha; Greiff, Samuel

    2013-01-01

    This study examines the validity of the complex problem solving (CPS) test MicroDYN by investigating a) the relation between its dimensions--rule identification (exploration strategy), rule knowledge (acquired knowledge), rule application (control performance)--and working memory capacity (WMC), and b) whether CPS predicts school grades in…

  15. Linking Complex Problem Solving and General Mental Ability to Career Advancement: Does a Transversal Skill Reveal Incremental Predictive Validity?

    ERIC Educational Resources Information Center

    Mainert, Jakob; Kretzschmar, André; Neubert, Jonas C.; Greiff, Samuel

    2015-01-01

    Transversal skills, such as complex problem solving (CPS) are viewed as central twenty-first-century skills. Recent empirical findings have already supported the importance of CPS for early academic advancement. We wanted to determine whether CPS could also contribute to the understanding of career advancement later in life. Towards this end, we…

  16. Environmental Sensing of Expert Knowledge in a Computational Evolution System for Complex Problem Solving in Human Genetics

    NASA Astrophysics Data System (ADS)

    Greene, Casey S.; Hill, Douglas P.; Moore, Jason H.

    The relationship between interindividual variation in our genomes and variation in our susceptibility to common diseases is expected to be complex with multiple interacting genetic factors. A central goal of human genetics is to identify which DNA sequence variations predict disease risk in human populations. Our success in this endeavour will depend critically on the development and implementation of computational intelligence methods that are able to embrace, rather than ignore, the complexity of the genotype to phenotype relationship. To this end, we have developed a computational evolution system (CES) to discover genetic models of disease susceptibility involving complex relationships between DNA sequence variations. The CES approach is hierarchically organized and is capable of evolving operators of any arbitrary complexity. The ability to evolve operators distinguishes this approach from artificial evolution approaches using fixed operators such as mutation and recombination. Our previous studies have shown that a CES that can utilize expert knowledge about the problem in evolved operators significantly outperforms a CES unable to use this knowledge. This environmental sensing of external sources of biological or statistical knowledge is important when the search space is both rugged and large as in the genetic analysis of complex diseases. We show here that the CES is also capable of evolving operators which exploit one of several sources of expert knowledge to solve the problem. This is important for both the discovery of highly fit genetic models and because the particular source of expert knowledge used by evolved operators may provide additional information about the problem itself. This study brings us a step closer to a CES that can solve complex problems in human genetics in addition to discovering genetic models of disease.
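
    One way to read "evolved operators that exploit expert knowledge" is a mutation operator that draws replacement attributes with probability proportional to an external statistical score instead of uniformly. The sketch below illustrates only that idea; the scores and model representation are hypothetical, not the CES internals:

      import random

      random.seed(0)

      # Hypothetical expert-knowledge scores for 8 SNPs (e.g., from a prior
      # statistical screen); higher score = more promising attribute.
      snps = ["SNP%d" % i for i in range(8)]
      scores = [0.1, 0.3, 2.0, 0.2, 1.5, 0.1, 0.4, 0.2]

      def biased_mutation(model):
          """Replace one attribute of the model, choosing the replacement with
          probability proportional to its expert-knowledge score (rather than
          uniformly, as a fixed mutation operator would)."""
          out = list(model)
          i = random.randrange(len(out))
          candidates = [s for s in snps if s not in out]
          weights = [scores[snps.index(s)] for s in candidates]
          out[i] = random.choices(candidates, weights=weights, k=1)[0]
          return out

      model = ["SNP0", "SNP5"]              # a 2-attribute candidate model
      print(biased_mutation(model))          # likely to pull in SNP2 or SNP4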

  17. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data is then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.
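
    The comparison against calibration files can be read as classic multi-wavelength Beer-Lambert unmixing, A(λ) ≈ Σ ε_j(λ)·c_j, solved for the concentrations c by non-negative least squares. The spectra below are fabricated for illustration; this is not ChemScan's proprietary algorithm:

      import numpy as np
      from scipy.optimize import nnls

      # Fabricated calibration spectra: absorptivities of 2 substances at 5 wavelengths.
      E = np.array([[0.9, 0.1],
                    [0.7, 0.3],
                    [0.4, 0.6],
                    [0.2, 0.8],
                    [0.1, 0.9]])
      c_true = np.array([2.0, 0.5])                    # true concentrations
      A = E @ c_true + 0.01 * np.random.default_rng(0).standard_normal(5)  # noisy scan

      c_est, resid = nnls(E, A)    # solve A ~ E @ c with c >= 0
      print("estimated concentrations:", c_est.round(3))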

  18. Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The M200 originated in the 1970's under an Ames Research Center/Stanford University contract to develop a small, lightweight gas analyzer for Viking Landers. Although the unit was not used on the spacecraft, it was further developed by The National Institute for Occupational Safety and Health (NIOSH). Three researchers from the project later formed Microsensor Technology, Inc. (MTI) to commercialize the analyzer. The original version (Micromonitor 500) was introduced in 1982, and the M200 in 1988. The M200, a more advanced version, features dual gas chromatographs, which separate a gaseous mixture into components and measure concentrations of each gas. It is useful for monitoring gas leaks, chemical spills, etc. Many analyses are completed in less than 30 seconds, and a wide range of mixtures can be analyzed.

  19. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information. Part 1—Methodology

    NASA Astrophysics Data System (ADS)

    Mejer Hansen, Thomas; Skou Cordua, Knud; Caroline Looms, Majken; Mosegaard, Klaus

    2013-03-01

    From a probabilistic point-of-view, the solution to an inverse problem can be seen as a combination of independent states of information quantified by probability density functions. Typically, these states of information are provided by a set of observed data and some a priori information on the solution. The combined state of information (i.e. the solution to the inverse problem) is a probability density function typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods for solving such probabilistically formulated inverse problems by sampling the a posteriori probability density function. In order to describe the a priori probability density function, we consider both simple Gaussian models and more complex (and realistic) a priori models based on higher order statistics. These a priori models can be used with both linear and non-linear inverse problems. For linear inverse Gaussian problems, we make use of least-squares and kriging-based methods to describe the a posteriori probability density function directly. For general non-linear (i.e. non-Gaussian) inverse problems, we make use of the extended Metropolis algorithm to sample the a posteriori probability density function. Together with the extended Metropolis algorithm, we use sequential Gibbs sampling, which allows computationally efficient sampling of complex a priori models. The toolbox can be applied to any inverse problem as long as a way of solving the forward problem is provided. Here we demonstrate the methods and algorithms available in SIPPI. An application of SIPPI to a tomographic cross-borehole inverse problem is presented in a second part of this paper.
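
    A minimal sketch of the sampling idea SIPPI implements, written in Python rather than Matlab and far simpler than the extended Metropolis algorithm with sequential Gibbs proposals: a random-walk Metropolis sampler for a posterior combining a Gaussian prior with an invented nonlinear forward model (forward model, noise level, and data are all hypothetical):

      import numpy as np

      rng = np.random.default_rng(0)

      def forward(m):                 # hypothetical nonlinear forward problem g(m)
          return np.array([m[0] ** 2 + m[1], np.sin(m[0]) + m[1]])

      d_obs = np.array([1.2, 0.9])    # invented observed data
      sigma = 0.1                     # data noise standard deviation

      def log_post(m):
          misfit = np.sum((forward(m) - d_obs) ** 2) / (2 * sigma ** 2)
          log_prior = -np.sum(m ** 2) / 2          # standard Gaussian prior
          return log_prior - misfit

      m = np.zeros(2)
      samples = []
      for _ in range(20_000):          # random-walk Metropolis
          m_prop = m + 0.1 * rng.standard_normal(2)
          if np.log(rng.random()) < log_post(m_prop) - log_post(m):
              m = m_prop
          samples.append(m.copy())

      print("posterior mean:", np.mean(samples[5000:], axis=0).round(3))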

  20. Blood Analyzer

    NASA Technical Reports Server (NTRS)

    1992-01-01

    In the 1970's, NASA provided funding for development of an automatic blood analyzer for Skylab at the Oak Ridge National Laboratory (ORNL). ORNL devised "dynamic loading," which employed a spinning rotor to load, transfer, and analyze blood samples by centrifugal processing. A refined, commercial version of the system was produced by ABAXIS and is marketed as portable ABAXIS MiniLab MCA. Used in a doctor's office, the equipment can perform 80 to 100 chemical blood tests on a single drop of blood and report results in five minutes. Further development is anticipated.

  1. Complex Problem Solving in Radiologic Technology: Understanding the Roles of Experience, Reflective Judgment, and Workplace Culture

    ERIC Educational Resources Information Center

    Yates, Jennifer L.

    2011-01-01

    The purpose of this research study was to explore the process of learning and development of problem solving skills in radiologic technologists. The researcher sought to understand the nature of difficult problems encountered in clinical practice, to identify specific learning practices leading to the development of professional expertise, and to…

  2. Introducing the Hero Complex and the Mythic Iconic Pathway of Problem Gambling

    ERIC Educational Resources Information Center

    Nixon, Gary; Solowoniuk, Jason

    2009-01-01

    Early research into the motivations behind problem gambling reflected separate paradigms of thought splitting our understanding of the gambler into divergent categories. However, over the past 25 years, problem gambling is now best understood to arise from biological, environmental, social, and psychological processes, and is now encapsulated…

  3. The Role of Prior Knowledge and Problem Contexts in Students' Explanations of Complex System

    ERIC Educational Resources Information Center

    Barth-Cohen, Lauren April

    2012-01-01

    The purpose of this dissertation is to study students' competencies in generating scientific explanations within the domain of complex systems, an interdisciplinary area in which students tend to have difficulties. While considering students' developing explanations of how complex systems work, I investigate the role of prior knowledge…

  4. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting Dielectric Sensor to determine the nutrient level and analyze plant nutrient solutions in the CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to dielectric property changes in materials, at levels much lower than conventional sensors can detect. The analyzer has exceptional capabilities for predicting composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications like agricultural products, petrochemicals, food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.

  5. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  6. Oxygen analyzer

    DOEpatents

    Benner, William H.

    1986-01-01

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  7. Atmosphere Analyzer

    NASA Technical Reports Server (NTRS)

    1982-01-01

    California Measurements, Inc.'s model PC-2 Aerosol Particle Analyzer is produced in both airborne and ground-use versions. Originating from NASA technology, it quickly and accurately detects minute mass loadings on a quartz crystal, making it a highly sensitive detector of fine particles suspended in air. When combined with a suitable air delivery system, it provides immediate information on the size distribution and mass concentrations of aerosols. William Chiang obtained a NASA license for multiple-crystal oscillator technology and initially developed a particle analyzer for NASA use with Langley Research Center assistance. Later his company produced the modified PC-2 for commercial applications. Brunswick Corporation uses the device for atmospheric research and in studies of smoke particles in fires. The PC-2 is used by pharmaceutical and chemical companies in research on inhalation toxicology and environmental health. It is also useful in testing various filters for safety masks and nuclear installations.
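
    The quartz-crystal principle behind such instruments is commonly described by the Sauerbrey relation, Δf = -(2f₀²/(A·√(ρq·μq)))·Δm: deposited mass lowers the crystal's resonant frequency in proportion. A back-of-the-envelope sketch using the standard textbook quartz constants (the PC-2's actual calibration will differ):

      RHO_Q = 2.648        # quartz density, g/cm^3
      MU_Q = 2.947e11      # quartz shear modulus, g/(cm*s^2)

      def sauerbrey_mass(df_hz, f0_hz, area_cm2):
          """Mass loading (grams) inferred from a frequency shift df via the
          Sauerbrey relation df = -2*f0^2/(A*sqrt(rho*mu)) * dm."""
          c = 2.0 * f0_hz ** 2 / (RHO_Q * MU_Q) ** 0.5   # sensitivity, Hz*cm^2/g
          return -df_hz * area_cm2 / c

      # A 10 MHz crystal of 0.3 cm^2 active area reading a -5 Hz shift:
      dm = sauerbrey_mass(-5.0, 10e6, 0.3)
      print(f"collected mass ~ {dm * 1e9:.1f} ng")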

  8. Contextual approach to technology assessment: Implications for one-factor fix solutions to complex social problems

    NASA Technical Reports Server (NTRS)

    Mayo, L. H.

    1975-01-01

    The contextual approach is discussed, which undertakes to demonstrate that technology assessment assists in the identification of the full range of implications of taking a particular action and facilitates the consideration of alternative means by which the total affected social problem context might be changed by available project options. It is found that the social impacts of an application on participants, institutions, processes, and social interests, and the accompanying interactions, may not only induce modifications in the problem context delineated for examination with respect to the design, operations, regulation, and use of the posited application, but also affect related social problem contexts.

  9. MULTICHANNEL ANALYZER

    DOEpatents

    Kelley, G.G.

    1959-11-10

    A multichannel pulse analyzer having several window amplifiers, each amplifier serving one group of channels, with a single fast pulse-lengthener and a single novel interrogation circuit serving all channels is described. A pulse followed too closely timewise by another pulse is disregarded by the interrogation circuit to prevent errors due to pulse pileup. The window amplifiers are connected to the pulse lengthener output, rather than the linear amplifier output, so need not have the fast response characteristic formerly required.
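
    The pileup rule described (disregard a pulse followed too closely in time by another) maps naturally onto a simple software filter over time-stamped pulse heights before histogramming into channels; the timestamps, dead window, and channel width below are hypothetical:

      import numpy as np

      # Hypothetical (time_us, height_volts) pulse stream from a detector.
      pulses = [(1.0, 2.3), (5.0, 1.1), (5.4, 0.9),   # the 5.0 us pulse is piled up
                (9.2, 2.2), (14.0, 1.2)]
      MIN_SEPARATION = 1.0      # us: reject a pulse whose successor arrives sooner

      accepted = [h for (t, h), (t_next, _) in zip(pulses, pulses[1:] + [(np.inf, 0)])
                  if t_next - t >= MIN_SEPARATION]

      # Histogram accepted pulse heights into analyzer channels 0.5 V wide.
      counts, edges = np.histogram(accepted, bins=np.arange(0.0, 3.0, 0.5))
      print(counts, edges)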

  10. Metabolic analyzer

    NASA Technical Reports Server (NTRS)

    Lem, J. D.

    1977-01-01

    The metabolic analyzer was designed to support experiment M171. It operates on the so-called open circuit method to measure a subject's metabolic activity in terms of oxygen consumed, carbon dioxide produced, minute volume, respiratory exchange ratio, and tidal volume or vital capacity. The system operates in either of two modes. (1) In Mode I, inhaled respiratory volumes are actually measured by a piston spirometer. (2) In Mode II, inhaled volumes are calculated from the exhaled volume and the measured inhaled and exhaled nitrogen concentrations. This second mode was the prime mode for Skylab. Following is a brief description of the various subsystems and their operation.
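
    Mode II's calculation, inferring inhaled volume from exhaled volume and the two nitrogen concentrations, is the standard Haldane transformation (nitrogen is neither consumed nor produced, so V_I·FiN₂ = V_E·FeN₂). A sketch with typical illustrative gas fractions, not Skylab data:

      def open_circuit_metabolics(VE, FiO2, FiN2, FeO2, FeCO2, FeN2):
          """Haldane transformation: the nitrogen balance gives inhaled volume,
          then O2 consumption and CO2 production follow. Volumes in L/min."""
          VI = VE * FeN2 / FiN2              # from V_I * FiN2 = V_E * FeN2
          VO2 = VI * FiO2 - VE * FeO2        # oxygen consumed
          VCO2 = VE * FeCO2                  # inspired CO2 ~ 0
          return VI, VO2, VCO2, VCO2 / VO2   # last value: respiratory exchange ratio

      # Illustrative resting values: VE = 8 L/min, room air in, typical expired mix.
      VI, VO2, VCO2, RER = open_circuit_metabolics(
          VE=8.0, FiO2=0.2093, FiN2=0.7904, FeO2=0.17, FeCO2=0.035, FeN2=0.795)
      print(f"VI={VI:.2f} L/min VO2={VO2:.3f} VCO2={VCO2:.3f} RER={RER:.2f}")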