Science.gov

Sample records for analyzing complex problems

  1. Analyzing the many skills involved in solving complex physics problems

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Wieman, Carl E.

    2015-05-01

    We have empirically identified over 40 distinct sub-skills that affect a person's ability to solve complex problems in many different contexts. The identification of so many sub-skills explains why it has been so difficult to teach or assess problem solving as a single skill. The existence of these sub-skills is supported by several studies comparing a wide range of individuals' strengths and weaknesses in these sub-skills, their "problem solving fingerprint," while solving different types of problems including a classical mechanics problem, quantum mechanics problems, and a complex trip-planning problem with no physics. We see clear differences in the problem solving fingerprint of physics and engineering majors compared to the elementary education majors that we tested. The implications of these findings for guiding the teaching and assessing of problem solving in physics instruction are discussed.

  2. The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems

    ERIC Educational Resources Information Center

    Andrews, Paul W.; Thomson, J. Anderson, Jr.

    2009-01-01

    Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis, and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…

  3. Analyzing elastoplastic large deformation problems with the complex variable element-free Galerkin method

    NASA Astrophysics Data System (ADS)

    Li, D. M.; Liew, K. M.; Cheng, Y. M.

    2014-06-01

    Using the complex variable moving least-squares (CVMLS) approximation, a complex variable element-free Galerkin (CVEFG) method for two-dimensional elastoplastic large deformation problems is presented. This meshless method has higher computational precision and efficiency because in the CVMLS approximation, the trial function of a two-dimensional problem is formed with a one-dimensional basis function. For two-dimensional elastoplastic large deformation problems, the Galerkin weak form is employed to obtain its equation system. The penalty method is used to impose essential boundary conditions. Then the corresponding formulae of the CVEFG method for two-dimensional elastoplastic large deformation problems are derived. In comparison with the conventional EFG method, our study shows that the CVEFG method has higher precision and efficiency. For illustration purpose, a few selected numerical examples are presented to demonstrate the advantages of the CVEFG method.
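
    The penalty treatment of essential boundary conditions mentioned above has a standard generic form. As a hedged sketch (the generic textbook functional, not the paper's exact formulae), the Galerkin weak-form functional is augmented as

        \Pi^{*}(\mathbf{u}) = \Pi(\mathbf{u})
          + \frac{\alpha}{2} \int_{\Gamma_{u}} (\mathbf{u}-\bar{\mathbf{u}}) \cdot (\mathbf{u}-\bar{\mathbf{u}}) \,\mathrm{d}\Gamma

    where \Gamma_u is the essential boundary, \bar{\mathbf{u}} the prescribed displacement, and \alpha a large penalty factor; making \Pi^{*} stationary enforces the boundary condition approximately without modifying the CVMLS trial space.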

  4. The bright side of being blue: Depression as an adaptation for analyzing complex problems

    PubMed Central

    Andrews, Paul W.; Thomson, J. Anderson

    2009-01-01

    Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression. PMID:19618990

  5. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

    According to Complexity Theory, there is complexity in a system when a functional requirement is not satisfied. Several studies have treated Complexity Theory on the basis of Axiomatic Design; however, they focus on reducing complexity, and none addresses a method for analyzing the complexity in a system. This paper therefore puts forth a method of analyzing complexity that seeks to make up for that deficiency. To develop the method of analyzing complexity based on additional effect, the paper introduces two concepts: ideal effect and additional effect. The method combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ) and helps designers analyze complexity through additional effects. A case study shows the application of the process.

  6. Analyzing Static Loading of Complex Structures

    NASA Technical Reports Server (NTRS)

    Gallear, D. C.

    1986-01-01

    Critical loading conditions are determined from analysis of each structural element. The Automated Thrust Structures Loads and Stresses (ATLAS) system is a series of programs developed to analyze elements of a complex structure under static-loading conditions. ATLAS calculates internal loads, beam-bending loads, column- and web-buckling loads, beam and panel stresses, and beam-corner stresses. The programs are written in FORTRAN IV and Assembler for batch execution.

  7. The Politics of Analyzing Social Problems.

    ERIC Educational Resources Information Center

    Ross, Robert; Staines, Graham L.

    Two crucial processes are discussed: (1) that through which social problems become public issues; and (2) that through which conflicts between competing diagnoses of, and responses to, publicly recognized social problems are resolved. Regularities in these transformations are conceptualized as follows: groups differ in their definitions of social…

  8. Analyzing and Detecting Problems in Systems of Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Ackermann, Christopher; Stratton, William C.; Sibol, Deane E.; Godfrey, Sally

    2008-01-01

    Many software systems are evolving complex systems of systems (SoS) for which inter-system communication is mission-critical. Evidence indicates that transmission failures and performance issues are not uncommon occurrences. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach addressing such problems. In this paper, we present an approach for analyzing inter-system communications with the goal of uncovering both transmission errors and performance problems. Our approach consists of a visualization component and an evaluation component. While the visualization of the observed communication aims to facilitate understanding, the evaluation component automatically checks the conformance of an observed communication (actual) to a desired one (planned). The actual and the planned are represented as sequence diagrams, and the evaluation algorithm checks the conformance of the actual to the planned diagram. We have applied our approach to the communication of aerospace systems and were successful in detecting and resolving even subtle and long-standing transmission problems.
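
    The conformance check described above aligns an observed (actual) message sequence against a planned one. A minimal Python sketch of that idea, using difflib for the alignment (the message names and the alignment strategy are illustrative assumptions, not the authors' algorithm):

        import difflib

        def check_conformance(planned, actual):
            """Report where the observed message sequence deviates from the plan."""
            matcher = difflib.SequenceMatcher(None, planned, actual)
            for op, p1, p2, a1, a2 in matcher.get_opcodes():
                if op != "equal":
                    print(f"{op}: planned {planned[p1:p2]} vs actual {actual[a1:a2]}")

        planned = ["REQ", "ACK", "DATA", "DONE"]
        actual = ["REQ", "DATA", "DATA", "DONE"]  # ACK dropped, DATA duplicated
        check_conformance(planned, actual)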

  9. Analyzing Complex Metabolomic Networks: Experiments and Simulation

    NASA Astrophysics Data System (ADS)

    Steuer, R.; Kurths, J.; Fiehn, O.; Weckwerth, W.

    2002-03-01

    In recent years, remarkable advances in molecular biology have enabled us to measure the behavior of the complex regulatory networks underlying biological systems. In particular, high-throughput techniques, such as gene expression arrays, allow fast acquisition of a large number of simultaneously measured variables. Similar to gene expression, the analysis of metabolomic datasets yields a huge number of metabolite co-regulations: metabolites are the end products of cellular regulatory processes, and their levels can be regarded as the ultimate response to genetic or environmental changes. In this presentation we focus on the topological description of such networks, using both experimental data and simulations. In particular, we discuss the possibility of deducing novel links between metabolites using concepts from (nonlinear) time series analysis and information theory.
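
    One concrete way to "deduce novel links" from information theory, as suggested above, is to estimate the mutual information between two metabolite concentration series from binned measurements. A minimal sketch (the binning choice and the synthetic data are assumptions for illustration):

        import numpy as np

        def mutual_information(x, y, bins=16):
            """Histogram estimate of I(X;Y) in nats."""
            counts, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = counts / counts.sum()
            px = pxy.sum(axis=1, keepdims=True)  # marginal of x
            py = pxy.sum(axis=0, keepdims=True)  # marginal of y
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        rng = np.random.default_rng(0)
        a = rng.standard_normal(5000)
        print(mutual_information(a, a + 0.5 * rng.standard_normal(5000)))  # large
        print(mutual_information(a, rng.standard_normal(5000)))            # near zero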

  10. Software Analyzes Complex Systems in Real Time

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts, related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real-time and limited hardware environments, and to be utilized by non-expert systems applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Neptune Voyager Encounter. It is also finding applications outside of the Space Agency.

  11. Quantum Computing: Solving Complex Problems

    ScienceCinema

    DiVincenzo, David [IBM Watson Research Center

    2009-09-01

    One of the motivating ideas of quantum computation was that there could be a new kind of machine that would solve hard problems in quantum mechanics. There has been significant progress towards the experimental realization of these machines (which I will review), but there are still many questions about how such a machine could solve computational problems of interest in quantum physics. New categorizations of the complexity of computational problems have now been invented to describe quantum simulation. The bad news is that some of these problems are believed to be intractable even on a quantum computer, falling into a quantum analog of the NP class. The good news is that there are many other new classifications of tractability that may apply to several situations of physical interest.

  12. Quantum Computing: Solving Complex Problems

    SciTech Connect

    DiVincenzo, David

    2007-04-12

    One of the motivating ideas of quantum computation was that there could be a new kind of machine that would solve hard problems in quantum mechanics. There has been significant progress towards the experimental realization of these machines (which I will review), but there are still many questions about how such a machine could solve computational problems of interest in quantum physics. New categorizations of the complexity of computational problems have now been invented to describe quantum simulation. The bad news is that some of these problems are believed to be intractable even on a quantum computer, falling into a quantum analog of the NP class. The good news is that there are many other new classifications of tractability that may apply to several situations of physical interest.

  14. Analyzing the Origins of Childhood Externalizing Behavioral Problems

    ERIC Educational Resources Information Center

    Barnes, J. C.; Boutwell, Brian B.; Beaver, Kevin M.; Gibson, Chris L.

    2013-01-01

    Drawing on a sample of twin children from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B; Snow et al., 2009), the current study analyzed 2 of the most prominent predictors of externalizing behavioral problems (EBP) in children: (a) parental use of spankings and (b) childhood self-regulation. A variety of statistical techniques were…

  15. Analyzing the origins of childhood externalizing behavioral problems.

    PubMed

    Barnes, J C; Boutwell, Brian B; Beaver, Kevin M; Gibson, Chris L

    2013-12-01

    Drawing on a sample of twin children from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B; Snow et al., 2009), the current study analyzed 2 of the most prominent predictors of externalizing behavioral problems (EBP) in children: (a) parental use of spankings and (b) childhood self-regulation. A variety of statistical techniques were employed, and, overall, the findings can be summarized into 2 points. First, the results show that the relationships among spanking, self-regulation, and EBP are highly nuanced in that multiple explanations for their intercorrelations appear to fit the data (e.g., bidirectional relationships and shared methods variance). Second, genetic influences accounted for variance in each variable (EBP, spankings received, self-regulation) and even explained a portion of the covariance among the different variables. Thus, research that does not consider genetic influences when analyzing these associations runs a risk of model misspecification. PMID:23477531

  16. Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map

    ERIC Educational Resources Information Center

    Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng

    2004-01-01

    This paper proposes a methodology for calculating both the difficulty of basic problems and the difficulty of solving a problem. The method for calculating a problem's difficulty follows the process of constructing the problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…

  17. Analyzing Large Protein Complexes by Structural Mass Spectrometry

    PubMed Central

    Kirshenbaum, Noam; Michaelevski, Izhak; Sharon, Michal

    2010-01-01

    Living cells control and regulate their biological processes through the coordinated action of a large number of proteins that assemble themselves into an array of dynamic, multi-protein complexes [1]. To gain a mechanistic understanding of the various cellular processes, it is crucial to determine the structure of such protein complexes and reveal how their structural organization dictates their function. Many aspects of multi-protein complexes are, however, difficult to characterize, due to their heterogeneous nature, asymmetric structure, and dynamics. Therefore, new approaches are required for the study of the tertiary levels of protein organization. One of the emerging structural biology tools for analyzing macromolecular complexes is mass spectrometry (MS) [2-5]. This method yields information on complex protein composition, subunit stoichiometry, and structural topology. The power of MS derives from its high sensitivity and, as a consequence, low sample requirement, which enables examination of protein complexes expressed at endogenous levels. Another advantage is the speed of analysis, which allows monitoring of reactions in real time. Moreover, the technique can simultaneously measure the characteristics of separate populations co-existing in a mixture. Here, we describe a detailed protocol for the application of structural MS to the analysis of large protein assemblies. The procedure begins with the preparation of gold-coated capillaries for nanoflow electrospray ionization (nESI). It then continues with sample preparation, emphasizing buffer conditions that should be compatible with nESI on the one hand and maintain complexes intact on the other. We then explain, step-by-step, how to optimize the experimental conditions for high mass measurements and acquire MS and tandem MS spectra. Finally, we chart the data processing and analyses that follow. Rather than attempting to characterize every aspect of protein assemblies, this protocol…

  18. Analyzing patterns in experts' approaches to solving experimental problems

    NASA Astrophysics Data System (ADS)

    Čančula, Maja Poklinek; Planinšič, Gorazd; Etkina, Eugenia

    2015-04-01

    We report detailed observations of three pairs of expert scientists and a pair of advanced undergraduate students solving an experimental optics problem. Using a new method ("transition graphs") of visualizing sequences of logical steps, we were able to compare the groups and identify patterns that could not be found using previously existing methods. While the problem solving of undergraduates significantly differed from that of experts at the beginning of the process, it gradually became more similar to the expert problem solving. We mapped problem solving steps and their sequence to the elements of an approach to teaching and learning physics called Investigative Science Learning Environment (ISLE), and we speculate that the ISLE educational framework closely represents the actual work of physicists.

  19. Special Education Provision in Nigeria: Analyzing Contexts, Problems, and Prospects

    ERIC Educational Resources Information Center

    Obiakor, Festus E.; Offor, MaxMary Tabugbo

    2011-01-01

    Nigeria has made some efforts to educate all of its citizenry, including those with disabilities. And, it has struggled to make sure that programs are available to those who need them. However, its traditional, sociocultural, and educational problems have prevented some programmatic consistency and progress. As a result, the special education…

  20. Program for Analyzing Flows in a Complex Network

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
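
    As a toy illustration of the Newton-Raphson iteration that GFSSP combines with successive substitution, consider a single internal node joining two branches whose flow follows a square-root pressure-drop law (the flow law and all constants below are invented for illustration, not GFSSP's actual correlations):

        import numpy as np

        def branch_flow(c, p_up, p_dn):
            dp = p_up - p_dn
            return c * np.sign(dp) * np.sqrt(abs(dp))

        def residual(p_mid, p_in=100e3, p_out=0.0, c1=1e-4, c2=2e-4):
            # mass conservation at the internal node: inflow - outflow = 0
            return branch_flow(c1, p_in, p_mid) - branch_flow(c2, p_mid, p_out)

        p = 50e3  # initial guess for the node pressure [Pa]
        for _ in range(50):
            r = residual(p)
            if abs(r) < 1e-12:
                break
            drdp = (residual(p + 1.0) - r) / 1.0  # finite-difference derivative
            p -= r / drdp
        print(f"converged node pressure: {p:.0f} Pa")  # 20000 Pa for these constants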

  1. Complex Problem Solving--More than Reasoning?

    ERIC Educational Resources Information Center

    Wustenberg, Sascha; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This study investigates the internal structure and construct validity of Complex Problem Solving (CPS), which is measured by a "Multiple-Item-Approach." It is tested whether (a) three facets of CPS, "rule identification" (adequateness of strategies), "rule knowledge" (generated knowledge), and "rule application" (ability to control a system), can be…

  2. Selecting model complexity in learning problems

    SciTech Connect

    Buescher, K.L.; Kumar, P.R.

    1993-10-01

    To learn (or generalize) from noisy data, one must resist the temptation to pick a model for the underlying process that overfits the data. Many existing techniques solve this problem at the expense of requiring the evaluation of an absolute, a priori measure of each model's complexity. We present a method that does not. Instead, it uses a natural, relative measure of each model's complexity. This method first creates a pool of "simple" candidate models using part of the data and then selects from among these by using the rest of the data.
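
    A toy version of that recipe, with polynomial degree standing in for the relative complexity of the candidate models (the data, split, and degree range are invented):

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 60)
        y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

        # build a pool of "simple" candidates on half the data ...
        idx = rng.permutation(x.size)
        fit, hold = idx[:30], idx[30:]
        pool = {deg: np.polyfit(x[fit], y[fit], deg) for deg in range(1, 10)}

        # ... then select among them with the held-out half
        errs = {deg: np.mean((np.polyval(c, x[hold]) - y[hold]) ** 2)
                for deg, c in pool.items()}
        best = min(errs, key=errs.get)
        print(f"selected degree {best}, held-out MSE {errs[best]:.3f}")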

  3. Refined scale-dependent permutation entropy to analyze systems complexity

    NASA Astrophysics Data System (ADS)

    Wu, Shuen-De; Wu, Chiu-Wen; Humeau-Heurtier, Anne

    2016-05-01

    Multiscale entropy (MSE) has become a prevailing method to quantify the complexity of systems. Unfortunately, MSE has a temporal complexity of O(N^2), which is unrealistic for long time series. Moreover, MSE relies on the sample entropy computation, which is length-dependent and leads to large variance and possibly undefined entropy values for short time series. Here, we propose and introduce a new multiscale complexity measure, the refined scale-dependent permutation entropy (RSDPE). Through the processing of different kinds of synthetic data and real signals, we show that RSDPE behaves much like MSE. Furthermore, RSDPE has a temporal complexity of O(N). Finally, RSDPE has the advantage of being much less length-dependent than MSE. From all this, we conclude that RSDPE outperforms MSE in terms of computational cost and accuracy.
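
    For reference, the ordinal-pattern (permutation) entropy that RSDPE builds on takes only a few lines; this is the standard Bandt-Pompe estimator, not the authors' refined scale-dependent variant:

        import math
        from collections import Counter

        import numpy as np

        def permutation_entropy(x, order=3, delay=1):
            """Normalized Bandt-Pompe permutation entropy in [0, 1]."""
            n = len(x) - (order - 1) * delay
            counts = Counter(
                tuple(np.argsort(x[i:i + order * delay:delay])) for i in range(n)
            )
            p = np.array(list(counts.values()), dtype=float) / n
            return float(-(p * np.log(p)).sum() / math.log(math.factorial(order)))

        rng = np.random.default_rng(0)
        print(permutation_entropy(rng.standard_normal(10000)))      # ~1, white noise
        print(permutation_entropy(np.sin(0.1 * np.arange(10000))))  # low, regular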

  4. Analyzing Complex and Structured Data via Unsupervised Learning Techniques

    NASA Astrophysics Data System (ADS)

    Polsterer, Kai Lars; Gieseke, Fabian; Gianniotis, Nikos; Kügler, Dennis

    2015-08-01

    In the last decades, more and more dedicated all-sky surveys have created an enormous amount of data that is publicly available on the internet. The resulting datasets contain spatial, spectral, and temporal information exhibiting complex structures in their respective domains. The capability to deal with morphological features, spectral signatures, and complex time series data has become very important but is still a challenging task. A common approach when processing this kind of structured data is to extract representative features and use those for further analysis. We present unsupervised learning approaches that help to visualize and cluster these complex datasets, e.g., by deriving rotation- and translation-invariant prototypes or by capturing the latent dynamics of time series with echo state networks instead of hand-crafted features.

  5. New Approach to Analyzing Physics Problems: A Taxonomy of Introductory Physics Problems

    ERIC Educational Resources Information Center

    Teodorescu, Raluca E.; Bennhold, Cornelius; Feldman, Gerald; Medsker, Larry

    2013-01-01

    This paper describes research on a classification of physics problems in the context of introductory physics courses. This classification, called the Taxonomy of Introductory Physics Problems (TIPP), relates physics problems to the cognitive processes required to solve them. TIPP was created in order to design educational objectives, to develop…

  6. Fractal applications to complex crustal problems

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1989-01-01

    Complex scale-invariant problems obey fractal statistics. The basic definition of a fractal distribution is that the number of objects with a characteristic linear dimension greater than r satisfies the relation N ~ r^(-D), where D is the fractal dimension. Fragmentation often satisfies this relation, as does the distribution of earthquakes. The classic relationship between the length of a rocky coastline and the step length can be derived from this relation. Power-law relations for spectra can also be related to fractal dimensions; topography and gravity are examples. Spectral techniques can be used to obtain maps of fractal dimension and roughness amplitude. These provide a quantitative measure of texture analysis. It is argued that the distribution of stress and strength in a complex crustal region, such as the Alps, is fractal. Based on this assumption, the observed frequency-magnitude relation for the seismicity in the region can be derived.
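
    The fractal dimension D in the relation above is simply the negative slope of the size-frequency distribution on log-log axes, so it can be estimated by a linear fit; a quick check on synthetic fragment counts (the target D = 2.5 and the noise level are invented):

        import numpy as np

        rng = np.random.default_rng(1)
        r = np.logspace(0, 2, 20)                                  # characteristic sizes
        n = r ** -2.5 * (1 + 0.05 * rng.standard_normal(r.size))   # N(>r) ~ r^-D
        slope, _ = np.polyfit(np.log(r), np.log(n), 1)
        print(f"estimated fractal dimension D = {-slope:.2f}")     # ~2.5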

  7. Complex energies and the polyelectronic Stark problem

    NASA Astrophysics Data System (ADS)

    Themelis, Spyros I.; Nicolaides, Cleanthes A.

    2000-12-01

    The problem of computing the energy shifts and widths of ground or excited N-electron atomic states perturbed by weak or strong static electric fields is dealt with by formulating a state-specific complex eigenvalue Schrödinger equation (CESE), where the complex energy contains the field-induced shift and width. The CESE is solved to all orders nonperturbatively, by using separately optimized N-electron function spaces, composed of real and complex one-electron functions, the latter being functions of a complex coordinate. The use of such spaces is a salient characteristic of the theory, leading to economy and manageability of calculation in terms of a two-step computational procedure. The first step involves only Hermitian matrices. The second adds complex functions and the overall computation becomes non-Hermitian. Aspects of the formalism and of computational strategy are compared with those of the complex absorption potential (CAP) method, which was recently applied for the calculation of field-induced complex energies in H and Li. Also compared are the numerical results of the two methods, and the questions of accuracy and convergence that were posed by Sahoo and Ho (Sahoo S and Ho Y K 2000 J. Phys. B: At. Mol. Opt. Phys. 33 2195) are explored further. We draw attention to the fact that, because in the region where the field strength is weak the tunnelling rate (imaginary part of the complex eigenvalue) diminishes exponentially, it is possible for even large-scale nonperturbative complex eigenvalue calculations either to fail completely or to produce seemingly stable results which, however, are wrong. It is in this context that the discrepancy in the width of Li 1s²2s ²S between results obtained by the CAP method and those obtained by the CESE method is interpreted. We suggest that the very-weak-field regime must be computed by the golden rule, provided the continuum is represented accurately. In this respect, existing one-particle semiclassical formulae seem…

  8. MatOFF: A Tool For Analyzing Behaviorally-Complex Neurophysiological Experiments

    PubMed Central

    Genovesio, Aldo; Mitz, Andrew R.

    2007-01-01

    The simple operant conditioning originally used in behavioral neurophysiology 30 years ago has given way to complex and sophisticated behavioral paradigms; so much so, that early general-purpose programs for analyzing neurophysiological data are ill-suited for complex experiments. The trend has been to develop custom software for each class of experiment, but custom software can have serious drawbacks. We describe here a general-purpose software tool for behavioral and electrophysiological studies, called MatOFF, that is especially suited for processing neurophysiological data gathered during the execution of complex behaviors. Written in the MATLAB programming language, MatOFF solves the problem of handling complex analysis requirements in a unique and powerful way. While other neurophysiological programs are either a loose collection of tools or append MATLAB as a post-processing step, MatOFF is an integrated environment that supports MATLAB scripting within the event search engine, safely isolated in a programming sandbox. The results from scripting are stored separately, but in parallel with the raw data, and are thus available to all subsequent MatOFF analysis and display processing. An example from a recently published experiment shows how all the features of MatOFF work together to analyze complex experiments and mine neurophysiological data in efficient ways. PMID:17604115

  9. System and method for modeling and analyzing complex scenarios

    DOEpatents

    Shevitz, Daniel Wolf

    2013-04-09

    An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of the nodes and tracks the state of at least one variable through a branch. When a variable of the branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.
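
    A stripped-down sketch of that idea, pruning a branch as soon as a tracked variable leaves tolerance, might look as follows (the node representation, tolerance interval, and tracked variable are assumptions, not the patent's implementation):

        def solve(node, state, var="x", tol=(0.0, 1.0)):
            """Depth-first evaluation of a possibility tree with branch pruning."""
            update, children = node                  # node = (update_fn, [children])
            state = update(dict(state))              # run the node's programming
            if not tol[0] <= state[var] <= tol[1]:
                return []                            # out of tolerance: invalid branch
            if not children:
                return [state]                       # valid leaf solution
            return [s for c in children for s in solve(c, state, var, tol)]

        tree = (lambda s: {**s, "x": s["x"] + 0.4},
                [(lambda s: {**s, "x": s["x"] + 0.4}, []),   # stays in tolerance
                 (lambda s: {**s, "x": s["x"] + 0.9}, [])])  # pruned: x = 1.3
        print(solve(tree, {"x": 0.0}))               # one valid solution, x = 0.8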

  10. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related…

  11. Network-Thinking: Graphs to Analyze Microbial Complexity and Evolution

    PubMed Central

    Corel, Eduardo; Lopez, Philippe; Méheust, Raphaël; Bapteste, Eric

    2016-01-01

    The tree model and tree-based methods have played a major, fruitful role in evolutionary studies. However, with the increasing realization of the quantitative and qualitative importance of reticulate evolutionary processes, affecting all levels of biological organization, complementary network-based models and methods are now flourishing, inviting evolutionary biology to experience a network-thinking era. We show how relative newcomers in this field of study, that is, sequence-similarity networks, genome networks, and gene families–genomes bipartite graphs, already allow for a significantly enhanced usage of molecular datasets in comparative studies. Analyses of these networks provide tools for tackling a multitude of complex phenomena, including the evolution of gene transfer, composite genes and genomes, evolutionary transitions, and holobionts. PMID:26774999

  12. Analyzing complex networks through correlations in centrality measurements

    NASA Astrophysics Data System (ADS)

    Furlan Ronqui, José Ricardo; Travieso, Gonzalo

    2015-05-01

    Many real world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example we show also that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network.
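
    The proposed centrality correlation profile is easy to reproduce with networkx; a sketch on a scale-free model graph (the model, its size, and this particular set of four centralities are illustrative choices, not the paper's exact setup):

        import itertools

        import networkx as nx
        import numpy as np

        G = nx.barabasi_albert_graph(200, 3, seed=42)
        cents = {
            "degree": nx.degree_centrality(G),
            "betweenness": nx.betweenness_centrality(G),
            "closeness": nx.closeness_centrality(G),
            "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
        }
        # the profile: one correlation coefficient per pair of centralities
        for a, b in itertools.combinations(cents, 2):
            va = [cents[a][v] for v in G]
            vb = [cents[b][v] for v in G]
            print(f"{a} vs {b}: r = {np.corrcoef(va, vb)[0, 1]:.2f}")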

  13. Analyzing complex gaze behavior in the natural world

    NASA Astrophysics Data System (ADS)

    Pelz, Jeff B.; Kinsman, Thomas B.; Evans, Karen M.

    2011-03-01

    The history of eye-movement research extends back at least to 1794, when Erasmus Darwin (Charles' grandfather) published Zoonomia, including descriptions of eye movements due to self-motion. But research on eye movements was restricted to the laboratory for 200 years, until Michael Land built the first wearable eyetracker at the University of Sussex and published the seminal paper "Where we look when we steer" [1]. In the intervening centuries, we learned a tremendous amount about the mechanics of the oculomotor system and how it responds to isolated stimuli, but virtually nothing about how we actually use our eyes to explore, gather information, navigate, and communicate in the real world. Inspired by Land's work, we have been working to extend knowledge in these areas by developing hardware, algorithms, and software that have allowed researchers to ask questions about how we actually use vision in the real world. Central to that effort are new methods for analyzing the volumes of data that come from the experiments made possible by the new systems. We describe a number of recent experiments and SemantiCode, a new program that supports assisted coding of eye-movement data collected in unrestricted environments.

  14. Analyzing Problems in Schools and School Systems: A Theoretical Approach. Topics in Educational Leadership.

    ERIC Educational Resources Information Center

    Gaynor, Alan Kibbe

    This book is directed toward students in organizational-theory and problem-analysis classes and their professors, as well as school administrators seeking to examine their problems and policies from new perspectives. It explains and illustrates methodology for describing, documenting, and analyzing organizational problems. Part I, "Methodology,"…

  15. Hybrid techniques for complex aerospace electromagnetics problems

    NASA Technical Reports Server (NTRS)

    Aberle, Jim

    1993-01-01

    Important aerospace electromagnetics problems include the evaluation of antenna performance on aircraft and the prediction and control of the aircraft's electromagnetic signature. Due to the ever increasing complexity and expense of aircraft design, aerospace engineers have become increasingly dependent on computer solutions. Traditionally, computational electromagnetics (CEM) has relied primarily on four disparate techniques: the method of moments (MoM), the finite-difference time-domain (FDTD) technique, the finite element method (FEM), and high frequency asymptotic techniques (HFAT) such as ray tracing. Each of these techniques has distinct advantages and disadvantages, and no single technique is capable of accurately solving all problems of interest on computers that are available now or will be available in the foreseeable future. As a result, new approaches that overcome the deficiencies of traditional techniques are beginning to attract a great deal of interest in the CEM community. Among these new approaches are hybrid methods which combine two or more of these techniques into a coherent model. During the ASEE Summer Faculty Fellowship Program a hybrid FEM/MoM computer code was developed and applied to a geometry containing features found on many modern aircraft.

  16. The Guarding Problem - Complexity and Approximation

    NASA Astrophysics Data System (ADS)

    Reddy, T. V. Thirumala; Krishna, D. Sai; Rangan, C. Pandu

    Let G = (V, E) be the given graph and G_R = (V_R, E_R) and G_C = (V_C, E_C) be subgraphs of G such that V_R ∩ V_C = ∅ and V_R ∪ V_C = V. G_C is referred to as the cops' region and G_R as the robber's region. Initially a robber is placed at some vertex of V_R and the cops are placed at some vertices of V_C. The robber and the cops may move from their current vertices to one of their neighbours; while a cop can move only within the cops' region, the robber may move to any neighbour. The robber and cops move alternately. A vertex v ∈ V_C is said to be attacked if the current turn is the robber's, the robber is at vertex u where u ∈ V_R, (u, v) ∈ E, and no cop is present at v. The guarding problem is to find the minimum number of cops required to guard the graph G_C from the robber's attack. We first prove that the decision version of this problem when G_R is an arbitrary undirected graph is PSPACE-hard. We also prove that the decision version of the guarding problem when G_R is a wheel graph is NP-hard. We then present approximation algorithms for the cases where G_R is a star graph, a clique, and a wheel graph, with approximation ratios H(n_1), 2H(n_1), and H(n_1) + 3/2 respectively, where H(n_1) = 1 + 1/2 + ... + 1/n_1 and n_1 = |V_R|.
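
    Since H(n_1) grows only logarithmically, the quoted guarantees stay modest even for large robber regions; a quick worked example (the region size is arbitrary):

        def harmonic(n):
            return sum(1.0 / k for k in range(1, n + 1))

        n1 = 100                          # |V_R|, an arbitrary example
        h = harmonic(n1)                  # ~ ln(100) + 0.577 ~ 5.19
        print(f"star:   {h:.2f}")         # ratio H(n1)
        print(f"clique: {2 * h:.2f}")     # ratio 2 H(n1)
        print(f"wheel:  {h + 1.5:.2f}")   # ratio H(n1) + 3/2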

  17. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    NASA Technical Reports Server (NTRS)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of NASTRAN Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  18. Decomposing a complex design problem using CLIPS

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1990-01-01

    Many engineering systems are large and multidisciplinary. Before the design of such complex systems can begin, much time and money are invested in determining the possible couplings among the participating subsystems and their parts. For designs based on existing concepts, like commercial aircraft design, the subsystems and their couplings are usually well-established. However, for designs based on novel concepts, like large space platforms, the determination of the subsystems, couplings, and participating disciplines is an important task. Moreover, this task must be repeated as new information becomes available or as the design specifications change. Determining the subsystems is not an easy, straightforward process and often important couplings are overlooked. The design manager must know how to divide the design work among the design teams so that changes in one subsystem will have predictable effects on other subsystems. The resulting subsystems must be ordered into a hierarchical structure before the planning documents and milestones of the design project are set. The success of a design project often depends on the wise choice of design variables, constraints, objective functions, and the partitioning of these among the design teams. Very few tools are available to aid the design manager in determining the hierarchical structure of a design problem and assist in making these decisions.

  19. Problems and solutions in analyzing partial-reflection drift data by correlation techniques

    NASA Technical Reports Server (NTRS)

    Meek, C. E.

    1984-01-01

    Problems and solutions in analyzing partial-reflection drift data by correlation techniques are discussed. The problem of analyzing spaced-antenna drift data breaks down into the general categories of raw data collection and storage, correlation calculation, interpretation of correlations, location of time lags for peak correlation, and velocity calculation.

  20. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science, and only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases, and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground-state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe…

  1. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    ERIC Educational Resources Information Center

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  2. Team-Based Complex Problem Solving: A Collective Cognition Perspective

    ERIC Educational Resources Information Center

    Hung, Woei

    2013-01-01

    Today, much problem solving is performed by teams, rather than individuals. The complexity of these problems has exceeded the cognitive capacity of any individual and requires a team of members to solve them. The success of solving these complex problems not only relies on individual team members who possess different but complementary expertise,…

  3. Managing Complex Problems in Rangeland Ecosystems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Management of rangelands, and natural resources in general, has become increasingly complex. There is an atmosphere of increasing expectations for conservation efforts associated with a variety of issues from water quality to endangered species. We argue that many current issues are complex by their...

  4. Complex partial status epilepticus: a recurrent problem.

    PubMed Central

    Cockerell, O C; Walker, M C; Sander, J W; Shorvon, S D

    1994-01-01

    Twenty patients with complex partial status epilepticus were identified retrospectively from a specialist neurology hospital. Seventeen patients experienced recurrent episodes of complex partial status epilepticus, often occurring at regular intervals, usually over many years, and while being treated with effective anti-epileptic drugs. No unifying cause for the recurrences, and no common epilepsy aetiologies, were identified. In spite of the frequency of recurrence and length of history, none of the patients showed any marked evidence of cognitive or neurological deterioration. Complex partial status epilepticus is more common than is generally recognised, should be differentiated from other forms of non-convulsive status, and is often difficult to treat. PMID:8021671

  5. Organizational Structure and Complex Problem Solving

    ERIC Educational Resources Information Center

    Becker, Selwyn W.; Baloff, Nicholas

    1969-01-01

    The problem-solving efficiency of different organization structures is discussed in relation to task requirements and the appropriate organizational behavior, to group adaptation to a task over time, and to various group characteristics. (LN)

  6. Solving the Inverse-Square Problem with Complex Variables

    ERIC Educational Resources Information Center

    Gauthier, N.

    2005-01-01

    The equation of motion for a mass that moves under the influence of a central, inverse-square force is formulated and solved as a problem in complex variables. To find the solution, the constancy of angular momentum is first established using complex variables. Next, the complex position coordinate and complex velocity of the particle are assumed…

  7. Building an information model (with the help of PSL/PSA). [Problem Statement Language/Problem Statement Analyzer

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Farny, A. M.

    1983-01-01

    Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications, which were once a one-step process in which product system information was immediately translated into PSL statements, have in light of experience been shown to result in inconsistent representations. These shortcomings have prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for the mutual understanding of customer terminology and the formal, conceptual representation of that product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.

  8. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…
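
    A minimal sketch of such a model in Python with scikit-learn (the log-file features and data below are invented; the study's actual feature set is not reproduced here):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # hypothetical per-student features mined from tutor log files:
        # [hints requested, errors made, median seconds between actions]
        X = np.array([[1, 2, 30], [7, 9, 140], [0, 1, 25], [5, 6, 90],
                      [2, 3, 45], [8, 11, 160], [1, 1, 35], [6, 8, 120]])
        y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = problem solved

        model = LogisticRegression().fit(X, y)
        p = model.predict_proba([[3, 4, 60]])[0, 1]
        print(f"predicted probability of success: {p:.2f}")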

  9. Analyzing the Responses of 7-8 Year Olds When Solving Partitioning Problems

    ERIC Educational Resources Information Center

    Badillo, Edelmira; Font, Vicenç; Edo, Mequè

    2015-01-01

    We analyze the mathematical solutions of 7- to 8-year-old pupils while individually solving an arithmetic problem. The analysis was based on the "configuration of objects," an instrument derived from the onto-semiotic approach to mathematical knowledge. Results are illustrated through a number of cases. From the analysis of mathematical…

  10. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    NASA Astrophysics Data System (ADS)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

    The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influences the risk of the failure of a single institution triggering a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposure) in the derivatives market between 202 financial intermediaries based in the USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital), but it is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.
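
    The default-cascade experiment described above can be caricatured with a simple round-by-round loss-propagation model (a Furfine-style sketch; the graph, exposures, and capital buffers are invented, not the paper's OTC data):

        import networkx as nx

        def cascade(G, seed_bank, capital):
            """Edge u -> v carries u's exposure to v; a bank fails once its
            accumulated losses from failed counterparties exceed its capital."""
            failed, frontier = {seed_bank}, [seed_bank]
            losses = {bank: 0.0 for bank in G}
            while frontier:
                nxt = []
                for bank in frontier:
                    for creditor in G.predecessors(bank):
                        if creditor in failed:
                            continue
                        losses[creditor] += G[creditor][bank]["exposure"]
                        if losses[creditor] > capital[creditor]:
                            failed.add(creditor)
                            nxt.append(creditor)
                frontier = nxt
            return failed

        G = nx.DiGraph()
        G.add_edge("A", "B", exposure=8.0)  # bank A is exposed to bank B
        G.add_edge("B", "C", exposure=5.0)
        G.add_edge("A", "C", exposure=2.0)
        capital = {"A": 6.0, "B": 4.0, "C": 1.0}
        print(cascade(G, "C", capital))     # C's default topples B, then A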

  11. Analyzing HIV/AIDS and Alcohol and Other Drug Use as a Social Problem

    PubMed Central

    PATTERSON, DAVID A.; Wolf (Adelv unegv Waya), Silver

    2012-01-01

    Most prevention and intervention activities directed toward HIV/AIDS and alcohol and other drug use separately, as well as toward the combination of the two (e.g., people who have HIV/AIDS and use alcohol and other drugs), come in the form of specific, individualized therapies without consideration of social influences that may have a greater impact on this population. Approaching this social problem from the narrowed view of individualized, micro solutions disregards the larger social conditions that affect, or perhaps even are at the root of, the problem. This paper analyzes the social problem of HIV/AIDS and alcohol and other drug abuse using three sociological perspectives—social construction theory, ethnomethodology, and conflict theory—informing the reader of the broader influences accompanying this problem. PMID:23264724

  12. Multigrid Methods for Aerodynamic Problems in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Caughey, David A.

    1995-01-01

    Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.

  13. Complex Mathematical Problem Solving by Individuals and Dyads.

    ERIC Educational Resources Information Center

    Vye, Nancy J.; Goldman, Susan R.; Voss, James F.; Hmelo, Cindy; Williams, Susan; Cognition and Technology Group at Vanderbilt University

    1997-01-01

    Describes two studies of mathematical problem solving using an episode from "The Adventures of Jasper Woodbury," a set of curriculum materials that afford complex problem-solving opportunities. Discussion focuses on characteristics of problems that make solutions difficult, kinds of reasoning that dyadic interactions support, and considerations of…

  14. Preparing for Complexity and Wicked Problems through Transformational Learning Approaches

    ERIC Educational Resources Information Center

    Yukawa, Joyce

    2015-01-01

    As the information environment becomes increasingly complex and challenging, Library and Information Studies (LIS) education is called upon to nurture innovative leaders capable of managing complex situations and "wicked problems." While disciplinary expertise remains essential, higher levels of mental complexity and adaptive…

  15. A New Approach to Analyzing the Cognitive Load in Physics Problems

    NASA Astrophysics Data System (ADS)

    Teodorescu, Raluca

    2010-02-01

    I will present a Taxonomy of Introductory Physics Problems (TIPP), which relates physics problems to the cognitive processes and the knowledge required to solve them. TIPP was created for designing and clarifying educational objectives, for developing assessments to evaluate components of the problem-solving process, and for guiding curriculum design in introductory physics courses. To construct TIPP, I considered processes that have been identified either by cognitive science and expert-novice research or by direct observation of students' behavior while solving physics problems. Based on Marzano and Kendall's taxonomy [1], I developed a procedure to classify physics problems according to the cognitive processes that they involve and the knowledge to which they refer. The procedure is applicable to any physics problem, and its validity and reliability have been confirmed. This algorithm was then used to build TIPP, which is a database that contains text-based and research-based physics problems and explains their relationship to cognitive processes and knowledge. TIPP has been used in the years 2006-2009 to reform the first semester of the introductory algebra-based physics course at The George Washington University. The reform targeted students' cognitive development and attitudes improvement. The methodology employed in the course involves exposing students to certain types of problems in a variety of contexts with increasing complexity. To assess the effectiveness of our approach, rubrics were created to evaluate students' problem-solving abilities, and the Colorado Learning Attitudes about Science Survey (CLASS) was administered pre- and post-instruction to determine students' shift in dispositions towards learning physics. Our results show definitive gains in the areas targeted by our curricular reform. [1] R.J. Marzano and J.S. Kendall, The New Taxonomy of Educational Objectives, 2nd ed. (Corwin Press, Thousand Oaks, 2007).

  16. The ESTER particle and plasma analyzer complex for the phobos mission

    NASA Astrophysics Data System (ADS)

    Afonin, V. V.; McKenna-Lawlor, S.; Kiraly, P.; Marsden, R.; Richter, A.; Rusznyak, P.; Shutte, N. M.; Szabo, L.; Szalai, S.; Szucs, I. T.; Varhalmi, L.; Witte, M.

    1990-05-01

    The ESTER particle and plasma analyzer system for the Phobos Mission comprised a complex of three instruments (LET, SLED and HARP) serviced by a common Data Processing Unit. An account is provided of this complex, its objectives and excellent performance in space.

  17. Minimum structural controllability problems of complex networks

    NASA Astrophysics Data System (ADS)

    Yin, Hongli; Zhang, Siying

    2016-02-01

    Controllability of complex networks has been one of the attractive research areas for both the network and control communities, and has yielded many promising and significant results on minimum inputs and minimum driver vertices. However, few studies have been devoted to the minimum controlled vertex set through which control over a network with arbitrary structure can be achieved. In this paper, we prove that the minimum driver vertices driven by different inputs are not sufficient to ensure full control of the network when the associated graph contains an inaccessible strongly connected component that has a perfect matching, and we propose an algorithm to identify a minimum controlled vertex set for networks with arbitrary structure using convenient graph-theoretic and mathematical tools. The simulation results show that the controllability of a network is correlated with the number of inaccessible strongly connected components that have perfect matchings, and these results help us better understand the relationship between a network's structural characteristics and its control.
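
    For orientation, the minimum driver vertex set that this abstract contrasts with its minimum controlled vertex set is classically computed via maximum matching (the structural controllability result of Liu et al.). A minimal Python sketch using networkx follows; the paper's own algorithm for controlled vertex sets is not reproduced here, and all names are mine.

        import networkx as nx

        def minimum_driver_nodes(g: nx.DiGraph) -> set:
            # Split every node v into v_out / v_in; each directed edge (u, v)
            # becomes a bipartite edge (u_out, v_in).
            b = nx.Graph()
            tops = [("out", v) for v in g]
            b.add_nodes_from(tops, bipartite=0)
            b.add_nodes_from((("in", v) for v in g), bipartite=1)
            b.add_edges_from((("out", u), ("in", v)) for u, v in g.edges())
            matching = nx.bipartite.hopcroft_karp_matching(b, top_nodes=tops)
            matched_in = {v for side, v in matching if side == "in"}
            # Nodes whose "in" copy is unmatched must receive an external input.
            drivers = set(g) - matched_in
            return drivers if drivers else {next(iter(g))}

        print(minimum_driver_nodes(nx.DiGraph([(1, 2), (2, 3), (1, 3)])))  # {1}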

  18. Completed Beltrami-Michell formulation for analyzing mixed boundary value problems in elasticity

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Kaljevic, Igor; Hopkins, Dale A.; Saigal, Sunil

    1995-01-01

    In elasticity, the method of forces, wherein stress parameters are considered as the primary unknowns, is known as the Beltrami-Michell formulation (BMF). The existing BMF can only solve stress boundary value problems; it cannot handle the more prevalent displacement and mixed boundary value problems of elasticity. Therefore, this formulation, which has restricted application, could not become a true alternative to Navier's displacement method, which can solve all three types of boundary value problems. The restrictions in the BMF have been alleviated by augmenting the classical formulation with a novel set of conditions identified as the boundary compatibility conditions. This new method, which completes the classical force formulation, has been termed the completed Beltrami-Michell formulation (CBMF). The CBMF can solve general elasticity problems with stress, displacement, and mixed boundary conditions in terms of stresses as the primary unknowns. The CBMF is derived from the stationary condition of the variational functional of the integrated force method. In the CBMF, stresses for kinematically stable structures can be obtained without any reference to the displacements either in the field or on the boundary. This paper presents the CBMF and its derivation from the variational functional of the integrated force method. Several examples are presented to demonstrate the applicability of the completed formulation for analyzing mixed boundary value problems under thermomechanical loads. Selected example problems include a cylindrical shell wherein membrane and bending responses are coupled, and a composite circular plate.

  19. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 1. [thermal analyzer manual

    NASA Technical Reports Server (NTRS)

    Lee, H. P.

    1977-01-01

    The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.

  20. The complex problem of sensitive skin.

    PubMed

    Marriott, Marie; Holmes, Jo; Peters, Lisa; Cooper, Karen; Rowson, Matthew; Basketter, David A

    2005-08-01

    There exist within the population subsets of individuals who display heightened skin reactivity to materials the majority find tolerable. In a series of investigations, we have examined interrelationships between many of the endpoints associated with the term 'sensitive skin'. In the most recent work, 58 volunteers were treated with 10% lactic acid, 50% ethanol, 0.5% menthol and 1.0% capsaicin on the nasolabial fold, unoccluded, with sensory reactions recorded at 2.5 min, 5 min and 8 min after application. Urticant susceptibility was evaluated with 1 M benzoic acid and 125 mM trans-cinnamic acid applied to the volar forearm for 20 min. A 2 x 23-h patch test was also conducted using 0.1% and 0.3% sodium dodecyl sulfate, 0.3% and 0.6% cocamidopropyl betaine and 0.1% and 0.2% benzalkonium chloride to determine irritant susceptibility. As found in previous studies, increased susceptibility to one endpoint was not predictive of sensitivity to another. In our experience, nasolabial stinging was a poor predictor of general skin sensitivity. Nevertheless, it may be possible to identify in the normal population individuals who, coincidentally, are more generally sensitive to a range of non-immunologic adverse skin reactions. Whether such individuals are those who experience problems with skin care products remains to be addressed. PMID:16033403

  1. Semantic Annotation of Complex Text Structures in Problem Reports

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Throop, David R.; Fleming, Land D.

    2011-01-01

    Text analysis is important for effective information retrieval from databases where the critical information is embedded in text fields. Aerospace safety depends on effective retrieval of relevant and related problem reports for the purpose of trend analysis. The complex text syntax in problem descriptions has limited statistical text mining of problem reports. The presentation describes an intelligent tagging approach that applies syntactic and then semantic analysis to overcome this problem. The tags identify types of problems and equipment that are embedded in the text descriptions. The power of these tags is illustrated in a faceted searching and browsing interface for problem report trending that combines automatically generated tags with database code fields and temporal information.

  2. On the Complexity of Rearrangement Problems under the Breakpoint Distance

    PubMed Central

    2014-01-01

    Abstract We study the complexity of rearrangement problems in the generalized breakpoint model of Tannier et al. and settle several open questions. We improve the algorithm for the median problem and show that it is equivalent to the problem of finding maximum cardinality nonbipartite matching (under linear reduction). On the other hand, we prove that the more general small phylogeny problem is NP-hard. Surprisingly, we show that it is already NP-hard (or even APX-hard) for a quartet phylogeny. We also show that in the unichromosomal and the multilinear breakpoint model the halving problem is NP-hard, refuting the conjecture of Tannier et al. Interestingly, this is the first problem that is harder in the breakpoint model than in the double cut and join or reversal models. PMID:24200391

  3. Particle swarm optimization for complex nonlinear optimization problems

    NASA Astrophysics Data System (ADS)

    Alexandridis, Alex; Famelis, Ioannis Th.; Tsitouras, Charalambos

    2016-06-01

    This work presents the application of a technique belonging to evolutionary computation, namely particle swarm optimization (PSO), to complex nonlinear optimization problems. To be more specific, a PSO optimizer is set up and applied to the derivation of Runge-Kutta pairs for the numerical solution of initial value problems. The effect of critical PSO operational parameters on the performance of the proposed scheme is thoroughly investigated.
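
    As a concrete reference for the technique itself, a minimal global-best PSO in Python follows (a generic sketch of the standard algorithm, with hyperparameters of my choosing; the paper's tuning for deriving Runge-Kutta pairs is not reproduced):

        import numpy as np

        def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
            # Minimize f over a box; bounds is a sequence of (low, high) pairs.
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds, dtype=float).T
            x = rng.uniform(lo, hi, (n_particles, len(lo)))   # positions
            v = np.zeros_like(x)                              # velocities
            pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
            g = pbest[pbest_val.argmin()]                     # global best
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, len(lo)))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                val = np.apply_along_axis(f, 1, x)
                better = val < pbest_val
                pbest[better], pbest_val[better] = x[better], val[better]
                g = pbest[pbest_val.argmin()]
            return g, pbest_val.min()

        # e.g. minimize the 2-D Rosenbrock function
        best_x, best_val = pso(lambda p: (1 - p[0])**2 + 100 * (p[1] - p[0]**2)**2,
                               [(-2.0, 2.0), (-2.0, 2.0)])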

  4. Theory of periodically specified problems: Complexity and approximability

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Rosenkrantz, D.J.

    1997-12-05

    We study the complexity and the efficient approximability of graph and satisfiability problems when specified using various kinds of periodic specifications. The general results obtained include the following: (1) We characterize the complexities of several basic generalized CNF satisfiability problems SAT(S) [Sc78] when instances are specified using various kinds of 1- and 2-dimensional periodic specifications. We outline how this characterization can be used to prove a number of new hardness results for the complexity classes DSPACE(n), NSPACE(n), DEXPTIME, NEXPTIME, EXPSPACE, etc. These results can be used to prove in a unified way the hardness of a number of combinatorial problems when instances are specified succinctly using the various succinct specifications considered in the literature. As one corollary, we show that a number of basic NP-hard problems become EXPSPACE-hard when inputs are represented using 1-dimensional infinite periodic wide specifications. This answers a long-standing open question posed by Orlin. (2) We outline a simple yet general technique to devise approximation algorithms with provable worst-case performance guarantees for a number of combinatorial problems specified periodically. Our efficient approximation algorithms and schemes are based on extensions of these ideas and represent the first non-trivial characterization of a class of problems having an ε-approximation (or PTAS) for periodically specified NEXPTIME-hard problems. Two properties of our results are: (i) for the first time, efficient approximation algorithms and schemes have been developed for natural NEXPTIME-complete problems; (ii) our results are the first polynomial-time approximation algorithms with good performance guarantees for hard problems specified using the various kinds of periodic specifications considered in this paper.

  5. Application of NASA management approach to solve complex problems on earth

    NASA Technical Reports Server (NTRS)

    Potate, J. S.

    1972-01-01

    The application of NASA management approach to solving complex problems on earth is discussed. The management of the Apollo program is presented as an example of effective management techniques. Four key elements of effective management are analyzed. Photographs of the Cape Kennedy launch sites and supporting equipment are included to support the discussions.

  6. What Do Employers Pay for Employees' Complex Problem Solving Skills?

    ERIC Educational Resources Information Center

    Ederer, Peer; Nedelkoska, Ljubica; Patt, Alexander; Castellazzi, Silvia

    2015-01-01

    We estimate the market value that employers assign to the complex problem solving (CPS) skills of their employees, using individual-level Mincer-style wage regressions. For the purpose of the study, we collected new and unique data using psychometric measures of CPS and an extensive background questionnaire on employees' personal and work history.…
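
    For readers unfamiliar with the setup, a Mincer-style wage regression augmented with a CPS measure typically takes a form like the following (a sketch in standard notation; the study's exact specification, controls, and variable definitions may differ):

        \ln w_i = \beta_0 + \beta_1 S_i + \beta_2 X_i + \beta_3 X_i^2 + \gamma\,\mathrm{CPS}_i + \varepsilon_i

    Here w_i is the wage of employee i, S_i years of schooling, X_i labor-market experience, and \gamma the estimated wage premium per unit of measured complex problem solving skill.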

  7. Investigating the Effect of Complexity Factors in Gas Law Problems

    ERIC Educational Resources Information Center

    Schuttlefield, Jennifer D.; Kirk, John; Pienta, Norbert J.; Tang, Hui

    2012-01-01

    Undergraduate students were asked to complete gas law questions using a Web-based tool as a first step in our understanding of the role of cognitive load in chemistry word questions and in helping us assess student problem-solving. Each question contained five different complexity factors, which were randomly assigned by the tool so that a…

  8. Emergent Science: Solving complex science problems via collaborations

    NASA Astrophysics Data System (ADS)

    Li, X.; Ramachandran, R.; Wilson, B. D.; Lynnes, C.; Conover, H.

    2009-12-01

    The recent advances in Cyberinfrastructure have democratized the use of computational and data resources. These resources, together with new social networking and collaboration technologies, present an unprecedented opportunity to impact the science process. These advances can move the science process from “circumspect science” -- where scientists publish only when the project is complete, publish only the final results, seldom publish things that did not work, and communicate results with each other using paper technology -- to “open science” -- where scientists can share and publish every element in their research, from the data used as input, to the workflows used to analyze these data sets, to possibly failed experiments, and the final results. Open science can foster novel ways of social collaboration in science. We are already seeing the impact of social collaboration in our daily lives. A simple example is the use of reviews posted online by other consumers when evaluating whether to buy a product. This phenomenon has been well documented and is referred to by many names, such as Smart Mobs, Wisdom of Crowds, Wikinomics, Crowdsourcing, We-Think and swarm collaboration. Similar social collaborations during the science process can lead to “emergent science”. We define "emergent science" as the way complex science problems can be solved and new research directions forged out of a multiplicity of relatively simple collaborative interactions. There are, however, barriers that prevent social collaboration within the science process. Some of these barriers are technical, such as the lack of science collaboration platforms; others are social. The success of any collaborative platform has to take into account the incentives or motivations for scientists to participate. This presentation will address obstacles facing emergent science and will suggest possible solutions required to build a critical mass.

  9. The Complex Route to Success: Complex Problem-Solving Skills in the Prediction of University Success

    ERIC Educational Resources Information Center

    Stadler, Matthias J.; Becker, Nicolas; Greiff, Samuel; Spinath, Frank M.

    2016-01-01

    Successful completion of a university degree is a complex matter. Based on considerations regarding the demands of acquiring a university degree, the aim of this paper was to investigate the utility of complex problem-solving (CPS) skills in the prediction of objective and subjective university success (SUS). The key finding of this study was that…

  10. Complexity and efficient approximability of two dimensional periodically specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.

    1996-09-01

    The authors consider two dimensional periodic specifications: a method to specify succinctly objects with highly regular repetitive structure. These specifications arise naturally when processing engineering designs, including VLSI designs. These specifications can specify objects whose sizes are exponentially larger than the sizes of the specifications themselves. Consequently, solving a periodically specified problem by explicitly expanding the instance is prohibitively expensive in terms of computational resources. This leads one to investigate the complexity and efficient approximability of solving graph-theoretic and combinatorial problems when instances are specified using two dimensional periodic specifications. They prove the following results: (1) several classical NP-hard optimization problems become NEXPTIME-hard when instances are specified using two dimensional periodic specifications; (2) in contrast, several of these NEXPTIME-hard problems have polynomial-time approximation algorithms with guaranteed worst case performance.

  11. A Generalized Topological Entropy for Analyzing the Complexity of DNA Sequences

    PubMed Central

    Jiang, Qinghua; Xu, Li; Peng, Jiajie; Wang, Yong; Wang, Yadong

    2014-01-01

    Topological entropy is one of the most difficult entropies to apply to DNA sequences, due to the finite-sample and high-dimensionality problems. In order to overcome these problems, a generalized topological entropy is introduced. The relationship between the topological entropy and the generalized topological entropy is examined, which shows that the topological entropy is a special case of the generalized entropy. As an application, the generalized topological entropy was computed for introns, exons and promoter regions, respectively. The results indicate that the entropy of introns is higher than that of exons, and the entropy of exons is higher than that of the promoter regions for each chromosome, which suggests that the DNA sequence of the promoter regions is more regular than that of the exons and introns. PMID:24533097
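
    For concreteness, the plain (non-generalized) topological entropy of a finite sequence, in the finite-sample formulation this work builds on, can be sketched in Python as follows. This follows Koslicki's finite-sequence definition; the paper's generalized entropy modifies it, and the function name is mine.

        from math import log

        def topological_entropy(seq, alphabet=4):
            # Pick the largest k with alphabet**k + k - 1 <= len(seq), then count
            # the distinct k-mers in the prefix of exactly that length.
            n = len(seq)
            k = 1
            while alphabet ** (k + 1) + k <= n:
                k += 1
            prefix = seq[: alphabet ** k + k - 1]
            kmers = {prefix[i : i + k] for i in range(len(prefix) - k + 1)}
            return log(len(kmers), alphabet) / k  # normalized to [0, 1]

        print(topological_entropy("ACGTACGTGGCCAATT" * 8))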

  12. Analyzing networks of phenotypes in complex diseases: methodology and applications in COPD

    PubMed Central

    2014-01-01

    Background The investigation of complex disease heterogeneity has been challenging. Here, we introduce a network-based approach, using partial correlations, that analyzes the relationships among multiple disease-related phenotypes. Results We applied this method to two large, well-characterized studies of chronic obstructive pulmonary disease (COPD). We also examined the associations between these COPD phenotypic networks and other factors, including case-control status, disease severity, and genetic variants. Using these phenotypic networks, we have detected novel relationships between phenotypes that would not have been observed using traditional epidemiological approaches. Conclusion Phenotypic network analysis of complex diseases could provide novel insights into disease susceptibility, disease severity, and genetic mechanisms. PMID:24964944
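
    A bare-bones version of the core computation, estimating pairwise partial correlations from the precision matrix and thresholding them into a phenotype network, might look like the following Python sketch (illustrative only; the authors' pipeline also handles covariates, significance testing, and genetic associations):

        import numpy as np

        def partial_correlation_network(X, threshold=0.1):
            # X: samples x phenotypes. Partial correlations come from the
            # precision (inverse covariance) matrix P via
            # rho_ij = -P_ij / sqrt(P_ii * P_jj).
            P = np.linalg.pinv(np.cov(X, rowvar=False))
            d = np.sqrt(np.diag(P))
            rho = -P / np.outer(d, d)
            np.fill_diagonal(rho, 1.0)
            # Keep an edge wherever the partial correlation is strong enough.
            adjacency = (np.abs(rho) > threshold) & ~np.eye(len(rho), dtype=bool)
            return rho, adjacency

        rng = np.random.default_rng(0)
        rho, A = partial_correlation_network(rng.normal(size=(200, 5)))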

  13. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The report identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.

  14. Analyzing complex patients' temporal histories: new frontiers in temporal data mining.

    PubMed

    Sacchi, Lucia; Dagliati, Arianna; Bellazzi, Riccardo

    2015-01-01

    In recent years, data coming from hospital information systems (HIS) and local healthcare organizations have started to be intensively used for research purposes. This rising amount of available data allows reconstructing the complete histories of patients, which have a strong temporal component. This chapter introduces the major challenges faced by temporal data mining researchers in an era when huge quantities of complex clinical temporal data are becoming available. The analysis is focused on the peculiar features of this kind of data and describes the methodological and technological aspects that allow managing such a complex framework. The chapter shows how heterogeneous data can be processed to derive a homogeneous representation. Starting from this representation, it illustrates different techniques for jointly analyzing such data. Finally, the technological strategies that allow creating a common data warehouse to gather data coming from different sources and in different formats are presented. PMID:25417081

  15. Analyzing the causation of a railway accident based on a complex network

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accidents based on complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from an overall perspective. As a case, the “7.23” China Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, can help reduce the occurrence of railway accidents.
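
    To make the approach concrete, here is a toy version in Python: causal factors become nodes, influence links become directed edges, and a statistical indicator such as betweenness centrality flags candidate key causations. The factors and links below are invented for illustration and are not taken from the accident report.

        import networkx as nx

        G = nx.DiGraph([
            ("lightning strike", "signal failure"),
            ("inadequate signal inspection", "signal failure"),
            ("signal failure", "dispatching error"),
            ("poor line-condition checking", "dispatching error"),
            ("dispatching error", "rear-end collision"),
        ])
        # Rank factors by betweenness centrality; high values suggest factors
        # that mediate many causal paths.
        for factor, score in sorted(nx.betweenness_centrality(G).items(),
                                    key=lambda kv: kv[1], reverse=True):
            print(f"{factor}: {score:.3f}")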

  16. Analyzing Pre-Service Primary Teachers' Fraction Knowledge Structures through Problem Posing

    ERIC Educational Resources Information Center

    Kilic, Cigdem

    2015-01-01

    In this study it was aimed to determine pre-service primary teachers' knowledge structures of fraction through problem posing activities. A total of 90 pre-service primary teachers participated in this study. A problem posing test consisting of two questions was used and the participants were asked to generate as many as problems based on the…

  17. Sorting of Streptomyces Cell Pellets Using a Complex Object Parametric Analyzer and Sorter

    PubMed Central

    Petrus, Marloes L. C.; van Veluw, G. Jerre; Wösten, Han A. B.; Claessen, Dennis

    2014-01-01

    Streptomycetes are filamentous soil bacteria that are used in industry for the production of enzymes and antibiotics. When grown in bioreactors, these organisms form networks of interconnected hyphae, known as pellets, which are heterogeneous in size. Here we describe a method to analyze and sort mycelial pellets using a Complex Object Parametric Analyzer and Sorter (COPAS). Detailed instructions are given for the use of the instrument and the basic statistical analysis of the data. We furthermore describe how pellets can be sorted according to user-defined settings, which enables downstream processing such as the analysis of the RNA or protein content. Using this methodology the mechanism underlying heterogeneous growth can be tackled. This will be instrumental for improving streptomycetes as a cell factory, considering the fact that productivity correlates with pellet size. PMID:24561666

  18. Data Mining and Complex Problems: Case Study in Composite Materials

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  19. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    SciTech Connect

    Hunt, H. B.; Stearns, R. L.; Marathe, M. V.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of, and the existence of efficient approximation algorithms for, a number of variants of the problems SAT(S) and SAT_c(S), for many different D, C, and S. These problem variants include decision and optimization problems, for formulas, quantified formulas, and stochastically quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S) and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on the following two basic concepts: (i) strongly local replacements/reductions and (ii) relational/algebraic representability. Let k ≥ 2, and let S be a finite set of finite-arity relations on Σ_k with the following condition on S: all finite-arity relations on Σ_k can be represented as finite existentially quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT_c(S) are both NQL-complete and ≤_{log n}^{bw}-complete for NP. (2) The problems Q-SAT(S) and Q-SAT_c(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT_c(S) are PSPACE-complete. (3) There exists ε > 0 for which approximating the problem MAX-Q-SAT(S) to within ε times optimum is PSPACE-hard. Letting k = 2, there exists ε > 0 for which approximating the problem MAX-S-SAT(S) to within ε times optimum is PSPACE-hard. (4

  20. Increased complexity in carcinomas: Analyzing and modeling the interaction of human cancer cells with their microenvironment.

    PubMed

    Stadler, Mira; Walter, Stefanie; Walzl, Angelika; Kramer, Nina; Unger, Christine; Scherzer, Martin; Unterleuthner, Daniela; Hengstschläger, Markus; Krupitza, Georg; Dolznig, Helmut

    2015-12-01

    Solid cancers are not simple accumulations of malignant tumor cells but rather represent complex organ-like structures. Despite a more chaotic general appearance compared to the highly organized setup of healthy tissues, cancers still show highly differentiated structures and a close interaction with, and dependency on, the interwoven connective tissue. This complexity within cancers is not yet known in detail at the molecular level. The first part of this article briefly describes the technology and strategies to quantify and dissect the heterogeneity in human solid cancers. Moreover, there is urgent need to better understand human cancer biology, since the development of novel anti-cancer drugs is far from efficient, predominantly due to the scarcity of predictive preclinical models. Hence, in vivo and in vitro models were developed that better recapitulate the complexity of human cancers through their intrinsic three-dimensional nature and cellular heterogeneity, and that allow functional intervention for hypothesis testing. Therefore, in the second part, 3D in vitro cancer models are presented that analyze and depict the heterogeneity in human cancers. Advantages and drawbacks of each model are highlighted and their suitability for preclinical drug testing is discussed. PMID:26320002

  1. Analyzing complex functional brain networks: Fusing statistics and network science to understand the brain

    PubMed Central

    Simpson, Sean L.; Bowman, F. DuBois; Laurienti, Paul J.

    2014-01-01

    Complex functional brain network analyses have exploded over the last decade, gaining traction due to their profound clinical implications. The application of network science (an interdisciplinary offshoot of graph theory) has facilitated these analyses and enabled examining the brain as an integrated system that produces complex behaviors. While the field of statistics has been integral in advancing activation analyses and some connectivity analyses in functional neuroimaging research, it has yet to play a commensurate role in complex network analyses. Fusing novel statistical methods with network-based functional neuroimage analysis will engender powerful analytical tools that will aid in our understanding of normal brain function as well as alterations due to various brain disorders. Here we survey widely used statistical and network science tools for analyzing fMRI network data and discuss the challenges faced in filling some of the remaining methodological gaps. When applied and interpreted correctly, the fusion of network scientific and statistical methods has a chance to revolutionize the understanding of brain function. PMID:25309643

  2. On the complexity of some quadratic Euclidean 2-clustering problems

    NASA Astrophysics Data System (ADS)

    Kel'manov, A. V.; Pyatkin, A. V.

    2016-03-01

    Some problems of partitioning a finite set of points of Euclidean space into two clusters are considered. In these problems, the following criteria are minimized: (1) the sum over both clusters of the sums of squared pairwise distances between the elements of the cluster and (2) the sum of the sums of squared distances from the elements of each cluster to its geometric center, each multiplied by the cardinality of that cluster, where the geometric center (or centroid) of a cluster is defined as the mean value of the elements in that cluster. Additionally, another problem close to (2) is considered, where the desired center of one of the clusters is given as input, while the center of the other cluster is unknown (it is the variable to be optimized), as in problem (2). Two variants of the problems are analyzed, in which the cardinalities of the clusters are (a) part of the input or (b) optimization variables. It is proved that all the considered problems are strongly NP-hard and that, in general, there is no fully polynomial-time approximation scheme for them (unless P = NP).
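
    In symbols (notation mine), the two criteria over a 2-partition {C_1, C_2} of the point set read:

        F_1 = \sum_{j=1}^{2} \sum_{x, y \in C_j} \|x - y\|^2, \qquad
        F_2 = \sum_{j=1}^{2} |C_j| \sum_{x \in C_j} \|x - \bar{c}(C_j)\|^2, \qquad
        \bar{c}(C_j) = \frac{1}{|C_j|} \sum_{x \in C_j} x.

    The classical identity (for sums over ordered pairs) \sum_{x,y \in C} \|x - y\|^2 = 2|C| \sum_{x \in C} \|x - \bar{c}(C)\|^2 shows why the two criteria are closely related: they coincide up to a factor of two when no center is fixed in advance.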

  3. COMPLEXITY&APPROXIMABILITY OF QUANTIFIED&STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    Hunt, H. B.; Marathe, M. V.; Stearns, R. E.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT_c(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend the earlier results in [Pa85, LMP99, CF+93, CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97

  4. Analyzing Student Modeling Cycles in the Context of a "Real World" Problem

    ERIC Educational Resources Information Center

    Schorr, Roberta Y.; Amit, Miriam

    2005-01-01

    Many students do not apply their real world intuitions and sense-making abilities when solving mathematics problems in school. In an effort to better understand how to help students draw upon these valued resources, we investigate the manner in which the solution to a particular problem activity is repeatedly re-interpreted by a student. This is…

  5. TOPAZ - the transient one-dimensional pipe flow analyzer: code validation and sample problems

    SciTech Connect

    Winters, W.S.

    1985-10-01

    TOPAZ is a "user friendly" computer code for modeling the one-dimensional, transient physics of multi-species gas transfer in arbitrary arrangements of pipes, valves, vessels, and flow branches. This document presents a series of sample problems designed to aid potential users in creating TOPAZ input files. To the extent possible, sample problems were selected for which analytical solutions currently exist. TOPAZ comparisons with such solutions are intended to provide a measure of code validation.

  6. Complexity and Approximation of a Geometric Local Robot Assignment Problem

    NASA Astrophysics Data System (ADS)

    Bonorden, Olaf; Degener, Bastian; Kempkes, Barbara; Pietrzyk, Peter

    We introduce a geometric multi-robot assignment problem. Robots positioned in a Euclidean space have to be assigned to treasures in such a way that their joint strength is sufficient to unearth a treasure with a given weight. The robots have a limited range and thus can only be assigned to treasures in their proximity. The objective is to unearth as many treasures as possible. We investigate the complexity of several variants of this problem and show whether they are in P or are NP-complete. Furthermore, we provide a distributed and local constant-factor approximation algorithm using constant-factor resource augmentation for the two-dimensional setting with O(log* n) communication rounds.

  7. Coordinating complex problem-solving among distributed intelligent agents

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1992-01-01

    A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high-level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.
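
    The following Python fragment sketches the flavor of script-driven coordination described here: a manager object walks a high-level script and routes each task to a registered server application. The names and structure are hypothetical illustrations, not the actual SOCIAL API.

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class ManagerAgent:
            # capability name -> server application (here, plain callables)
            servers: Dict[str, Callable[[dict], dict]]

            def run_script(self, script: List[dict]) -> dict:
                state: dict = {}
                for task in script:
                    handler = self.servers[task["capability"]]  # route the task
                    state.update(handler({**task.get("args", {}), **state}))
                return state

        manager = ManagerAgent(servers={
            "diagnose": lambda args: {"fault": "valve-3"},
            "plan": lambda args: {"action": f"replace {args['fault']}"},
        })
        print(manager.run_script([{"capability": "diagnose"}, {"capability": "plan"}]))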

  8. Analyzing Energy and Resource Problems: An Interdisciplinary Approach to Mathematical Modeling.

    ERIC Educational Resources Information Center

    Fishman, Joseph

    1993-01-01

    Suggests ways in which mathematical models can be presented and developed in the classroom to promote discussion, analysis, and understanding of issues related to energy consumption. Five problems deal with past trends and future projections of availability of a nonrenewable resource, natural gas. (Contains 13 references.) (MDH)

  9. Case Studies in Critical Ecoliteracy: A Curriculum for Analyzing the Social Foundations of Environmental Problems

    ERIC Educational Resources Information Center

    Turner, Rita; Donnelly, Ryan

    2013-01-01

    This article outlines the features and application of a set of model curriculum materials that utilize eco-democratic principles and humanities-based content to cultivate critical analysis of the cultural foundations of socio-environmental problems. We first describe the goals and components of the materials, then discuss results of their use in…

  10. INVESTIGATION OF ANALYZER PROBLEMS IN THE MEASUREMENT OF NOX FROM METHANOL VEHICLES

    EPA Science Inventory

    The study investigated the extent and source of irregularities related to the measurement of NOx emissions from methanol cars. Corrective measures also were explored. It was observed that NOx chemiluminescent analyzers respond to methanol and formaldehyde after being exposed to h...

  11. Analyzing and Attempting to Overcome Prospective Teachers' Difficulties during Problem-Solving Instruction

    ERIC Educational Resources Information Center

    Karp, Alexander

    2010-01-01

    This article analyzes the experiences of prospective secondary mathematics teachers during a teaching methods course, offered prior to their student teaching, but involving actual teaching and reflexive analysis of this teaching. The study focuses on the pedagogical difficulties that arose during their teaching, in which prospective teachers…

  12. The problem of motivating teaching staff in a complex amalgamation.

    PubMed

    Kenrick, M A

    1993-09-01

    This paper addresses some of the problems brought about by the merger of a number of schools of nursing into a new complex amalgamation. A very real concern in the new colleges of nursing and midwifery in the United Kingdom is the effect of amalgamation on management systems and staff morale. The main focus of this paper is the motivation of staff during this time of change. There is currently a lack of security amongst staff and in many instances the personal job satisfaction of nurse teachers and managers of nurse education has been reduced, which has made the task of motivating staff difficult. Hence, two major theories of motivation and the implications of these theories for managers of nurse education are discussed. The criteria used for the selection of managers within the new colleges, leadership styles and organizational structures are reviewed. The amalgamations have brought about affiliation with higher-education institutions. Some problems associated with these mergers and the effects on the motivation of staff both within the higher-education institutions and the nursing colleges are outlined. Strategies for overcoming some of the problems are proposed including job enlargement, job enrichment, potential achievement rewards and the use of individual performance reviews which may be useful for assessing the ability of all staff, including managers, in the new amalgamations. PMID:8258610

  13. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human-factors studies and simulation-based techniques will fall short in the face of the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident, since it exemplifies authority problems that arise when receiving conflicting advice from human and automated systems.

  14. Leveraging Cultural Resources through Teacher Pedagogical Reasoning: Elementary Grade Teachers Analyze Second Language Learners' Science Problem Solving

    ERIC Educational Resources Information Center

    Buxton, Cory A.; Salinas, Alejandra; Mahotiere, Margarette; Lee, Okhee; Secada, Walter G.

    2013-01-01

    Grounded in teacher professional development addressing the intersection of student diversity and content area instruction, this study examined school teachers' pedagogical reasoning complexity as they reflected on their second language learners' science problem solving abilities using both home and school contexts. Teachers responded to interview…

  15. Quantum trajectories in complex space: one-dimensional stationary scattering problems.

    PubMed

    Chou, Chia-Chun; Wyatt, Robert E

    2008-04-21

    One-dimensional time-independent scattering problems are investigated in the framework of the quantum Hamilton-Jacobi formalism. The equation for the local approximate quantum trajectories near the stagnation point of the quantum momentum function is derived, and the first derivative of the quantum momentum function is related to the local structure of quantum trajectories. Exact complex quantum trajectories are determined for two examples by numerically integrating the equations of motion. For the soft potential step, some particles penetrate into the nonclassical region, and then turn back to the reflection region. For the barrier scattering problem, quantum trajectories may spiral into the attractors or from the repellers in the barrier region. Although the classical potentials extended to complex space show different pole structures for each problem, the quantum potentials present the same second-order pole structure in the reflection region. This paper not only analyzes complex quantum trajectories and the total potentials for these examples but also demonstrates general properties and similar structures of the complex quantum trajectories and the quantum potentials for one-dimensional time-independent scattering problems. PMID:18433189
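
    For context, in the quantum Hamilton-Jacobi formalism used here, complex quantum trajectories are obtained by integrating the equation of motion generated by the quantum momentum function (a standard formulation, stated in my notation):

        p(x) = \frac{\hbar}{i} \frac{1}{\psi(x)} \frac{d\psi(x)}{dx}, \qquad
        m \frac{dx}{dt} = p(x), \qquad x \in \mathbb{C}

    Stagnation points sit where p(x) = 0 and poles where \psi(x) = 0, which is why the first derivative of p controls the local structure of the trajectories near a stagnation point.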

  16. Understanding the determinants of problem-solving behavior in a complex environment

    NASA Technical Reports Server (NTRS)

    Casner, Stephen A.

    1994-01-01

    It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how the complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates on the probable effects of each environmental feature.

  17. Human opinion dynamics: An inspiration to solve complex optimization problems

    NASA Astrophysics Data System (ADS)

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P.; Kapur, Pawan

    2013-10-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of formations and dynamics of opinions has been one of the most important areas in social physics. The opinion dynamics and associated social structure leads to decision making or so called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence with the disintegrative effects of individualisation, and therefore could be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired from bird flocking behaviour and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding the collective decision making.

  18. Human opinion dynamics: An inspiration to solve complex optimization problems

    PubMed Central

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P.; Kapur, Pawan

    2013-01-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of formations and dynamics of opinions has been one of the most important areas in social physics. The opinion dynamics and associated social structure leads to decision making or so called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence with the disintegrative effects of individualisation, and therefore could be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired from bird flocking behaviour and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding the collective decision making. PMID:24141795
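
    The two records above are the ADS and PubMed entries for the same paper. To give the idea some shape in code, the following is a generic continuous opinion-dynamics optimizer in Python: each agent's "opinion" is a candidate solution pulled toward a fitness-weighted consensus (social influence), with a decaying individual deviation (individualisation). This is my own sketch, not the authors' exact update rule.

        import numpy as np

        def opinion_dynamics_optimize(f, bounds, n_agents=40, iters=300,
                                      noise=0.1, seed=1):
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds, dtype=float).T
            x = rng.uniform(lo, hi, (n_agents, len(lo)))      # opinions
            for t in range(iters):
                fitness = np.apply_along_axis(f, 1, x)
                w = fitness.max() - fitness + 1e-12           # better -> heavier
                consensus = (w[:, None] * x).sum(axis=0) / w.sum()
                sigma = noise * (1 - t / iters)               # individualisation decays
                x = x + 0.5 * (consensus - x) + rng.normal(0.0, sigma, x.shape)
                x = np.clip(x, lo, hi)
            best = np.apply_along_axis(f, 1, x).argmin()
            return x[best], f(x[best])

        # e.g. minimize a 3-D sphere function
        print(opinion_dynamics_optimize(lambda p: float((p ** 2).sum()),
                                        [(-5.0, 5.0)] * 3))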

  19. Applied social and behavioral science to address complex health problems.

    PubMed

    Livingood, William C; Allegrante, John P; Airhihenbuwa, Collins O; Clark, Noreen M; Windsor, Richard C; Zimmerman, Marc A; Green, Lawrence W

    2011-11-01

    Complex and dynamic societal factors continue to challenge the capacity of the social and behavioral sciences in preventive medicine and public health to overcome the most seemingly intractable health problems. This paper proposes a fundamental shift from a research approach that presumes to identify (from highly controlled trials) universally applicable interventions expected to be implemented "with fidelity" by practitioners, to an applied social and behavioral science approach similar to that of engineering. Such a shift would build on and complement the recent recommendations of the NIH Office of Behavioral and Social Science Research and require reformulation of the research-practice dichotomy. It would also require disciplines now engaged in preventive medicine and public health practice to develop a better understanding of systems thinking and the science of application that is sensitive to the complexity, interactivity, and unique elements of community and practice settings. Also needed is a modification of health-related education to ensure that those entering the disciplines develop instincts and capacities as applied scientists. PMID:22011425

  20. Strategies in Forecasting Outcomes in Ethical Decision-making: Identifying and Analyzing the Causes of the Problem

    PubMed Central

    Beeler, Cheryl K.; Antes, Alison L.; Wang, Xiaoqian; Caughron, Jared J.; Thiel, Chase E.; Mumford, Michael D.

    2010-01-01

    This study examined the role of key causal analysis strategies in forecasting and ethical decision-making. Undergraduate participants took on the role of the key actor in several ethical problems and were asked to identify and analyze the causes, forecast potential outcomes, and make a decision about each problem. Time pressure and analytic mindset were manipulated while participants worked through these problems. The results indicated that forecast quality was associated with decision ethicality, and the identification of the critical causes of the problem was associated with both higher quality forecasts and higher ethicality of decisions. Neither time pressure nor analytic mindset impacted forecasts or ethicality of decisions. Theoretical and practical implications of these findings are discussed. PMID:20352056

  1. Effects of friction and heat conduction on sound propagation in ducts. [analyzing complex aerodynamic noise problems

    NASA Technical Reports Server (NTRS)

    Huerre, P.; Karamcheti, K.

    1976-01-01

    The theory of sound propagation is examined in a viscous, heat-conducting fluid, initially at rest and in a uniform state, and contained in a rigid, impermeable duct with isothermal walls. Topics covered include: (1) theoretical formulation of the small amplitude fluctuating motions of a viscous, heat-conducting and compressible fluid; (2) sound propagation in a two dimensional duct; and (3) perturbation study of the inplane modes.

  2. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    NASA Astrophysics Data System (ADS)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, a great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to the network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of

  3. Deep graphs-A general framework to represent and analyze heterogeneous complex systems across scales.

    PubMed

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, a great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to the network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of
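
    The two records above are the ADS and PubMed entries for the same paper. The supernode/superedge idea can be illustrated with plain property tables (a toy Python sketch using pandas; the paper comes with its own implementation, whose API is not reproduced here):

        import pandas as pd

        # Nodes and edges as property tables (properties are arbitrary objects).
        nodes = pd.DataFrame({
            "id": range(6),
            "type": ["sensor", "sensor", "hub", "hub", "sensor", "hub"],
            "value": [1.0, 2.5, 0.3, 4.1, 2.2, 1.7],
        })
        edges = pd.DataFrame({"src": [0, 1, 2, 3, 4], "dst": [2, 2, 3, 5, 5]})

        # Supernodes: partition the node table by "type" and aggregate properties.
        supernodes = nodes.groupby("type")["value"].agg(["count", "mean"])

        # Superedges: map each endpoint to its group and count interactions.
        node_type = nodes.set_index("id")["type"]
        typed = edges.assign(src_type=node_type.loc[edges["src"]].values,
                             dst_type=node_type.loc[edges["dst"]].values)
        superedges = typed.groupby(["src_type", "dst_type"]).size()
        print(supernodes, superedges, sep="\n\n")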

  4. Complex Problem Exercises in Developing Engineering Students' Conceptual and Procedural Knowledge of Electromagnetics

    ERIC Educational Resources Information Center

    Leppavirta, J.; Kettunen, H.; Sihvola, A.

    2011-01-01

    Complex multistep problem exercises are one way to enhance engineering students' learning of electromagnetics (EM). This study investigates whether exposure to complex problem exercises during an introductory EM course improves students' conceptual and procedural knowledge. The performance in complex problem exercises is compared to prior success…

  5. Inverse Problems in Complex Models and Applications to Earth Sciences

    NASA Astrophysics Data System (ADS)

    Bosch, M. E.

    2015-12-01

    The inference of the subsurface earth structure and properties requires the integration of different types of data, information and knowledge, by combined processes of analysis and synthesis. To support the process of integrating information, the regular concept of data inversion is evolving to expand its application to models with multiple inner components (properties, scales, structural parameters) that explain multiple data (geophysical survey data, well-logs, core data). The probabilistic inference methods provide the natural framework for the formulation of these problems, considering a posterior probability density function (PDF) that combines the information from a prior information PDF and the new sets of observations. To formulate the posterior PDF in the context of multiple datasets, the data likelihood functions are factorized assuming independence of uncertainties for data originating across different surveys. A realistic description of the earth medium requires modeling several properties and structural parameters, which relate to each other according to dependency and independency notions. Thus, conditional probabilities across model components also factorize. A common setting proceeds by structuring the model parameter space in hierarchical layers. A primary layer (e.g. lithology) conditions a secondary layer (e.g. physical medium properties), which conditions a third layer (e.g. geophysical data). In general, less structured relations within model components and data emerge from the analysis of other inverse problems. They can be described with flexibility via directed acyclic graphs, which are graphs that map dependency relations between the model components. Examples of inverse problems in complex models can be shown at various scales. At local scale, for example, the distribution of gas saturation is inferred from pre-stack seismic data and a calibrated rock-physics model. At regional scale, joint inversion of gravity and magnetic data is applied
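
    Schematically (notation mine), the hierarchical construction described here reads as follows: with independent surveys d_1, ..., d_k and a prior in which a primary layer (e.g., lithology m_l) conditions a secondary layer of physical properties m_p, the posterior PDF factorizes as

        \sigma(m_l, m_p \mid d_1, \ldots, d_k) \;\propto\;
        \rho(m_l)\, \rho(m_p \mid m_l) \prod_{i=1}^{k} L_i(d_i \mid m_p)

    where each L_i is the likelihood of one survey; independence of uncertainties across surveys justifies factorizing the joint likelihood.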

  6. Can SNOMED CT fulfill the vision of a compositional terminology? Analyzing the use case for Problem List

    PubMed Central

    Campbell, James R.; Xu, Junchuan; Fung, Kin Wah

    2011-01-01

    We analyzed 598 of 63,952 terms employed in problem list entries from seven major healthcare institutions that were not mapped with UMLS to SNOMED CT when preparing the NLM UMLS-CORE problem list subset. We intended to determine whether published or post-coordinated SNOMED concepts could accurately capture the problems as stated by the clinician and to characterize the workload for the local terminology manager. From the terms we analyzed, we estimate that 7.5% of the total terms represent ambiguous statements that require clarification. Of those terms which were unambiguous, we estimate that 38.1% could be encoded using the SNOMED CT January 2011 pre-coordinated (published core) content, while 60.4% required post-coordination to capture the term meaning within the SNOMED model. Approximately 28.5% of post-coordinated content could not be fully defined and required primitive forms. This left 1.5% of unambiguous terms whose meaning could not be represented in SNOMED CT. We estimate from our study that 98.5% of clinical terms unambiguously suggested for the problem list can be equated to published concepts or can be modeled with SNOMED CT, but that roughly one in four SNOMED-modeled expressions fails to represent the full meaning of the term. Implications for the business model of the local terminology manager and the development of SNOMED CT are discussed. PMID:22195069

  7. A novel approach to analyze membrane proteins by laser mass spectrometry: from protein subunits to the integral complex.

    PubMed

    Morgner, Nina; Kleinschroth, Thomas; Barth, Hans-Dieter; Ludwig, Bernd; Brutschy, Bernhard

    2007-08-01

    A novel laser-based mass spectrometry method termed LILBID (laser-induced liquid bead ion desorption) is applied to analyze large integral membrane protein complexes and their subunits. In this method the ions are IR-laser desorbed from aqueous microdroplets containing the hydrophobic protein complexes solubilized by detergent. The method is highly sensitive, very efficient in sample handling, relatively tolerant to various buffers, and detects the ions in narrow, mainly low-charge-state distributions. The crucial experimental parameter determining whether the integral complex or its subunits are observed is the laser intensity: at a very low intensity level, corresponding to ultrasoft desorption, the intact complexes, together with a few detergent molecules, are transferred into vacuum. Under these conditions the oligomerization state of the complex (i.e., its quaternary structure) may be analyzed. At higher laser intensity, complexes are thermolyzed into subunits, with any residual detergent being stripped off to yield the true mass of the polypeptides. The model complexes studied are derived from the respiratory chain of the soil bacterium Paracoccus denitrificans and include complexes III (cytochrome bc(1) complex) and IV (cytochrome c oxidase). These are well-characterized multi-subunit membrane proteins, with the individual hydrophobic subunits being composed of up to 12 transmembrane helices. PMID:17544294

  8. Eye-Tracking Study of Complexity in Gas Law Problems

    ERIC Educational Resources Information Center

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  9. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  10. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  11. Using New Models to Analyze Complex Regularities of the World: Commentary on Musso et al. (2013)

    ERIC Educational Resources Information Center

    Nokelainen, Petri; Silander, Tomi

    2014-01-01

    This commentary on the recent article by Musso et al. (2013) discusses issues related to model fitting, comparison of the classification accuracy of generative and discriminative models, and the two (or more) cultures of data modeling. We start by questioning the extremely high classification accuracy achieved with empirical data from a complex domain. There is…

  12. The Influence of Prior Experience and Process Utilization in Solving Complex Problems.

    ERIC Educational Resources Information Center

    Sterner, Paula; Wedman, John

    By using ill-structured problems and examining problem-solving processes, this study was conducted to explore the nature of solving complex, multistep problems, focusing on how prior knowledge, problem-solving process utilization, and analogical problem solving are related to success. Twenty-four college students qualified to participate by…

  13. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    Emerging large datasets have made efficient data processing a much more difficult task for traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  14. Complex Problem Solving in Educational Contexts--Something beyond "g": Concept, Assessment, Measurement Invariance, and Construct Validity

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Molnar, Gyongyver; Fischer, Andreas; Funke, Joachim; Csapo, Beno

    2013-01-01

    Innovative assessments of cross-curricular competencies such as complex problem solving (CPS) have currently received considerable attention in large-scale educational studies. This study investigated the nature of CPS by applying a state-of-the-art approach to assess CPS in high school. We analyzed whether two processes derived from cognitive…

  15. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we…

  16. A note on the Dirichlet problem for model complex partial differential equations

    NASA Astrophysics Data System (ADS)

    Ashyralyev, Allaberen; Karaca, Bahriye

    2016-08-01

    Complex model partial differential equations of arbitrary order are considered. The uniqueness of the Dirichlet problem is studied. It is proved that the Dirichlet problem for higher-order complex partial differential equations in one complex variable has infinitely many solutions.

  17. A complexity analysis of space-bounded learning algorithms for the constraint satisfaction problem

    SciTech Connect

    Bayardo, R.J. Jr.; Miranker, D.P.

    1996-12-31

    Learning during backtrack search is a space-intensive process that records information (such as additional constraints) in order to avoid redundant work. In this paper, we analyze the effects of polynomial-space-bounded learning on runtime complexity of backtrack search. One space-bounded learning scheme records only those constraints with limited size, and another records arbitrarily large constraints but deletes those that become irrelevant to the portion of the search space being explored. We find that relevance-bounded learning allows better runtime bounds than size-bounded learning on structurally restricted constraint satisfaction problems. Even when restricted to linear space, our relevance-bounded learning algorithm has runtime complexity near that of unrestricted (exponential space-consuming) learning schemes.
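
    A minimal Python sketch of the two bounded-learning schemes contrasted above (illustrative data structures; the authors' algorithms and runtime bounds are not reproduced here). A "nogood" is a partial assignment proven unsatisfiable during search:

        def record_size_bounded(store, nogood, k):
            """Size-bounded learning: keep a learned nogood only if it
            involves at most k variables."""
            if len(nogood) <= k:
                store.append(dict(nogood))

        def prune_relevance_bounded(store, assignment, k):
            """Relevance-bounded learning: keep nogoods of any size, but
            delete those differing from the current partial assignment on
            more than k variables (they are no longer relevant to the
            portion of the search space being explored)."""
            def diff(nogood):
                return sum(1 for var, val in nogood.items()
                           if assignment.get(var) != val)
            store[:] = [ng for ng in store if diff(ng) <= k]

        store = []
        record_size_bounded(store, {"x1": 0, "x2": 1}, k=2)   # kept: 2 <= 2
        prune_relevance_bounded(store, {"x1": 0}, k=1)        # kept: differs on x2 only
        print(store)  # [{'x1': 0, 'x2': 1}]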

  18. The Interfacial Interaction Problem in Complex Multiple Porosity Fractured Reservoirs

    NASA Astrophysics Data System (ADS)

    Suarez-Arriaga, Mario-Cesar

    2003-04-01

    Many productive reservoirs (oil, gas, water, geothermal) are associated with natural fracturing. Fault zones and fractures act as open networks for fluid and energy flow from depth. Their petrophysical parameters are heterogeneous and randomly distributed, forming extremely complex natural systems. Here, the simultaneous heat and mass flows are coupled to the deformation of thermoporoelastic rocks. The system's volume is divided into N interacting continua, each one occupying a region of space $V_n$ bounded by a surface $S_n$ ($n = 1, \ldots, N$). The mass flow is represented by $\frac{\partial}{\partial t}\int_{V_n} \rho_f \phi \, dV + \int_{S_n} \vec{F}_M \cdot \vec{n} \, dS = \int_{V_n} q_M \, dV$ (3). Taking into account a non-isothermal process, the coupled energy equation is $\frac{\partial}{\partial t}\int_{V_n} [\phi \rho_f h_f + (1 - \phi)\rho_r h_r] \, dV + \int_{S_n} \vec{F}_E \cdot \vec{n} \, dS = \int_{V_n} q_E \, dV$ (4), where $t$ is time, $\phi$ is porosity, $\rho_f$ and $\rho_r$ are fluid and rock densities, $\vec{F}_M$ and $\vec{F}_E$ are total mass and energy flows, $q_M$ and $q_E$ are volumetric mass and energy extracted from or injected into $V_n$, and $h_f$ and $h_r$ are specific enthalpies of fluid and rock, respectively. Rock deformation is coupled through the equation $\vec{\nabla} \cdot \left( \frac{\rho_f}{\mu} K \cdot \vec{\nabla} p_\phi \right)_{V_n} = \phi \left( D_t \rho_f + \frac{\rho_f}{V_\phi} \frac{dV_\phi}{dt} \right)_{V_n}$ (5), where $K$ is the absolute permeability tensor, $\mu$ is the dynamic fluid viscosity, $D_t$ is a total derivative, $p_\phi$ is pore pressure, and $V_\phi$ is the volume of pores in $V_n$. The N media interact with each other; each has its own parameters and its own interporosity flow. Modelling these coupled phenomena requires averaging highly contrasting physical properties, independently of the method used to solve the equations. Much attention has been devoted to developing realistic numerical models to describe flows in reservoirs under exploitation, but to the best of our knowledge very little attention has been focused on the problem of interfacial interaction and averaging petrophysical parameters in multiple porosity reservoirs.

  19. Mass spectrometric methods to analyze the structural organization of macromolecular complexes.

    PubMed

    Rajabi, Khadijeh; Ashcroft, Alison E; Radford, Sheena E

    2015-11-01

    With the development of soft ionization techniques such as electrospray ionization (ESI), mass spectrometry (MS) has found widespread application in structural biology. The ability to transfer large biomolecular complexes intact into the gas-phase, combined with the low sample consumption and high sensitivity of MS, has made ESI-MS a method of choice for the characterization of macromolecules. This paper describes the application of MS to study large non-covalent complexes. We categorize the available techniques in two groups. First, solution-based techniques in which the biomolecules are labeled in solution and subsequently characterized by MS. Three MS-based techniques are discussed, namely hydroxyl radical footprinting, cross-linking and hydrogen/deuterium exchange (HDX) MS. In the second group, MS-based techniques to probe intact biomolecules in the gas-phase, e.g. side-chain microsolvation, HDX and ion mobility spectrometry are discussed. Together, the approaches place MS as a powerful methodology for an ever growing plethora of structural applications. PMID:25782628

  20. Detrended Partial-Cross-Correlation Analysis: A New Method for Analyzing Correlations in Complex System

    PubMed Central

    Yuan, Naiming; Fu, Zuntao; Zhang, Huan; Piao, Lin; Xoplaki, Elena; Luterbacher, Juerg

    2015-01-01

    In this paper, a new method, detrended partial-cross-correlation analysis (DPCCA), is proposed. The method builds on detrended cross-correlation analysis (DCCA) by incorporating a partial-correlation technique, so that it can quantify the relations between two non-stationary signals (with the influences of other signals removed) on different time scales. We illustrate the advantages of this method by performing two numerical tests. Test I shows the advantages of DPCCA in handling non-stationary signals, while Test II reveals the “intrinsic” relations between two considered time series with the potential influences of other, unconsidered signals removed. To further show the utility of DPCCA in natural complex systems, we provide new evidence on the winter-time Pacific Decadal Oscillation (PDO) and the winter-time Nino3 Sea Surface Temperature Anomaly (Nino3-SSTA) affecting the Summer Rainfall over the middle-lower reaches of the Yangtze River (SRYR). By applying DPCCA, significant correlations between SRYR and Nino3-SSTA on time scales of 6 ~ 8 years are found over the period 1951 ~ 2012, while significant correlations between SRYR and PDO arise on time scales of 35 years. With these physically explainable results, we are confident that DPCCA is a useful method for addressing complex systems. PMID:25634341
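
    The partial-correlation step that DPCCA layers on top of DCCA can be illustrated in Python; the sketch below assumes the standard inverse-matrix formulation of partial correlation applied to a matrix of DCCA coefficients at one time scale (an assumption for illustration; see the paper for the full algorithm):

        import numpy as np

        def partial_corr(C, i, j):
            """Correlation of signals i and j with all other signals in C
            controlled for, from the inverse of the correlation matrix."""
            P = np.linalg.inv(C)
            return -P[i, j] / np.sqrt(P[i, i] * P[j, j])

        # Toy DCCA coefficient matrix: signals 0 and 1 both driven by signal 2.
        C = np.array([[1.0, 0.5, 0.7],
                      [0.5, 1.0, 0.7],
                      [0.7, 0.7, 1.0]])
        print(partial_corr(C, 0, 1))  # ~0.02, far below the raw 0.5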

  1. Taking advantage of local structure descriptors to analyze interresidue contacts in protein structures and protein complexes.

    PubMed

    Martin, Juliette; Regad, Leslie; Etchebest, Catherine; Camproux, Anne-Claude

    2008-11-15

    Interresidue contacts within protein structures and at protein-protein interfaces are classically described by the amino acid types of the interacting residues, and the local structural context of the contact, if any, is described using secondary structures. In this study, we present an alternative analysis of interresidue contacts using local structures defined by the structural alphabet introduced by Camproux et al. This structural alphabet makes it possible to describe a 3D structure as a sequence of prototype fragments called structural letters, of 27 different types. Each residue can then be assigned to a particular local structure, even in loop regions. The analysis of interresidue contacts within protein structures, defined using Voronoï tessellations, reveals that pairwise contact specificity is greater in terms of structural letters than in terms of amino acids. Using a simple heuristic based on specificity score comparison, we find that 74% of the long-range contacts within protein structures are better described using structural letters than amino acid types. The investigation is extended to a set of protein-protein complexes, showing that similar global rules apply as for intraprotein contacts, with 64% of the interprotein contacts best described by local structures. We then present an evaluation of pairing functions integrating structural letters for decoy scoring and show that some complexes could benefit from the use of structural letter-based pairing functions. PMID:18491388
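
    One way to make the notion of pairwise contact specificity concrete is a log-odds score of observed versus expected contact frequencies. The Python sketch below is an assumed formulation for illustration, not the paper's exact measure:

        from collections import Counter
        from math import log

        # Toy contacts between structural letters (27 types in the real alphabet).
        contacts = [("a", "b"), ("a", "b"), ("a", "c"), ("b", "c")]

        pair_counts = Counter(tuple(sorted(p)) for p in contacts)
        letter_counts = Counter(x for p in contacts for x in p)
        n_pairs = sum(pair_counts.values())
        n_letters = sum(letter_counts.values())

        def specificity(x, y):
            """log(observed pair frequency / frequency expected by chance)."""
            obs = pair_counts[tuple(sorted((x, y)))] / n_pairs
            exp = (letter_counts[x] / n_letters) * (letter_counts[y] / n_letters)
            if x != y:
                exp *= 2  # an unordered pair can occur two ways
            return log(obs / exp)

        print(specificity("a", "b"))  # > 0: contact enriched relative to chance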

  2. Technologically Mediated Complex Problem-Solving on a Statistics Task

    ERIC Educational Resources Information Center

    Scanlon, Eileen; Blake, Canan; Joiner, Richard; O'Shea, Tim

    2005-01-01

    Simulations on computers can allow many experiments to be conducted quickly to help students develop an understanding of statistical topics. We used a simulation of a challenging problem in statistics as the focus of an exploration of situations where members of a problem-solving group are physically separated then reconnected via combinations of…

  3. THE ROLE OF PROBLEM SOLVING IN COMPLEX INTRAVERBAL REPERTOIRES

    PubMed Central

    Sautter, Rachael A; LeBlanc, Linda A; Jay, Allison A; Goldsmith, Tina R; Carr, James E

    2011-01-01

    We examined whether typically developing preschoolers could learn to use a problem-solving strategy that involved self-prompting with intraverbal chains to provide multiple responses to intraverbal categorization questions. Teaching the children to use the problem-solving strategy did not produce significant increases in target responses until problem solving was modeled and prompted. Following the model and prompts, all participants showed immediate significant increases in intraverbal categorization, and all prompts were quickly eliminated. Use of audible self-prompts was evident initially for all participants, but declined over time for 3 of the 4 children. Within-session response patterns remained consistent with use of the problem-solving strategy even when self-prompts were not audible. These findings suggest that teaching and prompting a problem-solving strategy can be an effective way to produce intraverbal categorization responses. PMID:21709781

  4. Nuclear three-body problem in the complex energy plane: Complex-scaling Slater method

    NASA Astrophysics Data System (ADS)

    Kruppa, A. T.; Papadimitriou, G.; Nazarewicz, W.; Michel, N.

    2014-01-01

    Background: The physics of open quantum systems is an interdisciplinary area of research. The nuclear "openness" manifests itself through the presence of the many-body continuum representing various decay, scattering, and reaction channels. As the radioactive nuclear beam experimentation extends the known nuclear landscape toward the particle drip lines, the coupling to the continuum space becomes exceedingly more important. Of particular interest are weakly bound and unbound nuclear states appearing around particle thresholds. Theories of such nuclei must take into account their open quantum nature. Purpose: To describe open quantum systems, we introduce a complex-scaling (CS) approach in the Slater basis. We benchmark it with the complex-energy Gamow shell model (GSM) by studying energies and wave functions of the bound and unbound states of the two-neutron halo nucleus 6He viewed as an α +n+n cluster system. Methods: Both CS and GSM approaches are applied to a translationally invariant Hamiltonian with the two-body interaction approximated by the finite-range central Minnesota force. In the CS approach, we use the Slater basis, which exhibits the correct asymptotic behavior at large distances. To extract particle densities from the back-rotated CS solutions, we apply the Tikhonov regularization procedure, which minimizes the ultraviolet numerical noise. Results: We show that the CS-Slater method is both accurate and efficient. Its equivalence to the GSM approach has been demonstrated numerically for both energies and wave functions of 6He. One important technical aspect of our calculation was to fully retrieve the correct asymptotic behavior of a resonance state from the complex-scaled (square-integrable) wave function. While standard applications of the inverse complex transformation to the complex-rotated solution provide unstable results, the stabilization method fully reproduces the GSM benchmark. We also propose a method to determine the smoothing
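
    For reference, the textbook form of the complex-scaling transformation invoked above (general identities, not quoted from the paper), in LaTeX:

        % radial coordinates are rotated into the complex plane
        r \;\longrightarrow\; r\,e^{i\theta}, \qquad
        H(\theta) = U(\theta)\,H\,U(\theta)^{-1}.
        % The rotated continua pivot by -2\theta about their thresholds, and
        % resonance poles at E = E_r - i\Gamma/2 appear as square-integrable
        % eigenstates once 2\theta exceeds the resonance angle.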

  5. Complex Causal Process Diagrams for Analyzing the Health Impacts of Policy Interventions

    PubMed Central

    Joffe, Michael; Mindell, Jennifer

    2006-01-01

    Causal diagrams are rigorous tools for controlling confounding. They also can be used to describe complex causal systems, which is done routinely in communicable disease epidemiology. The use of change diagrams has advantages over static diagrams, because change diagrams are more tractable, relate better to interventions, and have clearer interpretations. Causal diagrams are a useful basis for modeling. They make assumptions explicit, provide a framework for analysis, generate testable predictions, explore the effects of interventions, and identify data gaps. Causal diagrams can be used to integrate different types of information and to facilitate communication both among public health experts and between public health experts and experts in other fields. Causal diagrams allow the use of instrumental variables, which can help control confounding and reverse causation. PMID:16449586

  6. Teaching Problem Solving; the Effect of Algorithmic and Heuristic Problem Solving Training in Relation to Task Complexity and Relevant Aptitudes.

    ERIC Educational Resources Information Center

    de Leeuw, L.

    Sixty-four fifth and sixth-grade pupils were taught number series extrapolation by either an algorithm, fully prescribed problem-solving method or a heuristic, less prescribed method. The trained problems were within categories of two degrees of complexity. There were 16 subjects in each cell of the 2 by 2 design used. Aptitude Treatment…

  7. The Fallacy of Univariate Solutions to Complex Systems Problems.

    PubMed

    Lessov-Schlaggar, Christina N; Rubin, Joshua B; Schlaggar, Bradley L

    2016-01-01

    Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems: univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument. PMID:27375425

  8. The Fallacy of Univariate Solutions to Complex Systems Problems

    PubMed Central

    Lessov-Schlaggar, Christina N.; Rubin, Joshua B.; Schlaggar, Bradley L.

    2016-01-01

    Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems—univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument. PMID:27375425

  9. Problems in processing multizonal video information at specialized complexes

    NASA Technical Reports Server (NTRS)

    Shamis, V. A.

    1979-01-01

    Architectural requirements of a minicomputer-based specialized complex for automated digital analysis of multizonal video data are examined. The logic structure of multizonal video data and the complex mathematical provision required for the analysis of such data are described. The composition of the specialized complex, its operating system, and the required set of peripheral devices are discussed. It is noted that although much of the analysis can be automated, the operator-computer dialog mode is essential for certain stages of the analysis.

  10. Asbestos quantification in track ballast, a complex analytical problem

    NASA Astrophysics Data System (ADS)

    Cavallo, Alessandro

    2016-04-01

    Track ballast forms the trackbed upon which railroad ties are laid. It is used to bear the load from the railroad ties, to facilitate water drainage, and also to keep down vegetation. It is typically made of angular crushed stone with a grain size between 30 and 60 mm and good mechanical properties (high compressive strength, freeze-thaw resistance, resistance to fragmentation). The most common rock types are basalts, porphyries, orthogneisses, some carbonatic rocks and "green stones" (serpentinites, prasinites, amphibolites, metagabbros). Especially the "green stones" may contain traces, and sometimes appreciable amounts, of asbestiform minerals (chrysotile and/or fibrous amphiboles, generally tremolite - actinolite). In Italy, the chrysotile asbestos mine in Balangero (Turin) produced over 5 Mt of railroad ballast (crushed serpentinites), which was used for the railways in northern and central Italy from 1930 up to 1990. In addition to Balangero, several other serpentinite and prasinite quarries (e.g. in Emilia Romagna) provided railway ballast up to the year 2000. The legal threshold for asbestos content in track ballast is established at 1000 ppm: if the value is below this threshold, the material can be reused; otherwise it must be disposed of as hazardous waste, at very high cost. The quantitative determination of asbestos in rocks is a very complex analytical issue: although techniques like TEM-SAED and micro-Raman are very effective in the identification of asbestos minerals, a quantitative determination on bulk materials is almost impossible, or extremely expensive and time consuming. Another problem is represented by the discrimination of asbestiform minerals (e.g. chrysotile, asbestiform amphiboles) from the common acicular - pseudo-fibrous varieties (lamellar serpentine minerals, prismatic/acicular amphiboles). In this work, more than 200 samples from the main Italian rail yards were characterized by a combined use of XRD and a special SEM

  11. Problem analysis of geotechnical well drilling in complex environment

    NASA Astrophysics Data System (ADS)

    Kasenov, A. K.; Biletskiy, M. T.; Ratov, B. T.; Korotchenko, T. V.

    2015-02-01

    The article examines the primary causes of problems occurring during the drilling of geotechnical wells (injection, production and monitoring wells) for in-situ leaching to extract uranium in South Kazakhstan. One such drilling problem, hole caving, which is basically caused by various chemical and physical factors (hydraulic, mechanical, etc.), has been thoroughly investigated. The analysis of packing causes reveals that this problem usually occurs because of an insufficient amount of drilling mud, associated with a small cross-section downward flow and a relatively large cross-section upward flow. This is explained by the fact that when spear bores are used to drill clay rocks, the cuttings are usually rather large and there is a risk that clay particles will coagulate.

  12. [Problems of formal organizational structure of industrial health care complexes].

    PubMed

    Włodarczyk, C

    1978-01-01

    The author formulates the thesis that describing the organizational structure of an industrial health care complex calls for isolating the following aspects: structure of territorial links; system of organizational units and divisions; organization of basic functions; structure of management; structure of supervision of middle- and lower-level personnel; composition of the health care complex council; and system of accessibility ranges. Each of the above aspects is considered on the basis of the rules of law in force, using methods of organizational analysis. PMID:745544

  13. Client-Centered Problem-Solving Networks in Complex Organizations.

    ERIC Educational Resources Information Center

    Tucker, Charles; Hanna, Michael

    Employees in different kinds of organizations were surveyed for their perceptions of their companies' client and operational problem-solving networks. The individuals came from a manufacturing firm, a community college, a telephone company, a farmers' cooperative, and a hospital. Interviews were conducted with those people reporting numerous…

  14. The Teaching-Upbringing Complex: Experience, Problems, Prospects.

    ERIC Educational Resources Information Center

    Vul'fov, B. Z.; And Others

    1990-01-01

    Describes the teaching-upbringing complex (UVK), a new type of Soviet school that attempts to deal with raising and educating children in an integrated manner. Stresses combining required subjects with students' special interests to encourage student achievement and teacher involvement. Concentrates on the development of self-expression and…

  15. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems.

    PubMed

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-03-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  16. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems

    PubMed Central

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-01-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  17. On the Complexity of the Asymmetric VPN Problem

    NASA Astrophysics Data System (ADS)

    Rothvoß, Thomas; Sanità, Laura

    We give the first constant-factor approximation algorithm for the asymmetric Virtual Private Network (VPN) problem with arbitrary concave costs. We show the stronger result that there is always a tree solution of cost at most 2·OPT and that a tree solution of (expected) cost at most 49.84·OPT can be determined in polynomial time.

  18. Problem-oriented stereo vision quality evaluation complex

    NASA Astrophysics Data System (ADS)

    Sidorchuk, D.; Gusamutdinova, N.; Konovalenko, I.; Ershov, E.

    2015-12-01

    We describe an original low-cost hardware setup for efficient testing of stereo vision algorithms. The method combines a special hardware setup with a mathematical model; it is easy to construct and precise in the applications of interest. For a known scene we derive its analytical representation, called the virtual scene. Using a four-point correspondence between the scene and the virtual one, we compute the extrinsic camera parameters and project the virtual scene onto the image plane, which provides the ground truth for the depth map. Another result presented in this paper is a new depth map quality metric. Its main purpose is to tune stereo algorithms for a particular problem, e.g. obstacle avoidance.
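
    The projection step described above can be sketched with a pinhole camera model in Python; the intrinsics K, pose (R, t), and points below are invented for illustration (the paper's calibration pipeline is not reproduced):

        import numpy as np

        K = np.array([[800.0, 0.0, 320.0],   # assumed focal lengths / principal point
                      [0.0, 800.0, 240.0],
                      [0.0, 0.0, 1.0]])
        R, t = np.eye(3), np.zeros(3)        # extrinsics from the four-point fit

        def project(points_world):
            """(N, 3) world points -> (N, 2) pixel coords and (N,) depths."""
            cam = points_world @ R.T + t     # world frame -> camera frame
            uvw = cam @ K.T                  # camera frame -> homogeneous pixels
            return uvw[:, :2] / uvw[:, 2:3], cam[:, 2]

        px, depth = project(np.array([[0.1, -0.2, 2.0], [0.0, 0.0, 4.0]]))
        print(px[0], depth[0])               # [360. 160.] 2.0 -> ground-truth depth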

  19. On the problem of constructing a modern, economic radiotelescope complex

    NASA Technical Reports Server (NTRS)

    Bogomolov, A. F.; Sokolov, A. G.; Poperechenko, B. A.; Polyak, V. S.

    1977-01-01

    Criteria for comparing and planning the technical and economic characteristics of large parabolic reflector antenna systems and other types used in radioastronomy and deep space communications are discussed. The experience gained in making and optimizing a series of highly efficient parabolic antennas in the USSR is reviewed. Several ways are indicated for further improving the complex characteristics of antennas similar to the original TNA-1500 64m radio telescope. The suggestions can be applied in planning the characteristics of radiotelescopes which are now being built, in particular, the TNA-8000 with a diameter of 128 m.

  20. Assessing Complex Problem-Solving Skills and Knowledge Assembly Using Web-Based Hypermedia Design.

    ERIC Educational Resources Information Center

    Dabbagh, Nada

    This research project studied the effects of hierarchical versus heterarchical hypermedia structures of Web-based case representations on complex problem-solving skills and knowledge assembly in problem-centered learning environments in order to develop a system or model that informs the design of Web-based cases for ill-structured problems across…

  1. Complex multipole beam approach to electromagnetic scattering problems

    NASA Astrophysics Data System (ADS)

    Mittra, Raj; Boag, Amir

    1994-03-01

    A novel approach to reducing the matrix size associated with the Method of Moments (MoM) solution of the problem of electromagnetic scattering from arbitrarily shaped closed bodies is presented in this paper. The key step in this approach is to represent the scattered field in terms of a series of beams produced by multipole sources that resemble Gabor basis functions. By utilizing the properties of the Gabor series, guidelines for selecting the orders as well as the locations of the multipole sources are developed. It is shown that the present approach not only reduces the number of unknowns, but also generates a generalized impedance matrix with a banded structure and a low condition number. The accuracy of the proposed method is verified by comparing the numerical results with those derived using the method of moments.

  2. How Students Circumvent Problem-Solving Strategies that Require Greater Cognitive Complexity.

    ERIC Educational Resources Information Center

    Niaz, Mansoor

    1996-01-01

    Analyzes the great diversity in problem-solving strategies used by students in solving a chemistry problem and discusses the relationship between these variables and different cognitive variables. Concludes that students try to circumvent certain problem-solving strategies by adapting flexible and stylistic innovations that render the cognitive…

  3. Dusty (complex) plasmas: recent developments, advances, and unsolved problems

    NASA Astrophysics Data System (ADS)

    Popel, Sergey

    The area of dusty (complex) plasma research is a vibrant subfield of plasma physics that belongs to frontier research in physical sciences. This area is intrinsically interdisciplinary and encompasses astrophysics, planetary science, atmospheric science, magnetic fusion energy science, and various applied technologies. The research in dusty plasma started after two major discoveries in very different areas: (1) the discovery by the Voyager 2 spacecraft in 1980 of the radial spokes in Saturn's B ring, and (2) the discovery in the early 1980s of the growth of contaminating dust particles in plasma processing. Dusty plasmas are ubiquitous in the universe; examples are proto-planetary and solar nebulae, molecular clouds, supernovae explosions, the interplanetary medium, circumsolar rings, and asteroids. Within the solar system, we have planetary rings (e.g., Saturn and Jupiter), the Martian atmosphere, cometary tails and comae, dust clouds on the Moon, etc. Close to the Earth, there are noctilucent clouds and polar mesospheric summer echoes, which are clouds of tiny (charged) ice particles that are formed in the summer polar mesosphere at altitudes of about 82-95 km. Dust and dusty plasmas are also found in the vicinity of artificial satellites and space stations. Dust also turns out to be common in laboratory plasmas, such as in the processing of semiconductors and in tokamaks. In processing plasmas, dust particles are actually grown in the discharge from the reactive gases used to form the plasmas. An example of the relevance of industrial dusty plasmas is the growth of silicon microcrystals for improved solar cells in the future. In fact, nanostructured polymorphous silicon films provide solar cells with high and time-stable efficiency. These nano-materials can also be used for the fabrication of ultra-large-scale integration circuits, display devices, single electron devices, light emitting diodes, laser diodes, and others. In microelectronic industries, dust has to be

  4. Measurements of student understanding on complex scientific reasoning problems

    NASA Astrophysics Data System (ADS)

    Izumi, Alisa Sau-Lin

    While there has been much discussion of cognitive processes underlying effective scientific teaching, less is known about the response nature of assessments targeting processes of scientific reasoning specific to biology content. This study used multiple-choice (m-c) and short-answer essay student responses to evaluate progress in high-order reasoning skills. In a pilot investigation of student responses on a non-content-based test of scientific thinking, it was found that some students showed a pre-post gain on the m-c test version while showing no gain on a short-answer essay version of the same questions. This result led to a subsequent research project focused on differences between alternate versions of tests of scientific reasoning. Using m-c and written responses from biology tests targeted toward the skills of (1) reasoning with a model and (2) designing controlled experiments, test score frequencies, factor analysis, and regression models were analyzed to explore test format differences. Understanding the format differences in tests is important for the development of practical ways to identify student gains in scientific reasoning. The overall results suggested test format differences. Factor analysis revealed three interpretable factors: m-c format, genetics content, and model-based reasoning. Frequency distributions on the m-c and open-explanation portions of the hybrid items revealed that many students answered the m-c portion of an item correctly but gave inadequate explanations; in other instances, students answered the m-c portion incorrectly yet gave a sufficient explanation, or answered the m-c portion correctly while providing a poor explanation. When trying to fit test score predictors for non-associated student measures (VSAT, MSAT, high school grade point average, or final course grade), the test scores accounted for close to zero percent of the variance. Overall, these results point to the importance of using multiple methods of testing and of

  5. Sleep, Cognition, and Behavioral Problems in School-Age Children: A Century of Research Meta-Analyzed

    ERIC Educational Resources Information Center

    Astill, Rebecca G.; Van der Heijden, Kristiaan B.; Van IJzendoorn, Marinus H.; Van Someren, Eus J. W.

    2012-01-01

    Clear associations of sleep, cognitive performance, and behavioral problems have been demonstrated in meta-analyses of studies in adults. This meta-analysis is the first to systematically summarize all relevant studies reporting on sleep, cognition, and behavioral problems in healthy school-age children (5-12 years old) and incorporates 86 studies…

  6. One Problem, Many Solutions: Simple Statistical Approaches Help Unravel the Complexity of the Immune System in an Ecological Context

    PubMed Central

    Matson, Kevin D.; Tieleman, B. Irene

    2011-01-01

    The immune system is a complex collection of interrelated and overlapping solutions to the problem of disease. To deal with this complexity, researchers have devised multiple ways to measure immune function and to analyze the resulting data. In this way both organisms and researchers employ many tactics to solve a complex problem. One challenge facing ecological immunologists is the question of how these many dimensions of immune function can be synthesized to facilitate meaningful interpretations and conclusions. We tackle this challenge by employing and comparing several statistical methods, which we used to test assumptions about how multiple aspects of immune function are related at different organizational levels. We analyzed three distinct datasets that characterized 1) species, 2) subspecies, and 3) among- and within-individual level differences in the relationships among multiple immune indices. Specifically, we used common principal components analysis (CPCA) and two simpler approaches, pair-wise correlations and correlation circles. We also provide a simple example of how these techniques could be used to analyze data from multiple studies. Our findings lead to several general conclusions. First, relationships among indices of immune function may be consistent among some organizational groups (e.g. months over the annual cycle) but not others (e.g. species); therefore any assumption of consistency requires testing before further analyses. Second, simple statistical techniques used in conjunction with more complex multivariate methods give a clearer and more robust picture of immune function than using complex statistics alone. Moreover, these simpler approaches have potential for analyzing comparable data from multiple studies, especially as the field of ecological immunology moves towards greater methodological standardization. PMID:21526186

  7. Mass analyzed threshold ionization of phenol·CO: Intermolecular binding energies of a hydrogen-bonded complex

    NASA Astrophysics Data System (ADS)

    Haines, Stephen R.; Dessent, Caroline E. H.; Müller-Dethlefs, Klaus

    1999-08-01

    [Phenol·CO]+ was studied using a combination of two-color resonant zero kinetic energy (ZEKE) spectroscopy and mass analyzed threshold ionization (MATI) spectroscopy to investigate the interaction of the CO ligand with a hydrogen-bonding cation. Vibrational progressions were observed in three intermolecular modes, the in-plane bend (42 cm-1), stretch (130 cm-1), and in-plane wag (160 cm-1), and are consistent with a planar hydrogen-bonded structure where the CO bonds through the carbon atom to the phenol OH group. Dissociation energies for the S0, S1, and D0 states were determined as 659±20, 849±20, and 2425±10 cm-1, respectively. The cationic and neutral dissociation energies of the phenol·CO complex are considerably stronger than those of phenol·N2, demonstrating the extent to which the larger quadrupole of CO affects the strength of binding.
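
    The three dissociation energies are linked by standard energy-cycle identities for such complexes (general relations, not quoted from the paper); in LaTeX, with M = monomer, C = complex, \nu_{00} the S_1 <- S_0 origin, and IE the ionization energy:

        D_0(S_1) - D_0(S_0) = \nu_{00}(\mathrm{M}) - \nu_{00}(\mathrm{C}), \qquad
        D_0(D_0) - D_0(S_0) = \mathrm{IE}(\mathrm{M}) - \mathrm{IE}(\mathrm{C}).
        % With the values above: 849 - 659 = 190 cm^{-1} of spectral red shift,
        % and 2425 - 659 = 1766 cm^{-1} of lowering of the ionization energy
        % upon complexation.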

  8. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…

  9. An Exploratory Framework for Handling the Complexity of Mathematical Problem Posing in Small Groups

    ERIC Educational Resources Information Center

    Kontorovich, Igor; Koichu, Boris; Leikin, Roza; Berman, Avi

    2012-01-01

    The paper introduces an exploratory framework for handling the complexity of students' mathematical problem posing in small groups. The framework integrates four facets known from past research: task organization, students' knowledge base, problem-posing heuristics and schemes, and group dynamics and interactions. In addition, it contains a new…

  10. Communities of Practice: A New Approach to Solving Complex Educational Problems

    ERIC Educational Resources Information Center

    Cashman, J.; Linehan, P.; Rosser, M.

    2007-01-01

    Communities of Practice offer state agency personnel a promising approach for engaging stakeholder groups in collaboratively solving complex and, often, persistent problems in special education. Communities of Practice can help state agency personnel drive strategy, solve problems, promote the spread of best practices, develop members'…

  11. The Ethnology of Traditional and Complex Societies. Test Edition. AAAS Study Guides on Contemporary Problems.

    ERIC Educational Resources Information Center

    Simic, Andrei

    This is one of several study guides on contemporary problems produced by the American Association for the Advancement of Science with support of the National Science Foundation. This guide focuses on the ethnology of traditional and complex societies. Part I, Simple and Complex Societies, includes three sections: (1) Introduction: Anthropologists…

  12. Development and operation of an integrated sampling probe and gas analyzer for turbulent mixing studies in complex supersonic flows

    NASA Astrophysics Data System (ADS)

    Wiswall, John D.

-temporal characteristic scales of the flow on the resulting time-area-averaged concentration measurements. Two series of experiments were performed to verify the probe's design; the first used Schlieren photography and verified that the probe sampled from the supersonic flowfield isokinetically. The second series involved traversing the probe across a free mixing layer of air and helium, to obtain both mean concentration and high-frequency measurements. High-frequency data were statistically analyzed, and inspection of the Probability Density Function (PDF) of the hot-film response was instrumental in interpreting how well the resulting average mixing measurements represent these types of complex flows. The probe is minimally intrusive, has accuracy comparable to its predecessors, has an improved frequency response for mean concentration measurements, and samples from a very small area in the flowfield.

  13. Complex Cervical Aortic Arch With Hypoplasia: A Simple Solution to a Complex Problem.

    PubMed

    Rajbanshi, Bijoy G; Gautam, Navin C; Pradhan, Sidhartha; Sharma, Apurb; Ghimire, Ram K; Joyce, Lyle D

    2016-07-01

    We report a rare case of a 6-year-old boy with a complex right-sided cervical aortic arch, with a retroesophageal hypoplastic transverse arch, the left subclavian artery arising from the Kommerell diverticulum of the descending aorta, and a vascular ring formed by the ductus ligament. An extraanatomic ascending-to-descending aorta bypass was performed through a median sternotomy, along with division of the ductus ligament, without complications and with good results. PMID:27343523

  14. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    PubMed

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system. PMID:26790689

  15. Exhaustive expansion: A novel technique for analyzing complex data generated by higher-order polychromatic flow cytometry experiments

    PubMed Central

    2010-01-01

    Background: The complex data sets generated by higher-order polychromatic flow cytometry experiments are a challenge to analyze. Here we describe Exhaustive Expansion, a data analysis approach for deriving hundreds to thousands of cell phenotypes from raw data, and for interrogating these phenotypes to identify populations of biological interest given the experimental context. Methods: We apply this approach to two studies, illustrating its broad applicability. The first examines the longitudinal changes in circulating human memory T cell populations within individual patients in response to a melanoma peptide (gp100(209-2M)) cancer vaccine, using 5 monoclonal antibodies (mAbs) to delineate subpopulations of viable, gp100-specific, CD8+ T cells. The second study measures the mobilization of stem cells in porcine bone marrow that may be associated with wound healing, and uses 5 different staining panels consisting of 8 mAbs each. Results: In the first study, our analysis suggests that the cell surface markers CD45RA, CD27 and CD28, commonly used in historical lower-order (2-4 color) flow cytometry analysis to distinguish memory from naïve and effector T cells, may not be obligate parameters in defining central memory T cells (TCM). In the second study, we identify novel phenotypes such as CD29+CD31+CD56+CXCR4+CD90+Sca1-CD44+, which may characterize progenitor cells that are significantly increased in wounded animals as compared to controls. Conclusions: Taken together, these results demonstrate that Exhaustive Expansion supports thorough interrogation of complex higher-order flow cytometry data sets and aids in the identification of potentially clinically relevant findings. PMID:21034498
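
    The combinatorial idea behind deriving so many phenotypes can be sketched in a few lines of Python (illustrative only, not the authors' implementation): enumerate every +/- combination of the gating markers and count the matching cells.

        from itertools import product

        markers = ["CD29", "CD31", "CD56", "CXCR4"]          # toy 4-marker panel
        cells = [                                            # one boolean per marker
            {"CD29": True, "CD31": True, "CD56": False, "CXCR4": True},
            {"CD29": True, "CD31": True, "CD56": False, "CXCR4": True},
            {"CD29": False, "CD31": True, "CD56": True, "CXCR4": False},
        ]

        phenotypes = {}
        for signs in product([True, False], repeat=len(markers)):
            name = "".join(m + ("+" if s else "-") for m, s in zip(markers, signs))
            phenotypes[name] = sum(
                all(cell[m] == s for m, s in zip(markers, signs)) for cell in cells
            )

        print(phenotypes["CD29+CD31+CD56-CXCR4+"])  # 2
        # A 4-marker panel yields 2^4 = 16 phenotypes; 8-mAb panels yield
        # 2^8 = 256, and allowing "marker ignored" as a third state grows
        # this toward 3^n -- hundreds to thousands of derived phenotypes.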

  16. Methods and Challenges of Analyzing Spatial Data for Social Work Problems: The Case of Examining Child Maltreatment Geographically

    ERIC Educational Resources Information Center

    Freisthler, Bridget; Lery, Bridgette; Gruenewald, Paul J.; Chow, Julian

    2006-01-01

    Increasingly, social work researchers are interested in examining how "place" and "location" contribute to social problems. Yet, often these researchers do not use the specialized spatial statistical techniques developed to handle the analytic issues faced when conducting ecological analyses. This article explains the importance of these…

  17. Thinking Problems of the Present Collision Warning Work by Analyzing the Intersection Between Cosmos 2251 and Iridium 33

    NASA Astrophysics Data System (ADS)

    Wang, R. L.; Liu, W.; Yan, R. D.; Gong, J. C.

    2013-08-01

    After the Cosmos 2251 and Iridium 33 collision breakup event, institutions at home and abroad began collision warning analyses for the event. This paper compares the results from the different research units, discusses the problems of current collision warning work, and then gives suggestions for further study.

  18. Analyzing Multiple Informant Data on Child and Adolescent Behavior Problems: Predictive Validity and Comparison of Aggregation Procedures

    ERIC Educational Resources Information Center

    van Dulmen, Manfred H. M.; Egeland, Byron

    2011-01-01

    We compared the predictive validity of five aggregation methods for multiple informant data on child and adolescent behavior problems. In addition, we compared the predictive validity of these aggregation methods with single informant scores. Data were derived from the Minnesota Longitudinal Study of Parents and Children (N = 175). Maternal and…

  19. Temporality Matters: Advancing a Method for Analyzing Problem-Solving Processes in a Computer-Supported Collaborative Environment

    ERIC Educational Resources Information Center

    Kapur, Manu

    2011-01-01

    This paper argues for a need to develop methods for examining temporal patterns in computer-supported collaborative learning (CSCL) groups. It advances one such quantitative method--Lag-sequential Analysis (LsA)--and instantiates it in a study of problem-solving interactions of collaborative groups in an online, synchronous environment. LsA…
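
    The core of lag-sequential analysis can be illustrated in Python: estimate how often one coded interaction event follows another at lag 1. The codes and the sequence below are invented for illustration.

        from collections import Counter

        seq = ["Q", "A", "A", "Q", "A", "E", "Q", "A"]   # coded group interactions
        codes = sorted(set(seq))

        pairs = Counter(zip(seq, seq[1:]))               # lag-1 transition counts
        firsts = Counter(seq[:-1])

        # Conditional transition probabilities P(next = b | current = a).
        P = {a: {b: pairs[(a, b)] / firsts[a] for b in codes} for a in firsts}
        print(P["Q"]["A"])  # 1.0 here: every question (Q) is followed by an answer (A)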

  20. Conceptual and procedural knowledge community college students use when solving a complex science problem

    NASA Astrophysics Data System (ADS)

    Steen-Eibensteiner, Janice Lee

    2006-07-01

    A strong science knowledge base and problem solving skills have always been highly valued for employment in the science industry. Skills currently needed for employment include being able to problem solve (Overtoom, 2000). Academia also recognizes the need to effectively teach students to apply problem solving skills in clinical settings. This thesis investigates how students solve complex science problems in an academic setting in order to inform the development of problem solving skills for the workplace. Students' use of problem solving skills, in the form of learned concepts and procedural knowledge, was studied as students completed a problem that might come up in real life. The students were taking a community college sophomore biology course, Human Anatomy & Physiology II. The problem topic was negative feedback inhibition of the thyroid and parathyroid glands. The research questions answered were: (1) How well do community college students use a complex of conceptual knowledge when solving a complex science problem? (2) What conceptual knowledge are community college students using correctly, incorrectly, or not using when solving a complex science problem? (3) What problem solving procedural knowledge are community college students using successfully, unsuccessfully, or not using when solving a complex science problem? For the whole class, the high academic level participants performed at a mean of 72% correct on the chapter test questions, a low average to fair grade of C-. The middle and low academic level participants both failed the test questions (37% and 30% correct, respectively); thus 29% (9/31) of the students showed only a fair performance, while 71% (22/31) failed. In the subset sample of two students each from the high, middle, and low academic levels selected from the whole class, 35% (8/23) of the concepts were used effectively, 22% (5/23) marginally, and 43% (10/23) poorly. Only one concept was used incorrectly by 3/6 of the students and identified as

  1. Analyzing the Effects of a Mathematics Problem-Solving Program, Exemplars, on Mathematics Problem-Solving Scores with Deaf and Hard-of-Hearing Students

    ERIC Educational Resources Information Center

    Chilvers, Amanda Leigh

    2013-01-01

    Researchers have noted that mathematics achievement for deaf and hard-of-hearing (d/hh) students has been a concern for many years, including the ability to problem solve. This quasi-experimental study investigates the use of the Exemplars mathematics program with students in grades 2-8 in a school for the deaf that utilizes American Sign Language…

  2. Processing and Correcting MASTER Images to Analyze and Map Metamorphic Core Complexes in the Southern Basin and Range Province

    NASA Astrophysics Data System (ADS)

    Sanchez, S. O.

    2004-12-01

    Metamorphic core complexes (MCCs) have long been of great interest to geologists and geophysicists, and our goal is to facilitate integrated studies of these intriguing features. Our specific targets are the exposed Whipple Mountains in southeastern California and the spectrally similar Mohave Mountains in western Arizona, chosen for their close proximity to each other in the imagery. The ranges were imaged with the MODIS/ASTER airborne sensor, also known as MASTER, and NASA/JPL acquired the data for us. This sensor was chosen because it offers good spatial resolution (15 m) and 50 bands ranging from the visible to the thermal infrared. However, because it is flown on a light aircraft, its flight-line patterns and photogrammetric distortions make the data hard to georeference and mosaic with images from adjacent flight lines; the distortions become misalignments between images during mosaicking. This project involved two efforts: 1) developing a method for correcting and processing MASTER multispectral images; and 2) using those images to analyze and map MCCs in the southern Basin and Range Province. Standard image-processing techniques available within the ENVI software package were applied to geometrically correct, mosaic, and spectrally process the imagery in order to locate defining characteristics of MCCs that are mappable with it. These techniques include warping, histogram matching, mosaicking, classification, Principal Component Analysis, decorrelation stretching, Minimum Noise Fraction transformation, the Pixel Purity Index, and endmember analysis.
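
    One of the named steps, Principal Component Analysis of a multiband cube, can be sketched generically as follows (random stand-in data; this is not the ENVI workflow used in the study):

```python
# Generic PCA of a multiband image cube with random stand-in data;
# illustrative only, not the study's ENVI processing chain.
import numpy as np

bands, rows, cols = 50, 128, 128
cube = np.random.default_rng(1).random((bands, rows, cols))  # stand-in imagery

X = cube.reshape(bands, -1).T            # pixels x bands
X = X - X.mean(axis=0)                   # center each band
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]        # strongest components first
pcs = (X @ eigvecs[:, order]).T.reshape(bands, rows, cols)

share = eigvals[order][0] / eigvals.sum()
print(f"variance explained by PC1: {share:.1%}")
```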

  3. Performance of isotope ratio infrared spectroscopy (IRIS) for analyzing waters containing organic contaminants: Problems and solutions (Invited)

    NASA Astrophysics Data System (ADS)

    West, A. G.; Goldsmith, G. R.; Dawson, T. E.

    2010-12-01

    The development of isotope ratio infrared spectroscopy (IRIS) for simultaneous δ2H and δ18O analysis of liquid water samples shows much potential for affordable, simple and potentially portable isotopic analyses. IRIS has been shown to be comparable in precision and accuracy to isotope ratio mass spectrometry (IRMS) when analyzing pure water samples. However, recent studies have shown that organic contaminants in analyzed water samples may interfere with the spectroscopy, leading to errors of considerable magnitude in the reported stable isotope data. Many environmental, biological and forensic studies require analyses of water containing organic contaminants in some form, yet our current methods of removing organic contaminants prior to analysis appear inadequate for IRIS. Treated plant water extracts analyzed by IRIS showed deviations as large as 35‰ (δ2H) and 11.8‰ (δ18O) from the IRMS values, indicating that trace amounts of contaminants were sufficient to disrupt IRIS analyses. However, not all organic contaminants negatively influence IRIS; for such samples, IRIS presents a labour-saving method relative to IRMS. Prior to widespread use in the environmental, biological and forensic sciences, a means of obtaining reliable data from IRIS needs to be demonstrated. One approach is to use instrument-based software to flag potentially problematic spectra and output a corrected isotope value based on analysis of the spectra. We evaluate this approach on two IRIS systems and discuss the way forward for ensuring accurate stable isotope data using IRIS.
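
    A possible shape for such screening logic is sketched below. Everything here is an assumption made for illustration: the per-sample values, the threshold, and the simple linear correction model; real instrument software is proprietary and more sophisticated:

```python
# Assumption-laden sketch of spectral screening (not any vendor's software):
# flag samples whose spectral-interference metric exceeds a threshold, then
# estimate a per-unit correction from flagged samples with IRMS references.
import numpy as np

# hypothetical per-sample values
iris_d2h = np.array([-60.1, -58.7, -24.9, -40.8, -31.9])   # IRIS delta-2H
metric = np.array([0.1, 0.2, 3.5, 2.0, 2.9])               # interference metric
irms_d2h = np.array([-60.2, -58.8, -59.5, -61.0, -60.7])   # IRMS reference

flagged = metric > 1.0                    # threshold is an assumption
error = iris_d2h - irms_d2h               # contamination-induced error

# one-parameter fit: error ~ k * metric, estimated on flagged samples only
k, *_ = np.linalg.lstsq(metric[flagged, None], error[flagged], rcond=None)
corrected = iris_d2h - k[0] * metric
print("flagged samples:", np.where(flagged)[0])
print("corrected d2H:  ", np.round(corrected, 1))
```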

  4. Medicines counterfeiting is a complex problem: a review of key challenges across the supply chain.

    PubMed

    Tremblay, Michael

    2013-02-01

    The paper begins by asking why there is a market for counterfeit medicines, which in effect creates the problem of counterfeiting itself. Contributing factors include supply chain complexity and the lack of whole-systems thinking. These two underpin the author's view that counterfeiting is a complex (i.e. wicked) problem, and that corporate, public policy and regulatory actions need to be mindful of how their actions may be causal. The paper offers a problem-based review of key components of this complexity, viz., the knowledge end-users/consumers have of medicines; whether restrictive information policies may hamper information provision to patients; the internet's direct access to consumers; internet-enabled distribution of unsafe and counterfeit medicines; whether the internet is a parallel and competitive supply chain to legitimate routes; organised crime as an emerging medicines manufacturer and supplier; and whether substandard medicines are really the bigger problem. Proposed solutions must respect the perceived complexity of these supply chain challenges. The paper identifies the need to avoid technologically driven solutions, calling for 'technological agnosticism'. Both regulation and public policy need to reflect the dynamic nature of the problem and avoid creating perverse incentives; it may be, for instance, that medicines pricing and reimbursement policies, which affect consumer/patient access, act as market signals to counterfeiters, since they create a cash market in cheaper drugs. PMID:23656447

  5. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving. PMID:22815065

  6. On the Critical Behaviour, Crossover Point and Complexity of the Exact Cover Problem

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Smelyanskiy, Vadim N.; Shumow, Daniel; Koga, Dennis (Technical Monitor)

    2003-01-01

    Research into quantum algorithms for NP-complete problems has rekindled interest in the detailed study of a broad class of combinatorial problems. A recent paper applied the quantum adiabatic evolution algorithm to the Exact Cover problem for 3-sets (EC3) and provided empirical evidence that the algorithm was polynomial. In this paper we provide a detailed study of the characteristics of the Exact Cover problem. We present the annealing approximation applied to EC3, which gives an over-estimate of the phase transition point, and we also identify the phase transition point empirically. We then study the complexity of two classical algorithms on this problem: Davis-Putnam and Simulated Annealing. For these algorithms, EC3 is significantly easier than 3-SAT.
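
    For concreteness, here is a tiny brute-force illustration of the clause ("one-in-three") form of Exact Cover used in the quantum-adiabatic literature: each clause names three bits and is satisfied when exactly one of them equals 1. The instance is invented, and this exhaustive search is illustrative only; the paper's experiments use Davis-Putnam and Simulated Annealing:

```python
# Brute-force check of a tiny one-in-three Exact Cover instance (invented).
from itertools import product

n_bits = 6
clauses = [(0, 1, 2), (2, 3, 4), (1, 4, 5), (0, 3, 5)]

def satisfies(assignment, clauses):
    """True iff every clause has exactly one of its three bits set."""
    return all(assignment[i] + assignment[j] + assignment[k] == 1
               for i, j, k in clauses)

solutions = [a for a in product((0, 1), repeat=n_bits) if satisfies(a, clauses)]
print(f"{len(solutions)} satisfying assignment(s):", solutions)
```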

  7. To Live With Complexity: A Problem for Students--And for the Rest of Us.

    ERIC Educational Resources Information Center

    Ford, Franklin L.

    1968-01-01

    In articles on student unrest, there is a great tendency to oversimplify the issues and to assume that the components and stakes are the same from Minnesota to Czechoslovakia. To understand this complex phenomenon, the following questions should be answered: How many different problems, of what orders of magnitude and intensity, need to be…

  8. Computer-Based Assessment of Complex Problem Solving: Concept, Implementation, and Application

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Holt, Daniel V.; Goldhammer, Frank; Funke, Joachim

    2013-01-01

    Complex Problem Solving (CPS) skills are essential to successfully deal with environments that change dynamically and involve a large number of interconnected and partially unknown causal influences. The increasing importance of such skills in the 21st century requires appropriate assessment and intervention methods, which in turn rely on adequate…

  9. Differential Relations between Facets of Complex Problem Solving and Students' Immigration Background

    ERIC Educational Resources Information Center

    Sonnleitner, Philipp; Brunner, Martin; Keller, Ulrich; Martin, Romain

    2014-01-01

    Whereas the assessment of complex problem solving (CPS) has received increasing attention in the context of international large-scale assessments, its fairness in regard to students' cultural background has gone largely unexplored. On the basis of a student sample of 9th-graders (N = 299), including a representative number of immigrant students (N…

  10. Small-Group Problem-Based Learning as a Complex Adaptive System

    ERIC Educational Resources Information Center

    Mennin, Stewart

    2007-01-01

    Small-group problem-based learning (PBL) is widely embraced as a method of study in health professions schools and at many different levels of education. Complexity science provides a different lens with which to view and understand the application of this method. It presents new concepts and vocabulary that may be unfamiliar to practitioners of…

  11. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    ERIC Educational Resources Information Center

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  12. Ecosystem services and cooperative fisheries research to address a complex fishery problem

    EPA Science Inventory

    The St. Louis River represents a complex fishery management problem. Current fishery management goals have to be developed taking into account bi-state commercial, subsistence and recreational fisheries which are valued for different characteristics by a wide range of anglers, as...

  13. Calculating Probabilistic Distance to Solution in a Complex Problem Solving Domain

    ERIC Educational Resources Information Center

    Sudol, Leigh Ann; Rivers, Kelly; Harris, Thomas K.

    2012-01-01

    In complex problem solving domains, correct solutions are often comprised of a combination of individual components. Students usually go through several attempts, each attempt reflecting an individual solution state that can be observed during practice. Classic metrics to measure student performance over time rely on counting the number of…

  14. The Development of Complex Problem Solving in Adolescence: A Latent Growth Curve Analysis

    ERIC Educational Resources Information Center

    Frischkorn, Gidon T.; Greiff, Samuel; Wüstenberg, Sascha

    2014-01-01

    Complex problem solving (CPS) as a cross-curricular competence has recently attracted more attention in educational psychology as indicated by its implementation in international educational large-scale assessments such as the Programme for International Student Assessment. However, research on the development of CPS is scarce, and the few…

  15. Learning about Complex Multi-Stakeholder Issues: Assessing the Visual Problem Appraisal

    ERIC Educational Resources Information Center

    Witteveen, Loes; Put, Marcel; Leeuwis, Cees

    2010-01-01

    This paper presents an evaluation of the visual problem appraisal (VPA) learning environment in higher education. The VPA has been designed for the training of competences that are required in complex stakeholder settings in relation to sustainability issues. The design of VPA incorporates a diversity of instruction strategies to accommodate the…

  16. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  17. Mixing Bandt-Pompe and Lempel-Ziv approaches: another way to analyze the complexity of continuous-state sequences

    NASA Astrophysics Data System (ADS)

    Zozor, S.; Mateos, D.; Lamberti, P. W.

    2014-05-01

    In this paper, we propose to combine the approach underlying Bandt-Pompe permutation entropy with Lempel-Ziv complexity, to design what we call the Lempel-Ziv permutation complexity. The principle consists of two steps: (i) transformation of a continuous-state series, which is intrinsically multivariate or arises from embedding, into a sequence of permutation vectors, whose components are the positions of the components of the initial vector when re-arranged; (ii) computation of the Lempel-Ziv complexity of this series of 'symbols', viewed as drawn from a discrete finite-size alphabet. On the one hand, the permutation entropy of Bandt-Pompe studies the entropy of such a sequence, i.e., the entropy of patterns in a sequence (e.g., local increases or decreases). On the other hand, the Lempel-Ziv complexity of a discrete-state sequence studies the temporal organization of the symbols (i.e., the rate of compressibility of the sequence). The Lempel-Ziv permutation complexity thus aims to take advantage of both methods. The potential of this combined approach, a permutation procedure followed by a complexity analysis, is evaluated on both simulated and real data; in both cases, we compare the individual approaches and the combined approach.
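
    The two steps translate directly into code. The sketch below is our illustration, not the authors' implementation: it maps a series to ordinal "permutation" symbols and then counts phrases in the LZ76 parsing of the symbol sequence:

```python
# Ordinal symbolization followed by LZ76 phrase counting (illustrative).
import numpy as np

def ordinal_symbols(x, d=3):
    """Embed a 1D series and encode each window by its sorting permutation."""
    x = np.asarray(x)
    return [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]

def lz76_complexity(seq):
    """Number of phrases in the Lempel-Ziv (LZ76) parsing of a sequence."""
    seq, n = list(seq), len(seq)
    i, phrases = 0, 0
    while i < n:
        length = 1
        while i + length <= n:
            candidate = seq[i:i + length]
            window = seq[:i + length - 1]   # prior text (overlap allowed)
            if not any(window[j:j + length] == candidate
                       for j in range(len(window) - length + 1)):
                break
            length += 1
        phrases += 1
        i += length
    return phrases

rng = np.random.default_rng(2)
x = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.normal(size=200)
print("Lempel-Ziv permutation complexity:", lz76_complexity(ordinal_symbols(x)))
```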

  18. An HPLC chromatographic framework to analyze the β-cyclodextrin/solute complexation mechanism using a carbon nanotube stationary phase.

    PubMed

    Aljhni, Rania; Andre, Claire; Lethier, Lydie; Guillaume, Yves Claude

    2015-11-01

    A carbon nanotube (CNT) stationary phase was used for the first time to study the β-cyclodextrin (β-CD) solute complexation mechanism using high performance liquid chromatography (HPLC). For this, the β-CD was added at various concentrations in the mobile phase and the effect of column temperature was studied on both the retention of a series of aniline and benzoic acid derivatives with the CNT stationary phase and their complexation mechanism with β-CD. A decrease in the solute retention factor was observed for all the studied molecules without change in the retention order. The apparent formation constant KF of the inclusion complex β-CD/solute was determined at various temperatures. Our results showed that the interaction of β-CD with both the mobile phase and the stationary phase interfered in the complex formation. The enthalpy and entropy of the complex formation (ΔHF and ΔSF) between the solute molecule and CD were determined using a thermodynamic approach. Negative enthalpies and entropies indicated that the inclusion process of the studied molecule in the CD cavity was enthalpically driven and that the hydrogen bonds between carboxylic or aniline groups and the functional groups on the β-CD rim play an important role in the complex formation. PMID:26452814
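
    The thermodynamic step is a standard van't Hoff analysis: ln KF = -ΔHF/(R T) + ΔSF/R, so a linear fit of ln KF against 1/T yields ΔHF from the slope and ΔSF from the intercept. A sketch with invented numbers, chosen only to mimic the reported negative enthalpy and entropy:

```python
# Illustrative van't Hoff analysis; KF values are invented for the sketch.
import numpy as np

R = 8.314                                         # J / (mol K)
T = np.array([288.15, 298.15, 308.15, 318.15])    # column temperatures, K
KF = np.array([420.0, 310.0, 235.0, 180.0])       # formation constants (invented)

slope, intercept = np.polyfit(1.0 / T, np.log(KF), 1)
dH = -R * slope                                   # enthalpy of formation, J/mol
dS = R * intercept                                # entropy of formation, J/(mol K)
print(f"dH = {dH / 1000:.1f} kJ/mol, dS = {dS:.1f} J/(mol K)")
```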

  19. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem.

    PubMed

    Williams, Patricia Ah; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  1. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.

    PubMed

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-03-01

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that it requires fewer inferential steps and selects more compact control programs than an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load. PMID:25652466
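
    A toy rendering of the core idea on a four-rooms grid: score each candidate subgoal (doorway) by the combined cost of the two sub-problems it induces. BFS path length stands in for the paper's descriptive complexity, and the layout, start/goal, and scoring are our assumptions:

```python
# Score candidate doorway subgoals on a four-rooms grid (toy illustration).
from collections import deque

SIZE = 11
WALLS = set()
for k in range(SIZE):
    WALLS.add((5, k))
    WALLS.add((k, 5))
doors = [(5, 2), (5, 8), (2, 5), (8, 5)]
for door in doors:
    WALLS.discard(door)

def bfs(start, goal):
    """Shortest path length on the grid (None if unreachable)."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        (r, c), dist = frontier.popleft()
        if (r, c) == goal:
            return dist
        for nxt in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if (0 <= nxt[0] < SIZE and 0 <= nxt[1] < SIZE
                    and nxt not in WALLS and nxt not in seen):
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return None

start, goal = (0, 0), (10, 0)
scores = {d: bfs(start, d) + bfs(d, goal) for d in doors}
print("subgoal scores:", scores, "-> chosen:", min(scores, key=scores.get))
```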

  2. Solving the three-body Coulomb breakup problem using exterior complex scaling

    SciTech Connect

    McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.

    2004-05-17

    Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum has made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a "reduction to computation" of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.

  3. The Role of Prior Knowledge and Problem Contexts in Students' Explanations of Complex Systems

    NASA Astrophysics Data System (ADS)

    Barth-Cohen, Lauren April

    The purpose of this dissertation is to study students' competencies in generating scientific explanations within the domain of complex systems, an interdisciplinary area in which students tend to have difficulties. While considering students' developing explanations of how complex systems work, I investigate the role of prior knowledge and how students' explanations systematically vary across seven problem contexts (e.g. the movement of sand dunes, the formation of traffic jams, and diffusion in water). Using the Knowledge in Pieces epistemological perspective, I build a mini-theory of how students construct explanations about the behavior of complex systems. The mini-theory shows how advanced, "decentralized" explanations evolve from a variety of prior knowledge resources, which depend on specific features of the problem. A general emphasis on students' competences is exhibited through three strands of analysis: (1) a focus on moment-to-moment shifts in individuals' explanations in the direction of a normative understanding; (2) a comparison of explanations across the seven problem contexts in order to highlight variation in kinds of prior knowledge that are used; and (3) a concentration on the diversity within explanations that can be all considered examples of emergent thinking. First, I document cases of students' shifting explanations as they become less prototypically centralized (a more naive causality) and then become more prototypically decentralized over short time periods. The analysis illustrates the lines of continuity between these two ways of understanding and how change can occur during the process of students generating a progression of increasingly sophisticated transitional explanations. Second, I find a variety of students' understandings across the problem contexts, expressing both variation in their prior knowledge and how the nature of a specific domain influences reasoning. Certain problem contexts are easier or harder for students

  4. The Markov-Dubins problem with free terminal direction in a nonpositively curved cube complex

    NASA Astrophysics Data System (ADS)

    La Corte, Jason Thomson

    State complexes are nonpositively curved cube complexes that model the state spaces of reconfigurable systems. The problem of determining a strategy for reconfiguring the system from a given initial state to a given goal state is equivalent to that of finding a path between two points in the state complex. The additional requirement that allowable paths must have a prescribed initial direction and minimal turning radius determines a Markov-Dubins problem with free terminal direction (MDPFTD). Given a nonpositively curved, locally finite cube complex X, we consider the set of unit-speed paths which satisfy a certain smoothness condition in addition to the boundary conditions and curvature constraint that define a MDPFTD. We show that this set either contains a path of minimal length, or is empty. We then focus on the case that X is a surface with a nonpositively curved cubical structure. We show that any solution to a MDPFTD in X must consist of finitely many geodesic segments and arcs of constant curvature, and we give an algorithm for determining those solutions to the MDPFTD in X which are CL paths, that is, made up of an arc of constant curvature followed by a geodesic segment. Finally, under the assumption that the 1-skeleton of X is d-regular, we give sufficient conditions for a topological ray in X of constant curvature to be a rose curve or a proper ray.

  5. Analysis and formulation of a class of complex dynamic optimization problems

    NASA Astrophysics Data System (ADS)

    Kameswaran, Shivakumar

    The Direct Transcription approach, also known as the direct simultaneous approach, is a widely used solution strategy for the solution of dynamic optimization problems involving differential-algebraic equations (DAEs). Direct transcription refers to the procedure of approximating the infinite dimensional problem by a finite dimensional one, which is then solved using a nonlinear programming (NLP) solver tailored to large-scale problems. Systems governed by partial differential equations (PDEs) can also be handled by spatially discretizing the PDEs to convert them to a system of DAEs. The objective of this thesis is firstly to ensure that direct transcription using Radau collocation is provably correct, and secondly to widen the applicability of the direct simultaneous approach to a larger class of dynamic optimization and optimal control problems (OCPs). This thesis aims at addressing these issues using rigorous theoretical tools and/or characteristic examples, and at the same time use the results for solving large-scale industrial applications to realize the benefits. The first part of this work deals with the analysis of convergence rates for direct transcription of unconstrained and final-time equality constrained optimal control problems. The problems are discretized using collocation at Radau points. Convergence is analyzed from an NLP/matrix-algebra perspective, which enables the prediction of the conditioning of the direct transcription NLP as the mesh size becomes finer. Several convergence results are presented along with tests on numerous example problems. These convergence results lead to an adjoint estimation procedure given the Lagrange multipliers for the large-scale NLP. The work also reveals the role of process control concepts such as controllability on the convergence analysis, and provides a very important link between control and optimization inside the framework of dynamic optimization. As an effort to extend the applicability of the direct
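
    A minimal direct-transcription sketch (ours, not the thesis code): the lowest-order Radau IIA collocation rule coincides with implicit Euler, so a one-collocation-point transcription of a toy optimal control problem can be written and handed to an off-the-shelf NLP solver:

```python
# One-point Radau (implicit Euler) transcription of:
#   minimize  int_0^1 u^2 dt   s.t.  x' = u, x(0) = 0, x(1) = 1.
import numpy as np
from scipy.optimize import minimize

N = 20
h = 1.0 / N

def objective(z):
    u = z[N + 1:]
    return h * np.sum(u**2)

def defects(z):
    x, u = z[:N + 1], z[N + 1:]
    # collocation defects x_{k+1} - x_k - h*u_{k+1} = 0, plus boundary values
    return np.concatenate([x[1:] - x[:-1] - h * u[1:], [x[0], x[-1] - 1.0]])

res = minimize(objective, np.zeros(2 * N + 2),
               constraints={"type": "eq", "fun": defects})
print(f"optimal cost = {res.fun:.4f} (analytic minimum 1.0, attained by u == 1)")
```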

  6. Beyond roots alone: Novel methodologies for analyzing complex soil and minirhizotron imagery using image processing and GIS tools

    NASA Astrophysics Data System (ADS)

    Silva, Justina A.

    Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches together numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework to expand a more comprehensive understanding of belowground dynamics.

  7. You Need to Know: There Is a Causal Relationship between Structural Knowledge and Control Performance in Complex Problem Solving Tasks

    ERIC Educational Resources Information Center

    Goode, Natassia; Beckmann, Jens F.

    2010-01-01

    This study investigates the relationships between structural knowledge, control performance and fluid intelligence in a complex problem solving (CPS) task. 75 participants received either complete, partial or no information regarding the underlying structure of a complex problem solving task, and controlled the task to reach specific goals.…

  8. How to solve complex problems in foundry plants - future of casting simulation -

    NASA Astrophysics Data System (ADS)

    Ohnaka, I.

    2015-06-01

    Although the computer simulation of casting has progressed dramatically over the last decades, many challenges and problems remain. This paper discusses how to solve complex engineering problems in foundry plants and what should be done in the future, in particular for casting simulation. First, problem-solving procedures, including the application of computer simulation, are demonstrated, and various difficulties are pointed out, exemplified mainly by porosity defects in sand castings of spheroidal graphite cast irons. Next, looking back at conventional scientific and engineering research on casting phenomena, challenges and problems are discussed from a problem-solving viewpoint, followed by discussion of the issues to be addressed: how to integrate the huge amount of knowledge dispersed across disciplines, the differentiation of science-oriented and engineering-oriented models, professional ethics, how to handle fluctuating materials and initial and boundary conditions, error accumulation, simulation codes treated as black boxes, etc. Finally, some suggestions are made on how to address these issues, such as promoting research on simulation based on science-oriented models and publishing reliable data on casting phenomena in complicated-shaped castings, including reconsideration of the evaluation system.

  9. Numerical calculation of thermo-mechanical problems at large strains based on complex step derivative approximation of tangent stiffness matrices

    NASA Astrophysics Data System (ADS)

    Balzani, Daniel; Gandhi, Ashutosh; Tanaka, Masato; Schröder, Jörg

    2015-05-01

    In this paper a robust approximation scheme for the numerical calculation of tangent stiffness matrices is presented in the context of nonlinear thermo-mechanical finite element problems, and its performance is analyzed. The scheme extends the approach proposed in Kim et al. (Comput Methods Appl Mech Eng 200:403-413, 2011) and Tanaka et al. (Comput Methods Appl Mech Eng 269:454-470, 2014) and is based on applying the complex-step derivative approximation to the linearizations of the weak forms of the balance of linear momentum and the balance of energy. By incorporating consistent perturbations along the imaginary axis to the displacement as well as thermal degrees of freedom, we demonstrate that numerical tangent stiffness matrices can be obtained with accuracy up to computer precision, leading to quadratically converging schemes. The main advantage of this approach is that, contrary to the classical forward difference scheme, no round-off errors due to floating-point arithmetic arise within the calculation of the tangent stiffness. This permits arbitrarily small perturbation values and therefore leads to robust schemes even when small values are chosen. An efficient algorithmic treatment is presented which enables a straightforward implementation of the method in any standard finite element program. The performance of the proposed approach is analyzed by means of thermo-elastic and thermo-elastoplastic boundary value problems at finite strains.
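
    The underlying mechanism is easy to demonstrate in isolation (a stand-alone illustration, not the paper's finite element code): because the complex-step formula f'(x) ≈ Im f(x + ih)/h involves no subtraction, the perturbation h can be taken near machine precision without round-off error:

```python
# Complex-step derivative vs. forward difference on a smooth test function.
import numpy as np

def f(x):
    return np.exp(x) * np.sin(x)

def fprime(x):
    return np.exp(x) * (np.sin(x) + np.cos(x))   # exact derivative

x0 = 1.5
for h in (1e-8, 1e-100, 1e-200):
    cs = np.imag(f(x0 + 1j * h)) / h             # complex step
    print(f"h = {h:.0e}: complex-step error = {abs(cs - fprime(x0)):.2e}")

fd = (f(x0 + 1e-8) - f(x0)) / 1e-8               # forward difference
print(f"forward-difference error (h = 1e-8) = {abs(fd - fprime(x0)):.2e}")
```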

  10. Low complexity interference alignment algorithms for desired signal power maximization problem of MIMO channels

    NASA Astrophysics Data System (ADS)

    Sun, Cong; Yang, Yunchuan; Yuan, Yaxiang

    2012-12-01

    In this article, we investigate the interference alignment (IA) solution for a K-user MIMO interference channel. The users' precoders and decoders are designed through a desired signal power maximization model with IA conditions as constraints, which forms a complex matrix optimization problem. We propose two low-complexity algorithms, both of which apply the Courant penalty function technique to combine the leakage interference and the desired signal power into a new objective function. The first proposed algorithm is the modified alternating minimization algorithm (MAMA), where each subproblem has a closed-form solution based on an eigenvalue decomposition. To further reduce algorithm complexity, we propose a hybrid algorithm consisting of two parts. In the first part, the algorithm iterates with Householder transformations to preserve the orthogonality of precoders and decoders. In each iteration, the matrix optimization problem is considered in a sequence of 2D subspaces, which leads to one-dimensional optimization subproblems. From any initial point, this algorithm obtains precoders and decoders with low leakage interference in a short time. In the second part, to exploit the advantage of MAMA, it continues to iterate to perfectly align the interference from the output point of the first part. Analysis shows that, per iteration, both proposed algorithms generally have lower computational complexity than the existing maximum signal power (MSP) algorithm, and the hybrid algorithm enjoys lower complexity than MAMA. Simulations reveal that both proposed algorithms achieve performance similar to the MSP algorithm in less execution time, and outperform the existing alternating minimization algorithm in terms of sum rate. Moreover, regarding convergence rate, simulation results show that MAMA reaches a given sum-rate value fastest, while the hybrid algorithm converges fastest in eliminating interference.
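
    For orientation, the baseline alternating leakage-minimization scheme that such IA work builds on (per-iteration eigendecompositions over a reciprocal network) can be sketched as follows; the proposed MAMA and hybrid algorithms are not reproduced here, and the i.i.d. Gaussian channel model is an assumption:

```python
# Alternating interference-leakage minimization for a K-user MIMO channel.
import numpy as np

rng = np.random.default_rng(0)
K, M, d = 3, 4, 2                  # users, antennas per node, streams per user
H = {(k, l): rng.normal(size=(M, M)) + 1j * rng.normal(size=(M, M))
     for k in range(K) for l in range(K)}   # channel from Tx l to Rx k

def min_eigvecs(Q, d):
    """Orthonormal eigenvectors of the d smallest eigenvalues of Hermitian Q."""
    _, V = np.linalg.eigh(Q)
    return V[:, :d]

V = [np.linalg.qr(rng.normal(size=(M, d)) + 1j * rng.normal(size=(M, d)))[0]
     for _ in range(K)]            # initial precoders
for _ in range(200):
    # decoders minimize interference covariance at each receiver
    U = [min_eigvecs(sum(H[k, l] @ V[l] @ V[l].conj().T @ H[k, l].conj().T
                         for l in range(K) if l != k), d) for k in range(K)]
    # precoders updated the same way on the reciprocal network
    V = [min_eigvecs(sum(H[l, k].conj().T @ U[l] @ U[l].conj().T @ H[l, k]
                         for l in range(K) if l != k), d) for k in range(K)]

leakage = sum(np.linalg.norm(U[k].conj().T @ H[k, l] @ V[l])**2
              for k in range(K) for l in range(K) if l != k)
print("residual leakage interference:", leakage)
```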

  11. Complexity of analysis and verification problems for communicating automata and discrete dynamical systems.

    SciTech Connect

    Hunt, H. B.; Rosenkrantz, D. J.; Barrett, C. L.; Marathe, M. V.; Ravi, S. S.

    2001-01-01

    We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems Π that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (1) reductions of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (2) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (3) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for

  12. COMPLEXITY OF ANALYSIS & VERIFICATION PROBLEMS FOR COMMUNICATING AUTOMATA & DISCRETE DYNAMICAL SYSTEMS

    SciTech Connect

    H. B. HUNT; D. J. ROSENKRANTZ; ET AL

    2001-03-01

    We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems Π that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (i) reductions of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (ii) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (iii) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly-specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for

  13. Thresholds of Knowledge Development in Complex Problem Solving: A Multiple-Case Study of Advanced Learners' Cognitive Processes

    ERIC Educational Resources Information Center

    Bogard, Treavor; Liu, Min; Chiang, Yueh-hui Vanessa

    2013-01-01

    This multiple-case study examined how advanced learners solved a complex problem, focusing on how their frequency and application of cognitive processes contributed to differences in performance outcomes, and developing a mental model of a problem. Fifteen graduate students with backgrounds related to the problem context participated in the study.…

  14. Analyzing the tradeoff between electrical complexity and accuracy in patient-specific computational models of deep brain stimulation

    NASA Astrophysics Data System (ADS)

    Howell, Bryan; McIntyre, Cameron C.

    2016-06-01

    Objective. Deep brain stimulation (DBS) is an adjunctive therapy that is effective in treating movement disorders and shows promise for treating psychiatric disorders. Computational models of DBS have begun to be utilized as tools to optimize the therapy. Despite advancements in the anatomical accuracy of these models, there is still uncertainty as to what level of electrical complexity is adequate for modeling the electric field in the brain and the subsequent neural response to the stimulation. Approach. We used magnetic resonance images to create an image-based computational model of subthalamic DBS. The complexity of the volume conductor model was increased by incrementally including heterogeneity, anisotropy, and dielectric dispersion in the electrical properties of the brain. We quantified changes in the load of the electrode, the electric potential distribution, and stimulation thresholds of descending corticofugal (DCF) axon models. Main results. Incorporation of heterogeneity altered the electric potentials and subsequent stimulation thresholds, but to a lesser degree than incorporation of anisotropy. Additionally, the results were sensitive to the choice of method for defining anisotropy, with stimulation thresholds of DCF axons changing by as much as 190%. Typical approaches for defining anisotropy underestimate the expected load of the stimulation electrode, which led to underestimation of the extent of stimulation. More accurate predictions of the electrode load were achieved with alternative approaches for defining anisotropy. The effects of dielectric dispersion were small compared to the effects of heterogeneity and anisotropy. Significance. The results of this study help delineate the level of detail that is required to accurately model electric fields generated by DBS electrodes.

  15. Using Brain Imaging to Track Problem Solving in a Complex State Space

    PubMed Central

    Anderson, John R.; Fincham, Jon M.; Schneider, Darryl W.; Yang, Jian

    2011-01-01

    This paper describes how behavioral and imaging data can be combined with a Hidden Markov Model (HMM) to track participants’ trajectories through a complex state space. Participants completed a problem-solving variant of a memory game that involved 625 distinct states, 24 operators, and an astronomical number of paths through the state space. Three sources of information were used for classification purposes. First, an Imperfect Memory Model was used to estimate transition probabilities for the HMM. Second, behavioral data provided information about the timing of different events. Third, multivoxel pattern analysis of the imaging data was used to identify features of the operators. By combining the three sources of information, an HMM algorithm was able to efficiently identify the most probable path that participants took through the state space, achieving over 80% accuracy. These results support the approach as a general methodology for tracking mental states that occur during individual problem-solving episodes. PMID:22209783
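
    The path-recovery step is a standard Viterbi decode. A minimal sketch follows, with a tiny invented model standing in for the paper's 625-state HMM and its imaging-derived likelihoods:

```python
# Viterbi decoding of the most probable state path (toy invented model).
import numpy as np

rng = np.random.default_rng(3)
n_states, T = 4, 6
trans = rng.dirichlet(np.ones(n_states), size=n_states)    # trans[i, j] = P(j | i)
prior = np.full(n_states, 1.0 / n_states)
loglik = np.log(rng.dirichlet(np.ones(n_states), size=T))  # invented observations

delta = np.log(prior) + loglik[0]
back = np.zeros((T, n_states), dtype=int)
for t in range(1, T):
    scores = delta[:, None] + np.log(trans)    # previous state x next state
    back[t] = np.argmax(scores, axis=0)
    delta = scores[back[t], np.arange(n_states)] + loglik[t]

path = [int(np.argmax(delta))]
for t in range(T - 1, 0, -1):
    path.append(int(back[t, path[-1]]))
print("most probable state path:", path[::-1])
```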

  16. Direct, inverse, and combined problems in complex engineered system modeling by artificial neural networks

    NASA Astrophysics Data System (ADS)

    Terekhoff, Serge A.

    1997-04-01

    This paper summarizes theoretical findings and applications of artificial neural networks to modeling of complex engineered system response in abnormal environments. The thermal impact of fire on an industrial container for waste and fissile materials was investigated using model and experimental data. Solutions for the direct problem show that the generalization properties of the neural-network-based model are significantly better than those of standard interpolation methods. The minimal amount of data required for good prediction of system response is estimated in computer experiments with an MLP network. It is shown that Kohonen's self-organizing map with counterpropagation may also estimate the local accuracy of the regularized solution for inverse and combined problems. Feature-space regions where the inverse model is partially correct can be automatically extracted using adaptive clustering. Practical findings include time-strategy recommendations for fire-safety services when industrial or transport accidents occur.

  17. A boundary collocation meshfree method for the treatment of Poisson problems with complex morphologies

    NASA Astrophysics Data System (ADS)

    Soghrati, Soheil; Mai, Weijie; Liang, Bowen; Buchheit, Rudolph G.

    2015-01-01

    A new meshfree method based on a discrete transformation of Green's basis functions is introduced to simulate Poisson problems with complex morphologies. The proposed Green's Discrete Transformation Method (GDTM) uses source points that are located along a virtual boundary outside the problem domain to construct the basis functions needed to approximate the field. The optimal number of Green's functions source points and their relative distances with respect to the problem boundaries are evaluated to obtain the best approximation of the partition of unity condition. A discrete transformation technique together with the boundary point collocation method is employed to evaluate the unknown coefficients of the solution series via satisfying the problem boundary conditions. A comprehensive convergence study is presented to investigate the accuracy and convergence rate of the GDTM. We will also demonstrate the application of this meshfree method for simulating the conductive heat transfer in a heterogeneous materials system and the dissolved aluminum ions concentration in the electrolyte solution formed near a passive corrosion pit.
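
    The source-points-outside-the-domain idea is closely related to the classical method of fundamental solutions, which the following sketch implements for a 2D Laplace problem on the unit disk (our illustration; GDTM's discrete Green's transformation and optimal source placement are not reproduced here):

```python
# Method of fundamental solutions: sources on a virtual outer boundary,
# coefficients fitted by boundary collocation.
import numpy as np

n_src, n_col = 40, 80
th_s = np.linspace(0, 2 * np.pi, n_src, endpoint=False)
th_c = np.linspace(0, 2 * np.pi, n_col, endpoint=False)
src = 1.5 * np.stack([np.cos(th_s), np.sin(th_s)], axis=1)  # virtual boundary
col = np.stack([np.cos(th_c), np.sin(th_c)], axis=1)        # true boundary

def G(x, y):
    """Free-space Green's function of the 2D Laplacian."""
    return -np.log(np.linalg.norm(x - y, axis=-1)) / (2 * np.pi)

A = G(col[:, None, :], src[None, :, :])      # collocation matrix
g = col[:, 0] * col[:, 1]                    # boundary data u = x*y (harmonic)
coef, *_ = np.linalg.lstsq(A, g, rcond=None)

p = np.array([0.3, 0.4])
print("MFS value:", float(G(p[None, :], src) @ coef), " exact:", p[0] * p[1])
```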

  18. The anatomical problem posed by brain complexity and size: a potential solution

    PubMed Central

    DeFelipe, Javier

    2015-01-01

    Over the years the field of neuroanatomy has evolved considerably but unraveling the extraordinary structural and functional complexity of the brain seems to be an unattainable goal, partly due to the fact that it is only possible to obtain an imprecise connection matrix of the brain. The reasons why reaching such a goal appears almost impossible to date is discussed here, together with suggestions of how we could overcome this anatomical problem by establishing new methodologies to study the brain and by promoting interdisciplinary collaboration. Generating a realistic computational model seems to be the solution rather than attempting to fully reconstruct the whole brain or a particular brain region. PMID:26347617

  19. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    SciTech Connect

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-12-10

    One of the most important goals in the operations of many corporations today is to maximize profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance is, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities by means of a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be they PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.
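
    At its core, such a framework rests on stochastic simulation of maintenance policies. A toy Monte Carlo comparison of preventive replacement intervals is sketched below; the lifetimes, costs, and age-replacement policy are all assumptions made for illustration, and the paper's genetic-algorithm and fuzzy decision layers are omitted:

```python
# Monte Carlo cost comparison of preventive replacement intervals (toy model).
import numpy as np

rng = np.random.default_rng(4)
shape, scale = 2.5, 100.0        # Weibull component life (hypothetical), hours
c_pm, c_cm = 1.0, 10.0           # preventive vs corrective cost units
horizon = 5_000.0                # simulated operating hours

def mean_cost(interval, runs=500):
    totals = []
    for _ in range(runs):
        t, cost = 0.0, 0.0
        while t < horizon:
            life = scale * rng.weibull(shape)
            if life < interval:          # failure first: corrective maintenance
                t, cost = t + life, cost + c_cm
            else:                        # survives to the scheduled PM
                t, cost = t + interval, cost + c_pm
        totals.append(cost)
    return np.mean(totals)

for interval in (25, 50, 75, 100, 150):
    print(f"PM every {interval:3d} h -> mean cost {mean_cost(interval):8.1f}")
```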

  1. SVD-GFD scheme to simulate complex moving body problems in 3D space

    NASA Astrophysics Data System (ADS)

    Wang, X. Y.; Yu, P.; Yeo, K. S.; Khoo, B. C.

    2010-03-01

    The present paper presents a hybrid meshfree-and-Cartesian grid method for simulating moving body incompressible viscous flow problems in 3D space. The method combines the merits of cost-efficient and accurate conventional finite difference approximations on Cartesian grids with the geometric freedom of generalized finite difference (GFD) approximations on meshfree grids. Error minimization in GFD is carried out by singular value decomposition (SVD). The Arbitrary Lagrangian-Eulerian (ALE) form of the Navier-Stokes equations on convecting nodes is integrated by a fractional-step projection method. The present hybrid grid method employs a relatively simple mode of nodal administration. Nevertheless, it has the geometrical flexibility of unstructured mesh-based finite-volume and finite element methods. Boundary conditions are precisely implemented on boundary nodes without interpolation. The present scheme is validated by a moving patch consistency test as well as against published results for 3D moving body problems. Finally, the method is applied on low-Reynolds number flapping wing applications, where large boundary motions are involved. The present study demonstrates the potential of the present hybrid meshfree-and-Cartesian grid scheme for solving complex moving body problems in 3D.
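
    The GFD-with-SVD kernel, least-squares derivative weights at a node computed from scattered neighbours, can be illustrated in a few lines (a generic sketch, not the authors' solver; the pseudoinverse is computed via SVD):

```python
# Generalized finite difference weights from a second-order Taylor basis,
# with the least-squares pseudoinverse computed by SVD.
import numpy as np

rng = np.random.default_rng(5)
center = np.zeros(2)
nbrs = rng.uniform(-0.1, 0.1, size=(12, 2))       # scattered support nodes

dx, dy = (nbrs - center).T
# columns pair with the derivative vector [u_x, u_y, u_xx, u_yy, u_xy]
A = np.column_stack([dx, dy, dx**2 / 2, dy**2 / 2, dx * dy])
W = np.linalg.pinv(A)                             # SVD-based pseudoinverse

u = lambda p: np.sin(p[:, 0]) * np.cos(p[:, 1])   # smooth test field
derivs = W @ (u(nbrs) - u(center[None, :]))
print(f"u_x ~ {derivs[0]:.4f} (exact 1), u_xx ~ {derivs[2]:.4f} (exact 0)")
```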

  2. Induction of mutation spectra by complex mixtures: approaches, problems, and possibilities.

    PubMed Central

    DeMarini, D M

    1994-01-01

    More complex environmental mixtures have been evaluated for mutagenic activity at the hisD3052 allele of Salmonella, primarily in strain TA98, than in any other target or mutation assay. Using colony probe hybridization to detect a common hot spot deletion, followed by polymerase chain reaction and DNA sequencing, we have generated 10 mutation spectra from three classes of mixtures (i.e., urban air, cigarette smoke condensate, and municipal waste incinerator emissions). The mutation spectra are distinctly different among the three classes of mixtures; however, the spectra for samples within the same class of mixture are similar. In addition to the hot spot mutation, the mixtures induce complex mutations, which consist of a small deletion and a base substitution. These mutations suggest a mechanism involving misinsertion of a base opposite a DNA adduct followed by a slippage and mismatch. A role for DNA secondary structure also may be the basis for the mutational site specificity exhibited by the various mixtures. The results suggest that unique mutation spectra can be generated by different classes of complex mixtures and that such spectra are a consequence of the dominance of a particular chemical class or classes within the mixture. The problems associated with this type of research are discussed along with the potential value of mutation spectra as a tool for exposure and risk assessment. PMID:7821286

  3. Exploring the complexity of inquiry learning in an open-ended problem space

    NASA Astrophysics Data System (ADS)

    Clarke, Jody

    Data-gathering and problem identification are key components of scientific inquiry. However, few researchers have studied how students learn these skills because historically this required a time-consuming, complicated method of capturing the details of learners' data-gathering processes. Nor are classroom settings authentic contexts in which students could exhibit problem identification skills parallel to those involved in deconstructing complex real world situations. In this study of middle school students, because of my access to an innovative technology, I simulated a disease outbreak in a virtual community as a complicated, authentic problem. As students worked through the curriculum in the virtual world, their time-stamped actions were stored by the computer in event-logs. Using these records, I tracked in detail how the student scientists made sense of the complexity they faced and how they identified and investigated the problem using science-inquiry skills. To describe the degree to which students' data collection narrowed and focused on a specific disease over time, I developed a rubric and automated the coding of records in the event-logs. I measured the ongoing development of the students' "systematicity" in investigating the disease outbreak. I demonstrated that coding event-logs is an effective yet non-intrusive way of collecting and parsing detailed information about students' behaviors in real time in an authentic setting. My principal research question was "Do students who are more thoughtful about their inquiry prior to entry into the curriculum demonstrate increased systematicity in their inquiry behavior during the experience, by narrowing the focus of their data-gathering more rapidly than students who enter with lower levels of thoughtfulness about inquiry?" My sample consisted of 403 middle-school students from public schools in the US who volunteered to participate in the River City Project in spring 2008. Contrary to my hypothesis, I found

  4. Communication: Overcoming the root search problem in complex quantum trajectory calculations

    SciTech Connect

    Zamstein, Noa; Tannor, David J.

    2014-01-28

    Three new developments are presented regarding the semiclassical coherent state propagator. First, we present a conceptually different derivation of Huber and Heller's method for identifying complex root trajectories and their equations of motion [D. Huber and E. J. Heller, J. Chem. Phys. 87, 5302 (1987)]. Our method proceeds directly from the time-dependent Schrödinger equation and therefore allows various generalizations of the formalism. Second, we obtain an analytic expression for the semiclassical coherent state propagator. We show that the prefactor can be expressed in a form that requires solving significantly fewer equations of motion than in alternative expressions. Third, the semiclassical coherent state propagator is used to formulate a final value representation of the time-dependent wavefunction that avoids the root search, eliminates problems with caustics and automatically includes interference. We present numerical results for the 1D Morse oscillator showing that the method may become an attractive alternative to existing semiclassical approaches.

  5. Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?

    PubMed

    McDonald, Ruth

    2014-10-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving 'leadership'. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts. PMID:25337595

  6. Validation Study of a Method for Assessing Complex Ill-Structured Problem Solving by Using Causal Representations

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ifenthaler, Dirk; Ge, Xun

    2013-01-01

    The important but little understood problem that motivated this study was the lack of research on valid assessment methods to determine progress in higher-order learning in situations involving complex and ill-structured problems. Without a valid assessment method, little progress can occur in instructional design research with regard to designing…

  7. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  8. Seeing around a Ball: Complex, Technology-Based Problems in Calculus with Applications in Science and Engineering-Redux

    ERIC Educational Resources Information Center

    Winkel, Brian

    2008-01-01

    A complex technology-based problem in visualization and computation for students in calculus is presented. Strategies are shown for its solution and the opportunities for students to put together sequences of concepts and skills to build for success are highlighted. The problem itself involves placing an object under water in order to actually see…

  9. An Investigation of the Interrelationships between Motivation, Engagement, and Complex Problem Solving in Game-Based Learning

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Law, Victor; Ifenthaler, Dirk; Ge, Xun; Miller, Raymond

    2014-01-01

    Digital game-based learning, especially massively multiplayer online games, has been touted for its potential to promote student motivation and complex problem-solving competency development. However, current evidence is limited to anecdotal studies. The purpose of this empirical investigation is to examine the complex interplay between…

  10. An immersed boundary computational model for acoustic scattering problems with complex geometries.

    PubMed

    Sun, Xiaofeng; Jiang, Yongsong; Liang, An; Jing, Xiaodong

    2012-11-01

    An immersed boundary computational model is presented in order to deal with acoustic scattering problems involving complex geometries, in which the wall boundary condition is treated as a direct body force determined by satisfying the non-penetrating boundary condition. Two distinct grids are used to discretize the fluid domain and the immersed boundary, respectively. The immersed boundaries are represented by Lagrangian points, and the direct body force determined on these points is applied to the neighboring Eulerian points. The coupling between the Lagrangian and Eulerian points is effected by a discrete delta function. The linearized Euler equations are spatially discretized with a fourth-order dispersion-relation-preserving scheme and temporally integrated with a low-dissipation and low-dispersion Runge-Kutta scheme. A perfectly matched layer technique is applied to absorb outgoing waves and waves entering the immersed bodies. Several benchmark problems for computational aeroacoustics solvers are performed to validate the present method. PMID:23145603
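
    To make the force-spreading step concrete, here is a minimal one-dimensional sketch using Peskin's cosine discrete delta function; the grid spacing, marker positions, and forces are illustrative assumptions, not values from the paper:

        import numpy as np

        h = 0.1
        x = np.arange(0.0, 10.0, h)               # Eulerian grid points
        X = np.array([3.04, 3.17, 5.50])          # Lagrangian boundary markers
        F = np.array([1.0, -0.5, 2.0])            # forces carried by the markers

        def delta(r, h):
            """Peskin's cosine delta: support |r| <= 2h, integrates to 1."""
            r = np.abs(r) / h
            return np.where(r < 2.0, (1.0 + np.cos(np.pi * r / 2.0)) / (4.0 * h), 0.0)

        f = np.zeros_like(x)                      # body force on the Eulerian grid
        for Xk, Fk in zip(X, F):
            f += Fk * delta(x - Xk, h)            # spread each marker force locally
        print(f.sum() * h, F.sum())               # discrete integrals agree (approx.)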

  11. Enhancements of evolutionary algorithm for the complex requirements of a nurse scheduling problem

    NASA Astrophysics Data System (ADS)

    Tein, Lim Huai; Ramli, Razamin

    2014-12-01

    Nurse scheduling is a persistent problem, aggravated by the global nurse turnover crisis: the more dissatisfied nurses are with their working environment, the more likely they are to leave. Current undesirable work schedules contribute to that working condition. Fundamentally, the head nurse's obligations and the nurses' needs are poorly reconciled. In particular, because nurse preferences weigh heavily, the challenge in nurse scheduling is to encourage tolerance between both parties when shifts are assigned in real working scenarios. Flexibility in shift assignment is hard to achieve while satisfying nurses' diverse requests and upholding the required ward coverage. Hence, an Evolutionary Algorithm (EA) is proposed to cater for this complexity in the nurse scheduling problem (NSP). The restrictions of the standard EA are discussed, and enhancements to the EA operators are suggested so that the EA acquires the character of a flexible search. Three types of constraints are considered (hard, semi-hard and soft), which are handled by the EA with an enhanced parent selection scheme and specialized mutation operators, as sketched below. These operators, and the EA as a whole, contribute to efficient constraint handling and fitness computation, as well as flexibility in the search, corresponding to the principles of exploration and exploitation.
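
    A minimal sketch of penalty-weighted fitness over hard, semi-hard and soft constraints; shift codes, coverage targets and penalty weights are invented for illustration, and the paper's enhanced parent selection and specialized mutation operators are not reproduced:

        import random

        NURSES, DAYS = 6, 7
        COVER = {1: 2, 2: 2, 3: 1}   # nurses required per shift (1=AM, 2=PM, 3=night)

        def random_schedule():
            return [[random.randint(0, 3) for _ in range(DAYS)] for _ in range(NURSES)]

        def fitness(s, prefs):
            hard = semi = soft = 0
            for d in range(DAYS):
                col = [s[n][d] for n in range(NURSES)]
                for shift, need in COVER.items():
                    hard += abs(col.count(shift) - need)           # ward coverage
            for n in range(NURSES):
                semi += sum(1 for d in range(DAYS - 1)
                            if s[n][d] == 3 and s[n][d + 1] == 1)  # night then morning
                soft += sum(1 for d in range(DAYS)
                            if prefs[n][d] >= 0 and s[n][d] != prefs[n][d])
            return -(100 * hard + 10 * semi + soft)                # weighted penalties

        prefs = [[random.choice([-1, 0, 1, 2, 3]) for _ in range(DAYS)]
                 for _ in range(NURSES)]                           # -1 = no preference
        population = sorted((random_schedule() for _ in range(40)),
                            key=lambda s: fitness(s, prefs), reverse=True)
        print(fitness(population[0], prefs))                       # best initial schedule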

  12. Fibromyalgia and disability adjudication: No simple solutions to a complex problem

    PubMed Central

    Harth, Manfred; Nielson, Warren R

    2014-01-01

    BACKGROUND: Adjudication of disability claims related to fibromyalgia (FM) syndrome can be a challenging and complex process. A commentary published in the current issue of Pain Research & Management makes suggestions for improvement. The authors of the commentary contend that: previously and currently used criteria for the diagnosis of FM are irrelevant to clinical practice; the opinions of family physicians should supersede those of experts; there is little evidence that trauma can cause FM; no formal instruments are necessary to assess disability; and many FM patients on or applying for disability are exaggerating or malingering, and tests of symptom validity should be used to identify malingerers. OBJECTIVES: To assess the assertions made by Fitzcharles et al. METHODS: A narrative review of the available research literature was performed. RESULTS: Available diagnostic criteria should be used in a medicolegal context; family physicians are frequently uncertain about FM and/or biased; there is considerable evidence that trauma can be a cause of FM; it is essential to use validated instruments to assess functional impairment; and the available tests of physical effort and symptom validity are of uncertain value in identifying malingering in FM. CONCLUSIONS: The available evidence does not support many of the suggestions presented in the commentary. Caution is advised in adopting simple solutions for disability adjudication in FM because they are generally incompatible with the inherently complex nature of the problem. PMID:25479149

  13. Decision Analysis for Environmental Problems

    EPA Science Inventory

    Environmental management problems are often complex and uncertain. A formal process with proper guidance is needed to understand the issues, identify sources of disagreement, and analyze the major uncertainties in environmental problems. This course will present a process that fo...

  14. Exploring Corn-Ethanol As A Complex Problem To Teach Sustainability Concepts Across The Science-Business-Liberal Arts Curriculum

    NASA Astrophysics Data System (ADS)

    Oches, E. A.; Szymanski, D. W.; Snyder, B.; Gulati, G. J.; Davis, P. T.

    2012-12-01

    The highly interdisciplinary nature of sustainability presents pedagogic challenges when sustainability concepts are incorporated into traditional disciplinary courses. At Bentley University, where over 90 percent of students major in business disciplines, we have created a multidisciplinary course module centered on corn ethanol that explores a complex social, environmental, and economic problem and develops basic data analysis and analytical thinking skills in several courses spanning the natural, physical, and social sciences within the business curriculum. Through an NSF-CCLI grant, Bentley faculty from several disciplines participated in a summer workshop to define learning objectives, create course modules, and develop an assessment plan to enhance interdisciplinary sustainability teaching. The core instructional outcome was a data-rich exercise for all participating courses in which students plot and analyze multiple parameters of corn planted and harvested for various purposes including food (human), feed (animal), ethanol production, and commodities exchanged for the years 1960 to present. Students then evaluate patterns and trends in the data and hypothesize relationships among the plotted data and environmental, social, and economic drivers, responses, and unintended consequences. After the central data analysis activity, students explore corn ethanol production as it relates to core disciplinary concepts in their individual classes. For example, students in Environmental Chemistry produce ethanol using corn and sugar as feedstocks and compare the efficiency of each process, while learning about enzymes, fermentation, distillation, and other chemical principles. Principles of Geology students examine the effects of agricultural runoff on surface water quality associated with extracting greater agricultural yield from mid-continent croplands. The American Government course examines the role of political institutions, the political process, and various
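
    A classroom version of the central data exercise might look like the following sketch; the numbers are hypothetical stand-ins for the USDA-style series the module actually uses:

        import pandas as pd
        import matplotlib.pyplot as plt

        # Hypothetical mini-dataset (million bushels); real classes would load
        # the full 1960-present corn utilization series.
        df = pd.DataFrame({
            "year":    [1960, 1980, 2000, 2010],
            "food":    [300,  500,  900,  1200],
            "feed":    [3200, 4100, 5800, 5100],
            "ethanol": [0,    35,   600,  4600],
        })

        # Students plot each use over time, then probe relationships among them.
        ax = df.plot(x="year", y=["food", "feed", "ethanol"])
        ax.set_ylabel("million bushels (hypothetical)")

        # One pattern to discuss: ethanol's growing share of total utilization.
        df["ethanol_share"] = df["ethanol"] / df[["food", "feed", "ethanol"]].sum(axis=1)
        print(df[["year", "ethanol_share"]])
        plt.show()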

  15. The Problem with Word Problems: Solving Word Problems in Math Requires a Complex Web of Skills. But There's No Reason Why it Can't Be Fun

    ERIC Educational Resources Information Center

    Forsten, Char

    2004-01-01

    Children need to combine reading, thinking, and computational skills to solve math word problems. The author provides some strategies that principals can share with their teachers to help students become proficient and advanced problem-solvers. They include creating a conducive classroom environment, providing daily mental math activities, making…

  16. Subspace Iteration Method for Complex Eigenvalue Problems with Nonsymmetric Matrices in Aeroelastic System

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Lung, Shu

    2009-01-01

    Modern airplane design is a multidisciplinary task which combines several disciplines, such as structures, aerodynamics, flight controls, and sometimes heat transfer. Historically, analytical and experimental investigations concerning the interaction of the elastic airframe with aerodynamic and inertia loads have been conducted during the design phase to determine the existence of aeroelastic instabilities, so-called flutter. With the advent and increased usage of flight control systems, there is also a likelihood of instabilities caused by the interaction of the flight control system and the aeroelastic response of the airplane, known as aeroservoelastic instabilities. An in-house code, MPASES (Ref. 1), modified from PASES (Ref. 2), is a general-purpose digital computer program for the analysis of the closed-loop stability problem. This program used subroutines given in the International Mathematical and Statistical Library (IMSL) (Ref. 3) to compute all of the real and/or complex conjugate pairs of eigenvalues of the Hessenberg matrix. For high-fidelity configurations, these aeroelastic system matrices are large, and computing all eigenvalues is time consuming. A subspace iteration method (Ref. 4) for complex eigenvalue problems with nonsymmetric matrices has been formulated and incorporated into the modified program for aeroservoelastic stability (the MPASES code). The subspace iteration method solves only for the lowest p eigenvalues and corresponding eigenvectors for aeroelastic and aeroservoelastic analysis. In general, p ranges from 10 for a wing flutter analysis to 50 for an entire aircraft flutter analysis. The application of this newly incorporated code is an experiment known as the Aerostructures Test Wing (ATW), which was designed by the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center, Edwards, California, to research aeroelastic instabilities. Specifically, this experiment was used to study an instability
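
    As a rough illustration of the idea (not the MPASES implementation), the sketch below applies orthogonal subspace iteration to the inverse of a nonsymmetric matrix to approximate its p smallest-magnitude eigenvalues, with a Rayleigh-Ritz projection recovering complex conjugate pairs; the matrix and function name are invented:

        import numpy as np
        import scipy.linalg

        def smallest_eigs(A, p, iters=500):
            """Approximate the p smallest-magnitude eigenvalues of a real
            nonsymmetric matrix A via subspace iteration on A^-1."""
            n = A.shape[0]
            Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((n, p)))
            lu = scipy.linalg.lu_factor(A)        # factor once, reuse every sweep
            for _ in range(iters):
                Z = scipy.linalg.lu_solve(lu, Q)  # Z = A^-1 Q
                Q, _ = np.linalg.qr(Z)            # re-orthonormalize the basis
            H = Q.T @ A @ Q                       # Rayleigh-Ritz projection of A
            w = np.linalg.eigvals(H)              # complex conjugate pairs emerge here
            return w[np.argsort(np.abs(w))]

        A = np.diag(np.arange(1.0, 9.0)) \
            + 0.1 * np.random.default_rng(1).standard_normal((8, 8))
        print(smallest_eigs(A, p=3))              # near the 3 smallest eigenvalues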

  17. Computational issues in complex water-energy optimization problems: Time scales, parameterizations, objectives and algorithms

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step, and solve each local sub-problem through very fast linear network programming algorithms, and (c) the substantial
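
    As a toy illustration of objective (b), the allocation problem of a single time step can be posed as a small linear program and handed to a fast solver; every name and coefficient below is a hypothetical placeholder, and real HRES models would exploit network-structured LP algorithms:

        from scipy.optimize import linprog

        # One monthly step: x = [turbine_release, spill, pump_flow] in hm^3.
        energy_rate = 0.35          # GWh produced per hm^3 through the turbines
        pump_rate = 0.50            # GWh consumed per hm^3 pumped
        inflow, storage = 120.0, 800.0
        water_demand, energy_demand = 60.0, 25.0

        # Minimize spill plus pumping energy, net of energy produced.
        c = [-energy_rate, 1.0, pump_rate]
        A_ub = [[1, 1, -1],                          # water balance
                [-energy_rate, 0, pump_rate]]        # energy produced >= demand
        b_ub = [inflow + storage, -energy_demand]
        bounds = [(water_demand, None), (0, None), (0, None)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        print(res.x, -res.fun)                       # allocation and net benefit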

  18. Can fuzzy logic bring complex problems into focus? Modeling imprecise factors in environmental policy

    SciTech Connect

    McKone, Thomas E.; Deshpande, Ashok W.

    2004-06-14

    In modeling complex environmental problems, we often fail to make precise statements about inputs and outcomes. In such cases the fuzzy logic method native to the human mind provides a useful way to get at these problems. Fuzzy logic represents a significant change in both the approach to and outcome of environmental evaluations. Risk assessment is currently based on the implicit premise that probability theory provides the necessary and sufficient tools for dealing with uncertainty and variability. The key advantage of fuzzy methods is the way they reflect the human mind in its remarkable ability to store and process information which is consistently imprecise, uncertain, and resistant to classification. Our case study illustrates the ability of fuzzy logic to integrate statistical measurements with imprecise health goals. But we submit that fuzzy logic and probability theory are complementary and not competitive. In the world of soft computing, fuzzy logic has been widely used and has often been the "smart" behind smart machines. But it will require more effort and case studies to establish its niche in risk assessment or other types of impact assessment. Although we often hear complaints about "bright lines", could we adapt to a system that relaxes these lines to fuzzy gradations? Would decision makers and the public accept expressions of water or air quality goals in linguistic terms with computed degrees of certainty? Resistance is likely. In many regions, such as the US and European Union, it is likely that both decision makers and members of the public are more comfortable with our current system, in which government agencies avoid confronting uncertainties by setting guidelines that are crisp and often fail to communicate uncertainty. But some day perhaps a more comprehensive approach that includes exposure surveys, toxicological data, epidemiological studies coupled with fuzzy modeling will go a long way in resolving some of the conflict, divisiveness
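
    For a flavor of how linguistic grades with computed degrees of certainty might look, here is a minimal sketch with triangular membership functions; the categories and breakpoints are invented for illustration, not regulatory values:

        def trimf(x, a, b, c):
            """Triangular membership function with feet a, c and peak b."""
            left = (x - a) / (b - a) if b > a else 1.0
            right = (c - x) / (c - b) if c > b else 1.0
            return max(min(left, right), 0.0)

        # Hypothetical linguistic grades for a water-quality indicator (ug/L).
        grades = {"low": (0, 0, 40), "moderate": (20, 50, 80), "high": (60, 100, 100)}

        measurement = 55.0   # a crisp monitoring result
        for name, (a, b, c) in grades.items():
            print(f"membership in '{name}': {trimf(measurement, a, b, c):.2f}")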

  19. Accurate gradient approximation for complex interface problems in 3D by an improved coupling interface method

    NASA Astrophysics Data System (ADS)

    Shu, Yu-Chen; Chern, I.-Liang; Chang, Chien C.

    2014-10-01

    Most elliptic interface solvers become complicated for complex interface problems at those “exceptional points” where there are not enough neighboring interior points for high order interpolation. Such complication increases especially in three dimensions. Usually, the solvers are thus reduced to low order accuracy. In this paper, we classify these exceptional points and propose two recipes to maintain order of accuracy there, aiming at improving the previous coupling interface method [26]. Yet the idea is also applicable to other interface solvers. The main idea is to have at least first order approximations for second order derivatives at those exceptional points. Recipe 1 is to use the finite difference approximation for the second order derivatives at a nearby interior grid point, whenever this is possible. Recipe 2 is to flip domain signatures and introduce a ghost state so that a second-order method can be applied. This ghost state is a smooth extension of the solution at the exceptional point from the other side of the interface. The original state is recovered by a post-processing using nearby states and jump conditions. The choice of recipes is determined by a classification scheme of the exceptional points. The method renders the solution and its gradient uniformly second-order accurate in the entire computed domain. Numerical examples are provided to illustrate the second order accuracy of the presently proposed method in approximating the gradients of the original states for some complex interfaces which we had tested previously in two and three dimensions, and a real molecule (1D63), which is double-helix shaped and composed of hundreds of atoms.
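
    In one dimension, Recipe 1 reduces to something very simple; the sketch below is an illustration of that idea only, not the paper's 3D method. Reusing the centered second difference at the nearest interior neighbor is first-order accurate at the exceptional point because the second derivative varies smoothly on that side of the interface:

        import numpy as np

        h = 0.01
        x = np.arange(0.0, 1.0 + h, h)
        u = np.sin(x)          # smooth one-sided solution
        i = 40                 # exceptional point: x[i+1] lies across the interface

        # Centered stencil at the interior neighbor x[i-1] stands in for u''(x[i]):
        # u''(x_i) = u''(x_{i-1}) + O(h), i.e. first order, which suffices to keep
        # the overall solution second-order accurate.
        d2_neighbor = (u[i - 2] - 2 * u[i - 1] + u[i]) / h**2
        print(d2_neighbor, -np.sin(x[i]))   # compare with exact u''(x_i)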

  20. Speed and Complexity Characterize Attention Problems in Children with Localization-Related Epilepsy

    PubMed Central

    Berl, Madison; Terwilliger, Virginia; Scheller, Alexandra; Sepeta, Leigh; Walkowiak, Jenifer; Gaillard, William D.

    2015-01-01

    Objective: Children with epilepsy (EPI) have a higher rate of ADHD (28–70%) than typically developing (TD) children (5–10%); however, attention is multidimensional. Thus, we aimed to characterize the profile of attention difficulties in children with epilepsy. Methods: Seventy-five children with localization-related epilepsy ages 6–16 and 75 age-matched controls were evaluated using multimodal, multidimensional measures of attention including direct performance and parent ratings of attention as well as intelligence testing. We assessed group differences across attention measures, determined if parent rating predicted performance on attention measures, and examined if epilepsy characteristics were associated with attention skills. Results: The EPI group performed worse than the TD group on timed and complex aspects of attention (p<.05), while performance on simple visual and simple auditory attention tasks was comparable. Children with EPI were 12 times as likely as TD children to have clinically elevated symptoms of inattention as rated by parents, but ratings were a weak predictor of attention performance. Earlier age of onset was associated with slower motor speed (p<.01), but no other epilepsy-related clinical characteristics were associated with attention skills. Significance: This study clarifies the nature of the attention problems in pediatric epilepsy, which may be under-recognized. Children with EPI had difficulty with complex attention and rapid response, not simple attention. As such, they may not exhibit difficulty until later in primary school when demands increase. Parent report with standard ADHD screening tools may under-detect these higher-order attention difficulties. Thus, monitoring through direct neuropsychological performance is recommended. PMID:25940056

  1. Accurate gradient approximation for complex interface problems in 3D by an improved coupling interface method

    SciTech Connect

    Shu, Yu-Chen; Chern, I-Liang; Chang, Chien C.

    2014-10-15

    Most elliptic interface solvers become complicated for complex interface problems at those “exceptional points” where there are not enough neighboring interior points for high order interpolation. Such complication increases especially in three dimensions. Usually, the solvers are thus reduced to low order accuracy. In this paper, we classify these exceptional points and propose two recipes to maintain order of accuracy there, aiming at improving the previous coupling interface method [26]. Yet the idea is also applicable to other interface solvers. The main idea is to have at least first order approximations for second order derivatives at those exceptional points. Recipe 1 is to use the finite difference approximation for the second order derivatives at a nearby interior grid point, whenever this is possible. Recipe 2 is to flip domain signatures and introduce a ghost state so that a second-order method can be applied. This ghost state is a smooth extension of the solution at the exceptional point from the other side of the interface. The original state is recovered by a post-processing using nearby states and jump conditions. The choice of recipes is determined by a classification scheme of the exceptional points. The method renders the solution and its gradient uniformly second-order accurate in the entire computed domain. Numerical examples are provided to illustrate the second order accuracy of the presently proposed method in approximating the gradients of the original states for some complex interfaces which we had tested previously in two and three dimensions, and a real molecule (1D63), which is double-helix shaped and composed of hundreds of atoms.

  2. Managing the Complexity of Design Problems through Studio-Based Learning

    ERIC Educational Resources Information Center

    Cennamo, Katherine; Brandt, Carol; Scott, Brigitte; Douglas, Sarah; McGrath, Margarita; Reimer, Yolanda; Vernon, Mitzi

    2011-01-01

    The ill-structured nature of design problems makes them particularly challenging for problem-based learning. Studio-based learning (SBL), however, has much in common with problem-based learning and indeed has a long history of use in teaching students to solve design problems. The purpose of this ethnographic study of an industrial design class,…

  3. Leadership and leadership development in healthcare settings – a simplistic solution to complex problems?

    PubMed Central

    McDonald, Ruth

    2014-01-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving ‘leadership’. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts. PMID:25337595

  4. Case study method and problem-based learning: utilizing the pedagogical model of progressive complexity in nursing education.

    PubMed

    McMahon, Michelle A; Christopher, Kimberly A

    2011-01-01

    As the complexity of health care delivery continues to increase, educators are challenged to determine educational best practices to prepare BSN students for the ambiguous clinical practice setting. Integrative, active, and student-centered curricular methods are encouraged to foster student ability to use clinical judgment for problem solving and informed clinical decision making. The proposed pedagogical model of progressive complexity in nursing education suggests gradually introducing students to complex and multi-contextual clinical scenarios through the utilization of case studies and problem-based learning activities, with the intention to transition nursing students into autonomous learners and well-prepared practitioners at the culmination of a nursing program. Exemplar curricular activities are suggested to potentiate student development of a transferable problem solving skill set and a flexible knowledge base to better prepare students for practice in future novel clinical experiences, which is a mutual goal for both educators and students. PMID:22718667

  5. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, facing known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report (perhaps beyond our lifetime) is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  6. A framework to approach problems of forensic anthropology using complex networks

    NASA Astrophysics Data System (ADS)

    Caridi, Inés; Dorso, Claudio O.; Gallo, Pablo; Somigliana, Carlos

    2011-05-01

    We have developed a method to analyze and interpret emerging structures in a set of data which lacks some information. It has been conceived to be applied to the problem of getting information about people who disappeared in the Argentine state of Tucumán from 1974 to 1981. Although the military dictatorship in Argentina formally began in 1976 and lasted until 1983, the disappearance and assassination of people began some months earlier. During this period several circuits of Illegal Detention Centres (IDC) were set up in different locations all over the country. In these secret centres, disappeared people were illegally held without any sort of constitutional guarantees, and later assassinated. Even today, the final destination of most of the disappeared people's remains is still unknown. The fundamental hypothesis in this work is that a group of people with the same political affiliation whose disappearances were closely related in time and space shared the same place of captivity (the same IDC or circuit of IDCs). This hypothesis makes sense when applied to the systematic method of repression and disappearances which was actually launched in Tucumán, Argentina (2007) [11]. In this work, the missing individuals are identified as nodes on a network and connections are established among them based on the individuals' attributes while they were alive, using rules to link them. In order to determine which rules are the most effective in defining the network, we use other kinds of knowledge available in this problem: previous results from the anthropological point of view (based on other sources of information, both oral and written, historical and anthropological data, etc.), and information about the place (one or more IDCs) where some people were kept during their captivity. For these best rules, a prediction about these people's possible destination is assigned (one or more IDCs where they could have been kept), and the success of the
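
    A minimal sketch of the rule-based linking step, with invented attributes and thresholds (the study's actual rules draw on far richer anthropological data):

        import itertools
        import networkx as nx

        # Hypothetical records: attributes known about each person while alive.
        people = {
            "p1": {"affiliation": "union_A", "disappeared": 10, "zone": "north"},
            "p2": {"affiliation": "union_A", "disappeared": 12, "zone": "north"},
            "p3": {"affiliation": "student", "disappeared": 40, "zone": "south"},
        }

        def rule(a, b, max_gap_days=15):
            """Link two individuals sharing affiliation and zone whose
            disappearances are close in time (illustrative thresholds)."""
            return (a["affiliation"] == b["affiliation"]
                    and a["zone"] == b["zone"]
                    and abs(a["disappeared"] - b["disappeared"]) <= max_gap_days)

        G = nx.Graph()
        G.add_nodes_from(people)
        for u, v in itertools.combinations(people, 2):
            if rule(people[u], people[v]):
                G.add_edge(u, v)

        # Connected components are candidate groups held in the same IDC circuit.
        print(list(nx.connected_components(G)))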

  7. Untangling the Complex Needs of People Experiencing Gambling Problems and Homelessness

    ERIC Educational Resources Information Center

    Holdsworth, Louise; Tiyce, Margaret

    2013-01-01

    People with gambling problems are now recognised among those at increased risk of homelessness, and the link between housing and gambling problems has been identified as an area requiring further research. This paper discusses the findings of a qualitative study that explored the relationship between gambling problems and homelessness. Interviews…

  8. Generalist solutions to complex problems: generating practice-based evidence - the example of managing multi-morbidity

    PubMed Central

    2013-01-01

    Background A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Discussion Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a ‘complex intervention’ (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Summary Answers to the complex problem of multi-morbidity won’t come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity. PMID:23919296

  9. New approach to the complex-action problem and its application to a nonperturbative study of superstring theory

    NASA Astrophysics Data System (ADS)

    Anagnostopoulos, K. N.; Nishimura, J.

    2002-11-01

    Monte Carlo simulations of a system whose action has an imaginary part are considered to be extremely difficult. We propose a new approach to this ``complex-action problem,'' which utilizes a factorization property of distribution functions. The basic idea is quite general, and it removes the so-called overlap problem completely. Here we apply the method to a nonperturbative study of superstring theory using its matrix formulation. In this particular example, the distribution function turns out to be positive definite, which allows us to reduce the problem even further. Our numerical results suggest an intuitive explanation for the dynamical generation of 4D space-time.
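
    For context, the conventional reweighting strategy whose breakdown is the "overlap problem" the abstract refers to can be written as follows (notation assumed here, not drawn from the paper):

        % Naive reweighting for a complex action S = S_0 + i\Gamma: observables
        % are computed in the phase-quenched ensemble (weight e^{-S_0}); the
        % overlap problem appears when the average phase factor in the
        % denominator becomes exponentially small.
        \langle \mathcal{O} \rangle
          = \frac{\langle \mathcal{O}\, e^{-i\Gamma} \rangle_0}
                 {\langle e^{-i\Gamma} \rangle_0},
        \qquad
        \langle \,\cdot\, \rangle_0
          = \frac{\int d\phi \,(\cdot)\, e^{-S_0[\phi]}}{\int d\phi \, e^{-S_0[\phi]}}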

  10. Complex Networks Approach for Analyzing the Correlation of Traditional Chinese Medicine Syndrome Evolvement and Cardiovascular Events in Patients with Stable Coronary Heart Disease

    PubMed Central

    Gao, Zhuye; Li, Siwei; Jiao, Yang; Zhou, Xuezhong; Fu, Changgeng; Shi, Dazhuo; Chen, Keji

    2015-01-01

    This is a multicenter prospective cohort study to analyze the correlation of traditional Chinese medicine (TCM) syndrome evolvement and cardiovascular events in patients with stable coronary heart disease (CHD). The impact of syndrome evolvement on cardiovascular events during the 6-month and 12-month follow-up was analyzed using complex networks approach. Results of verification using Chi-square test showed that the occurrence of cardiovascular events was positively correlated with syndrome evolvement when it evolved from toxic syndrome to Qi deficiency, blood stasis, or sustained toxic syndrome, when it evolved from Qi deficiency to blood stasis, toxic syndrome, or sustained Qi deficiency, and when it evolved from blood stasis to Qi deficiency. Blood stasis, Qi deficiency, and toxic syndrome are important syndrome factors for stable CHD. There are positive correlations between cardiovascular events and syndrome evolution from toxic syndrome to Qi deficiency or blood stasis, from Qi deficiency to blood stasis, or toxic syndrome and from blood stasis to Qi deficiency. These results indicate that stable CHD patients with pathogenesis of toxin consuming Qi, toxin leading to blood stasis, and mutual transformation of Qi deficiency and blood stasis are prone to recurrent cardiovascular events. PMID:25821500

  11. DIFFERENTIAL ANALYZER

    DOEpatents

    Sorensen, E.G.; Gordon, C.M.

    1959-02-10

    Improvements in analog computing machines of the class capable of evaluating differential equations, commonly termed differential analyzers, are described. In general form, the analyzer embodies a plurality of basic computer mechanisms for performing integration, multiplication, and addition, and means for directing the result of any one operation to another computer mechanism performing a further operation. In the device, numerical quantities are represented by the rotation of shafts, or the electrical equivalent of shafts.

  12. Validity of the MicroDYN Approach: Complex Problem Solving Predicts School Grades beyond Working Memory Capacity

    ERIC Educational Resources Information Center

    Schweizer, Fabian; Wustenberg, Sascha; Greiff, Samuel

    2013-01-01

    This study examines the validity of the complex problem solving (CPS) test MicroDYN by investigating a) the relation between its dimensions--rule identification (exploration strategy), rule knowledge (acquired knowledge), rule application (control performance)--and working memory capacity (WMC), and b) whether CPS predicts school grades in…

  13. Linking Complex Problem Solving and General Mental Ability to Career Advancement: Does a Transversal Skill Reveal Incremental Predictive Validity?

    ERIC Educational Resources Information Center

    Mainert, Jakob; Kretzschmar, André; Neubert, Jonas C.; Greiff, Samuel

    2015-01-01

    Transversal skills, such as complex problem solving (CPS) are viewed as central twenty-first-century skills. Recent empirical findings have already supported the importance of CPS for early academic advancement. We wanted to determine whether CPS could also contribute to the understanding of career advancement later in life. Towards this end, we…

  14. Learning by Preparing to Teach: Fostering Self-Regulatory Processes and Achievement during Complex Mathematics Problem Solving

    ERIC Educational Resources Information Center

    Muis, Krista R.; Psaradellis, Cynthia; Chevrier, Marianne; Di Leo, Ivana; Lajoie, Susanne P.

    2016-01-01

    We developed an intervention based on the learning by teaching paradigm to foster self-regulatory processes and better learning outcomes during complex mathematics problem solving in a technology-rich learning environment. Seventy-eight elementary students were randomly assigned to 1 of 2 conditions: learning by preparing to teach, or learning for…

  15. Does Visualization Enhance Complex Problem Solving? The Effect of Causal Mapping on Performance in the Computer-Based Microworld Tailorshop

    ERIC Educational Resources Information Center

    Öllinger, Michael; Hammon, Stephanie; von Grundherr, Michael; Funke, Joachim

    2015-01-01

    Causal mapping is often recognized as a technique to support strategic decisions and actions in complex problem situations. Such drawing of causal structures is supposed to particularly foster the understanding of the interaction of the various system elements and to further encourage holistic thinking. It builds on the idea that humans make use…

  16. Environmental Sensing of Expert Knowledge in a Computational Evolution System for Complex Problem Solving in Human Genetics

    NASA Astrophysics Data System (ADS)

    Greene, Casey S.; Hill, Douglas P.; Moore, Jason H.

    The relationship between interindividual variation in our genomes and variation in our susceptibility to common diseases is expected to be complex with multiple interacting genetic factors. A central goal of human genetics is to identify which DNA sequence variations predict disease risk in human populations. Our success in this endeavour will depend critically on the development and implementation of computational intelligence methods that are able to embrace, rather than ignore, the complexity of the genotype to phenotype relationship. To this end, we have developed a computational evolution system (CES) to discover genetic models of disease susceptibility involving complex relationships between DNA sequence variations. The CES approach is hierarchically organized and is capable of evolving operators of any arbitrary complexity. The ability to evolve operators distinguishes this approach from artificial evolution approaches using fixed operators such as mutation and recombination. Our previous studies have shown that a CES that can utilize expert knowledge about the problem in evolved operators significantly outperforms a CES unable to use this knowledge. This environmental sensing of external sources of biological or statistical knowledge is important when the search space is both rugged and large as in the genetic analysis of complex diseases. We show here that the CES is also capable of evolving operators which exploit one of several sources of expert knowledge to solve the problem. This is important for both the discovery of highly fit genetic models and because the particular source of expert knowledge used by evolved operators may provide additional information about the problem itself. This study brings us a step closer to a CES that can solve complex problems in human genetics in addition to discovering genetic models of disease.

  17. Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The M200 originated in the 1970's under an Ames Research Center/Stanford University contract to develop a small, lightweight gas analyzer for the Viking Landers. Although the unit was not used on the spacecraft, it was further developed by the National Institute for Occupational Safety and Health (NIOSH). Three researchers from the project later formed Microsensor Technology, Inc. (MTI) to commercialize the analyzer. The original version (Micromonitor 500) was introduced in 1982, and the M200 in 1988. The M200, a more advanced version, features dual gas chromatographs, which separate a gaseous mixture into components and measure the concentration of each gas. It is useful for monitoring gas leaks, chemical spills, etc. Many analyses are completed in less than 30 seconds, and a wide range of mixtures can be analyzed.

  18. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data are then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.
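
    The calibration-file comparison can be pictured as a multiwavelength least-squares fit, assuming Beer-Lambert additivity of absorbances; the spectra and concentrations below are synthetic placeholders, not ChemScan internals:

        import numpy as np

        # Columns of E: unit-concentration absorbance spectra of two substances
        # sampled at five wavelengths (synthetic numbers).
        E = np.array([[0.9, 0.1],
                      [0.7, 0.3],
                      [0.4, 0.6],
                      [0.2, 0.8],
                      [0.1, 0.9]])
        true_c = np.array([2.0, 0.5])
        a = E @ true_c + 0.01 * np.random.default_rng(1).standard_normal(5)

        # Least-squares estimate of the concentrations from a measured spectrum.
        c_hat, *_ = np.linalg.lstsq(E, a, rcond=None)
        print(c_hat)   # approximately [2.0, 0.5]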

  19. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information. Part 1—Methodology

    NASA Astrophysics Data System (ADS)

    Mejer Hansen, Thomas; Skou Cordua, Knud; Caroline Looms, Majken; Mosegaard, Klaus

    2013-03-01

    From a probabilistic point-of-view, the solution to an inverse problem can be seen as a combination of independent states of information quantified by probability density functions. Typically, these states of information are provided by a set of observed data and some a priori information on the solution. The combined state of information (i.e. the solution to the inverse problem) is a probability density function typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods for solving such probabilistically formulated inverse problems by sampling the a posteriori probability density function. In order to describe the a priori probability density function, we consider both simple Gaussian models and more complex (and realistic) a priori models based on higher order statistics. These a priori models can be used with both linear and non-linear inverse problems. For linear inverse Gaussian problems we make use of least-squares and kriging-based methods to describe the a posteriori probability density function directly. For general non-linear (i.e. non-Gaussian) inverse problems, we make use of the extended Metropolis algorithm to sample the a posteriori probability density function. Together with the extended Metropolis algorithm, we use sequential Gibbs sampling, which allows computationally efficient sampling of complex a priori models. The toolbox can be applied to any inverse problem as long as a way of solving the forward problem is provided. Here we demonstrate the methods and algorithms available in SIPPI. An application of SIPPI, to a tomographic cross-borehole inverse problem, is presented in the second part of this paper.
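
    A bare-bones Python sketch of the underlying sampling strategy, using a plain Metropolis walk on a toy linear-Gaussian problem (SIPPI itself is a Matlab/Octave toolbox; its extended Metropolis and sequential Gibbs machinery are not reproduced here):

        import numpy as np

        rng = np.random.default_rng(0)
        G = np.array([[1.0, 0.5], [0.2, 1.0]])      # forward operator, d = G m
        d_obs = np.array([1.0, 0.3])
        sigma = 0.1                                 # data noise standard deviation

        def log_post(m):
            log_prior = -0.5 * m @ m                # standard normal prior
            resid = (G @ m - d_obs) / sigma
            return log_prior - 0.5 * resid @ resid  # Gaussian likelihood

        m = np.zeros(2)
        samples = []
        for _ in range(20000):
            prop = m + 0.2 * rng.standard_normal(2)  # random-walk proposal
            if np.log(rng.random()) < log_post(prop) - log_post(m):
                m = prop                             # Metropolis accept
            samples.append(m)
        print(np.mean(samples, axis=0))              # posterior mean estimate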

  20. Blood Analyzer

    NASA Technical Reports Server (NTRS)

    1992-01-01

    In the 1970's, NASA provided funding for development of an automatic blood analyzer for Skylab at the Oak Ridge National Laboratory (ORNL). ORNL devised "dynamic loading," which employed a spinning rotor to load, transfer, and analyze blood samples by centrifugal processing. A refined, commercial version of the system was produced by ABAXIS and is marketed as portable ABAXIS MiniLab MCA. Used in a doctor's office, the equipment can perform 80 to 100 chemical blood tests on a single drop of blood and report results in five minutes. Further development is anticipated.

  1. Complex Problem Solving in Radiologic Technology: Understanding the Roles of Experience, Reflective Judgment, and Workplace Culture

    ERIC Educational Resources Information Center

    Yates, Jennifer L.

    2011-01-01

    The purpose of this research study was to explore the process of learning and development of problem solving skills in radiologic technologists. The researcher sought to understand the nature of difficult problems encountered in clinical practice, to identify specific learning practices leading to the development of professional expertise, and to…

  2. Introducing the Hero Complex and the Mythic Iconic Pathway of Problem Gambling

    ERIC Educational Resources Information Center

    Nixon, Gary; Solowoniuk, Jason

    2009-01-01

    Early research into the motivations behind problem gambling reflected separate paradigms of thought splitting our understanding of the gambler into divergent categories. However, over the past 25 years, problem gambling is now best understood to arise from biological, environmental, social, and psychological processes, and is now encapsulated…

  3. The Role of Prior Knowledge and Problem Contexts in Students' Explanations of Complex System

    ERIC Educational Resources Information Center

    Barth-Cohen, Lauren April

    2012-01-01

    The purpose of this dissertation is to study students' competencies in generating scientific explanations within the domain of complex systems, an interdisciplinary area in which students tend to have difficulties. While considering students' developing explanations of how complex systems work, I investigate the role of prior knowledge…

  4. Atmosphere Analyzer

    NASA Technical Reports Server (NTRS)

    1982-01-01

    California Measurements, Inc.'s model PC-2 Aerosol Particle Analyzer is produced in both airborne and ground-use versions. Originating from NASA technology, it is a quick and accurate method of detecting minute mass loadings on a quartz crystal, offering utility as a highly sensitive detector of fine particles suspended in air. When combined with a suitable air delivery system, it provides immediate information on the size distribution and mass concentrations of aerosols. William Chiang obtained a NASA license for multiple crystal oscillator technology and initially developed a particle analyzer for NASA use with Langley Research Center assistance. Later his company produced the modified PC-2 for commercial applications. Brunswick Corporation uses the device for atmospheric research and in studies of smoke particles in fires. The PC-2 is used by pharmaceutical and chemical companies in research on inhalation toxicology and environmental health. It is also useful in testing various filters for safety masks and nuclear installations.

  5. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135°C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135°C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  6. Oxygen analyzer

    DOEpatents

    Benner, William H.

    1986-01-01

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135°C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135°C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  7. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting Dielectric Sensor to determine the nutrient level and analyze plant nutrient solutions in the CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to dielectric property changes in materials, at measurement levels much lower than conventional sensors. The analyzer has exceptional capabilities for predicting the composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications like agricultural products, petrochemicals, food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.

  8. MULTICHANNEL ANALYZER

    DOEpatents

    Kelley, G.G.

    1959-11-10

    A multichannel pulse analyzer having several window amplifiers, each amplifier serving one group of channels, with a single fast pulse-lengthener and a single novel interrogation circuit serving all channels is described. A pulse followed too closely timewise by another pulse is disregarded by the interrogation circuit to prevent errors due to pulse pileup. The window amplifiers are connected to the pulse lengthener output, rather than the linear amplifier output, so need not have the fast response characteristic formerly required.

  9. Contextual approach to technology assessment: Implications for one-factor fix solutions to complex social problems

    NASA Technical Reports Server (NTRS)

    Mayo, L. H.

    1975-01-01

    The contextual approach is discussed, which undertakes to demonstrate that technology assessment assists in the identification of the full range of implications of taking a particular action and facilitates the consideration of alternative means by which the total affected social problem context might be changed by available project options. It is found that the social impacts of an application on participants, institutions, processes, and social interests, and the accompanying interactions, may not only induce modifications in the problem context delineated for examination with respect to the design, operations, regulation, and use of the posited application, but also affect related social problem contexts.

  10. Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A miniature gas chromatograph, a system which separates a gaseous mixture into its components and measures the concentration of the individual gases, was designed for the Viking Lander. The technology was further developed under National Institute for Occupational Safety and Health (NIOSH) sponsorship and Ames Research Center/Stanford funding as a toxic gas leak detection device. Three researchers on the project later formed Microsensor Technology, Inc. to commercialize the product. It is a battery-powered system consisting of a sensing wand connected to a computerized analyzer. Marketed as the Michromonitor 500, it has a wide range of applications.

  11. Metabolic analyzer

    NASA Technical Reports Server (NTRS)

    Lem, J. D.

    1977-01-01

    The metabolic analyzer was designed to support experiment M171. It operates on the so-called open circuit method to measure a subject's metabolic activity in terms of oxygen consumed, carbon dioxide produced, minute volume, respiratory exchange ratio, and tidal volume or vital capacity. The system operates in either of two modes. (1) In Mode I, inhaled respiratory volumes are actually measured by a piston spirometer. (2) In Mode II, inhaled volumes are calculated from the exhaled volume and the measured inhaled and exhaled nitrogen concentrations. This second mode was the prime mode for Skylab. Following is a brief description of the various subsystems and their operation.
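
    A compact sketch of the Mode II idea, the nitrogen-balance (Haldane) transformation; the numbers and function name are illustrative, not Skylab hardware values:

        # Nitrogen is neither consumed nor produced, so inhaled volume can be
        # inferred from exhaled volume and the two nitrogen concentrations.
        FIN2, FIO2 = 0.7904, 0.2093          # inspired N2 and O2 fractions (room air)

        def open_circuit_metabolics(VE, FEN2, FEO2, FECO2):
            """VE: exhaled minute volume (L/min); FE*: mixed expired fractions."""
            VI = VE * FEN2 / FIN2            # Haldane transformation
            vo2 = VI * FIO2 - VE * FEO2      # O2 uptake (L/min)
            vco2 = VE * FECO2 - VI * 0.0004  # CO2 output; inspired CO2 ~ 0.04%
            return VI, vo2, vco2, vco2 / vo2 # last value: respiratory exchange ratio

        print(open_circuit_metabolics(VE=8.0, FEN2=0.80, FEO2=0.165, FECO2=0.035))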

  12. Contamination Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  13. Analyzing Orientations

    NASA Astrophysics Data System (ADS)

    Ruggles, Clive L. N.

    Archaeoastronomical field survey typically involves the measurement of structural orientations (i.e., orientations along and between built structures) in relation to the visible landscape and particularly the surrounding horizon. This chapter focuses on the process of analyzing the astronomical potential of oriented structures, whether in the field or as a desktop appraisal, with the aim of establishing the archaeoastronomical "facts". It does not address questions of data selection (see instead Chap. 25, "Best Practice for Evaluating the Astronomical Significance of Archaeological Sites", 10.1007/978-1-4614-6141-8_25) or interpretation (see Chap. 24, "Nature and Analysis of Material Evidence Relevant to Archaeoastronomy", 10.1007/978-1-4614-6141-8_22). The main necessity is to determine the azimuth, horizon altitude, and declination in the direction "indicated" by any structural orientation. Normally, there are a range of possibilities, reflecting the various errors and uncertainties in estimating the intended (or, at least, the constructed) orientation, and in more formal approaches an attempt is made to assign a probability distribution extending over a spread of declinations. These probability distributions can then be cumulated in order to visualize and analyze the combined data from several orientations, so as to identify any consistent astronomical associations that can then be correlated with the declinations of particular astronomical objects or phenomena at any era in the past. The whole process raises various procedural and methodological issues and does not proceed in isolation from the consideration of corroborative data, which is essential in order to develop viable cultural interpretations.
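
    The core reduction from a measured orientation to a declination is compact; here is a sketch that ignores refraction and the other corrections a careful survey would apply:

        import math

        def indicated_declination(azimuth_deg, altitude_deg, latitude_deg):
            """Declination indicated by a structural orientation: azimuth A
            (from true north), horizon altitude h, geographic latitude phi:
                sin(delta) = sin(phi) sin(h) + cos(phi) cos(h) cos(A)
            Refraction and parallax corrections are omitted in this sketch."""
            A, h, phi = map(math.radians, (azimuth_deg, altitude_deg, latitude_deg))
            s = math.sin(phi) * math.sin(h) + math.cos(phi) * math.cos(h) * math.cos(A)
            return math.degrees(math.asin(s))

        # An orientation toward azimuth 90 deg (due east) with a flat horizon
        # yields declination ~0 deg at any latitude, as expected.
        print(indicated_declination(90.0, 0.0, 51.2))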

  14. Modern Problems. Grade 12.

    ERIC Educational Resources Information Center

    Wilmington Public Schools, DE.

    The general purpose of the twelfth grade course is to help the student assume his role as a decision-maker in a democratic society. The nature and complexity of contemporary problems are examined using this guide to enable the student: 1) to analyze alternative solutions to these problems; 2) to develop attitudes and values appropriate to a…

  15. The vulnerability of rules in complex work environments: dynamism and uncertainty pose problems for cognition.

    PubMed

    Clewley, Richard; Stupple, Edward J N

    2015-01-01

    Many complex work environments rely heavily on cognitive operators using rules. Operators sometimes fail to implement rules, with catastrophic human, social and economic costs. Rule-based error is widely reported, yet the mechanisms of rule vulnerability have received less attention. This paper examines rule vulnerability in the complex setting of airline transport operations. We examined 'the stable approach criteria rule', which acts as a system defence during the approach to land. The study experimentally tested whether system state complexity influenced rule failure. The results showed that increased uncertainty and dynamism led to an increased likelihood of rule failure. There was also an interaction effect, indicating that complexity from different sources can combine to further constrain rule-based response. We discuss the results in relation to recent aircraft accidents and suggest that 'rule-based error' could be extended to embrace rule vulnerability, fragility and failure. This better reflects the influence that system behaviour and cognitive variety have on rule-based response. Practitioner Summary: In this study, we examined mechanisms of rule vulnerability in the complex setting of airline transport operations. The results suggest work scenarios featuring high uncertainty and dynamism constrain rule-based response, leading to rules becoming vulnerable, fragile or failing completely. This has significant implications for rule-intensive, safety-critical work environments. PMID:25588754

  16. The Research of Solution to the Problems of Complex Task Scheduling Based on Self-adaptive Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhu, Li; He, Yongxiang; Xue, Haidong; Chen, Leichen

    Traditional genetic algorithms (GAs) suffer from premature convergence when applied to scheduling problems. To adapt the crossover and mutation operators during the run, this paper proposes a self-adaptive GA aimed at multitask scheduling optimization under limited resources. The experimental results show that the proposed algorithm outperforms the traditional GA in evolutionary ability on complex task scheduling optimization.
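
    For context, one widely used self-adaptation scheme (in the style of Srinivas and Patnaik) ties the crossover and mutation probabilities to how close an individual sits to the best fitness in the population; the paper's exact operators may differ. A hedged Python sketch:

        def adaptive_rates(f, f_avg, f_max, pc_max=0.9, pm_max=0.1):
            """Adaptive crossover/mutation probabilities for a maximization GA.
            Above-average individuals are perturbed gently to preserve good
            building blocks; below-average ones keep the maximum rates so the
            search can escape premature convergence."""
            if f_max == f_avg:            # degenerate: population has converged
                return pc_max, pm_max
            if f >= f_avg:
                scale = (f_max - f) / (f_max - f_avg)
                return pc_max * scale, pm_max * scale
            return pc_max, pm_max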

  17. Solving hard computational problems efficiently: asymptotic parametric complexity 3-coloring algorithm.

    PubMed

    Martín H, José Antonio

    2013-01-01

    Many practical problems in almost all scientific and technological disciplines have been classified as computationally hard (NP-hard or even NP-complete). In life sciences, combinatorial optimization problems frequently arise in molecular biology, e.g., genome sequencing; global alignment of multiple genomes; identifying siblings or discovery of dysregulated pathways. In almost all of these problems, there is the need for proving a hypothesis about a certain property of an object that can be present if and only if it adopts some particular admissible structure (an NP-certificate) or be absent (no admissible structure); however, none of the standard approaches can discard the hypothesis when no solution can be found, since none can provide a proof that there is no admissible structure. This article presents an algorithm that introduces a novel type of solution method to "efficiently" solve the graph 3-coloring problem, an NP-complete problem. The proposed method provides certificates (proofs) in both cases, present or absent, so it is possible to accept or reject the hypothesis on the basis of a rigorous proof. It provides exact solutions and is polynomial-time (i.e., efficient), albeit parametric. The only requirement is sufficient computational power, which is controlled by the parameter α ∈ ℕ. Nevertheless, here it is proved that the probability of requiring a value of α > k to obtain a solution for a random graph decreases exponentially: P(α > k) ≤ 2^(-(k+1)), making almost all problem instances tractable. Thorough experimental analyses were performed. The algorithm was tested on random graphs, planar graphs and 4-regular planar graphs. The obtained experimental results are in accordance with the theoretical expected results. PMID:23349711
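
    To make the quoted tail bound concrete, the chance that a random instance requires, say, α > 20 is already

        P(\alpha > 20) \;\le\; 2^{-(20+1)} \;\approx\; 4.8 \times 10^{-7},

    which is why almost all instances become tractable at small parameter values.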

  18. Optical analyzer

    DOEpatents

    Hansen, A.D.

    1987-09-28

    An optical analyzer wherein a sample of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter is placed in a combustion tube, and light from a light source is passed through the sample. The temperature of the sample is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample is detected as the temperature is raised. A data processor, differentiator and a two pen recorder provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample. These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample. Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters. 7 figs.

  19. Speech analyzer

    NASA Technical Reports Server (NTRS)

    Lokerson, D. C. (Inventor)

    1977-01-01

    A speech signal is analyzed by applying the signal to formant filters which derive first, second and third signals respectively representing the frequency of the speech waveform in the first, second and third formants. A first pulse train having approximately a pulse rate representing the average frequency of the first formant is derived; second and third pulse trains having pulse rates respectively representing zero crossings of the second and third formants are derived. The first formant pulse train is derived by establishing N signal level bands, where N is an integer at least equal to two. Adjacent ones of the signal bands have common boundaries, each of which is a predetermined percentage of the peak level of a complete cycle of the speech waveform.
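
    The second- and third-formant pulse trains amount to zero-crossing detection on band-passed signals; the patent implements this in analog hardware, but the idea can be sketched on sampled data in Python (illustrative only):

        def zero_crossing_pulses(samples):
            """Emit 1 wherever consecutive samples change sign."""
            return [1 if s1 * s2 < 0 else 0
                    for s1, s2 in zip(samples, samples[1:])]

        # A crude "formant" oscillation:
        print(zero_crossing_pulses([0.4, 0.1, -0.3, -0.2, 0.5, 0.7, -0.1]))
        # -> [0, 1, 0, 1, 0, 1]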

  20. Extension of the tridiagonal reduction (FEER) method for complex eigenvalue problems in NASTRAN

    NASA Technical Reports Server (NTRS)

    Newman, M.; Mann, F. I.

    1978-01-01

    As in the case of real eigenvalue analysis, the eigensolutions closest to a selected point in the eigenspectrum were extracted from a reduced, symmetric, tridiagonal eigenmatrix whose order was much lower than that of the full size problem. The reduction process was effected automatically, and thus avoided the arbitrary lumping of masses and other physical quantities at selected grid points. The statement of the algebraic eigenvalue problem admitted mass, damping, and stiffness matrices which were unrestricted in character, i.e., they might be real, symmetric or nonsymmetric, singular or nonsingular.
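
    The unrestricted problem admitted here is the quadratic eigenproblem (λ²M + λB + K)x = 0 with mass M, damping B, and stiffness K. For orientation (this is the standard companion linearization, shown for context rather than NASTRAN's internal FEER formulation), it is equivalent to a first-order generalized eigenproblem of twice the size:

        \begin{bmatrix} 0 & I \\ -K & -B \end{bmatrix}
        \begin{bmatrix} x \\ \lambda x \end{bmatrix}
        \;=\; \lambda
        \begin{bmatrix} I & 0 \\ 0 & M \end{bmatrix}
        \begin{bmatrix} x \\ \lambda x \end{bmatrix},

    after which a reduction such as FEER extracts the eigensolutions nearest a selected point in the eigenspectrum.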

  1. Undergraduate Student Task Group Approach to Complex Problem Solving Employing Computer Programming.

    ERIC Educational Resources Information Center

    Brooks, LeRoy D.

    A project formulated a computer simulation game for use as an instructional device to improve financial decision making. The author constructed a hypothetical firm, specifying its environment, variables, and a maximization problem. Students, assisted by a professor and computer consultants and having access to B5500 and B6700 facilities, held 16…

  2. Convergent Validity of the Aberrant Behavior Checklist and Behavior Problems Inventory with People with Complex Needs

    ERIC Educational Resources Information Center

    Hill, Jennie; Powlitch, Stephanie; Furniss, Frederick

    2008-01-01

    The current study aimed to replicate and extend Rojahn et al. [Rojahn, J., Aman, M. G., Matson, J. L., & Mayville, E. (2003). "The aberrant behavior checklist and the behavior problems inventory: Convergent and divergent validity." "Research in Developmental Disabilities", 24, 391-404] by examining the convergent validity of the behavior problems…

  3. The Complex Relationship between Students' Critical Thinking and Epistemological Beliefs in the Context of Problem Solving

    ERIC Educational Resources Information Center

    Hyytinen, Heidi; Holma, Katariina; Toom, Auli; Shavelson, Richard J.; Lindblom-Ylänne, Sari

    2014-01-01

    The study utilized a multi-method approach to explore the connection between critical thinking and epistemological beliefs in a specific problem-solving situation. Data drawn from a sample of ten third-year bioscience students were collected using a combination of a cognitive lab and a performance task from the Collegiate Learning Assessment…

  4. New ecology education: Preparing students for the complex human-environmental problems of dryland East Asia

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Present-day environmental problems of Dryland East Asia are serious, and future prospects look especially disconcerting owing to current trends in population growth and economic development. Land degradation and desertification, invasive species, biodiversity losses, toxic waste and air pollution, a...

  5. The Species Problem and the Value of Teaching and the Complexities of Species

    ERIC Educational Resources Information Center

    Chung, Carl

    2004-01-01

    Discussions of species taxa refer directly to a range of complex biological phenomena. Given these phenomena, biologists have developed, and continue to appeal to, a series of species concepts rather than a single clear definition, as each species concept tells part of the story or helps biologists explain and understand a subset of…

  6. Foucault as Complexity Theorist: Overcoming the Problems of Classical Philosophical Analysis

    ERIC Educational Resources Information Center

    Olssen, Mark

    2008-01-01

    This article explores the affinities and parallels between Foucault's Nietzschean view of history and models of complexity developed in the physical sciences in the twentieth century. It claims that Foucault's rejection of structuralism and Marxism can be explained as a consequence of his own approach which posits a radical ontology whereby the…

  7. ARC syndrome with complex renal problems: nephrocalcinosis, proximal and hyperkalemic distal RTA and nephrogenic diabetes insipidus.

    PubMed

    Malaki, Majid; Mandana, Rafeei; Ghaffari, Shamsi

    2012-07-01

    We present a female neonate with arthrogryposis, renal tubular abnormalities and cholestasis syndrome and complex renal structural and functional abnormalities that include medullary nephrocalcinosis, hydronephrosis, nephrogenic diabetes insipidus, Fanconi syndrome, proximal and distal hyperkalemic renal tubular acidosis, near-nephrotic range proteinuria, hypercalciuria and severe hypovitaminosis D. PMID:22805396

  8. ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM

    EPA Science Inventory

    ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM. Thomas J. Hughes, QA and Records Manager, Experimental Toxicology Division (ETD), National Health and Environmental Effects Research Laboratory (NHEERL), ORD, U.S. EPA, RTP, NC 27709

    ETD is the largest health divis...

  9. Analogize This! The Politics of Scale and the Problem of Substance in Complexity-Based Composition

    ERIC Educational Resources Information Center

    Roderick, Noah R.

    2012-01-01

    In light of recent enthusiasm in composition studies (and in the social sciences more broadly) for complexity theory and ecology, this article revisits the debate over how much composition studies can or should align itself with the natural sciences. For many in the discipline, the science debate--which was ignited in the 1970s, both by the…

  10. INDUCTION OF MUTATION SPECTRA BY COMPLEX MIXTURES: APPROACHES, PROBLEMS, AND POSSIBILITIES

    EPA Science Inventory

    More complex environmental mixtures have been evaluated for mutagenic activity at the hisD3052 allele of Salmonella, primarily in strain TA98, than in any other mutation assay. Using colony probe hybridization to detect a common hotspot deletion, followed by PCR and DNA sequencin...

  11. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-02-07

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  12. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-01-01

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  13. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    PubMed

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

    Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids together with a battery of related non-polar and low-molecular-mass compounds may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems, owing to the differing compositions of the analytical matrix and interfering compounds, and therefore proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that were recently applied for quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on the experimental papers published within the last two years, in which a significant increase in hopanoids research was noticed. The second aim of this review is to describe the latest research trends concerning determination of hopanoids and related low-molecular-mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach for complex data analysis. Data interpretation involves

  14. Dynamic Modeling as a Cognitive Regulation Scaffold for Developing Complex Problem-Solving Skills in an Educational Massively Multiplayer Online Game Environment

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ge, Xun; Ifenthaler, Dirk; Law, Victor

    2011-01-01

    Following a design-based research framework, this article reports two empirical studies with an educational MMOG, called "McLarin's Adventures," on facilitating 9th-grade students' complex problem-solving skill acquisition in interdisciplinary STEM education. The article discusses the nature of complex and ill-structured problem solving and,…

  15. ABSORPTION ANALYZER

    DOEpatents

    Brooksbank, W.A. Jr.; Leddicotte, G.W.; Strain, J.E.; Hendon, H.H. Jr.

    1961-11-14

    A means was developed for continuously computing and indicating the isotopic assay of a process solution and for automatically controlling the process output of isotope separation equipment to provide a continuous output of the desired isotopic ratio. A counter tube is surrounded with a sample to be analyzed so that the tube is exactly in the center of the sample. A source of fast neutrons is provided and is spaced from the sample. The neutrons from the source are thermalized by causing them to pass through a neutron moderator, and the neutrons are allowed to diffuse radially through the sample to actuate the counter. A reference counter in a known sample of pure solvent is also actuated by the thermal neutrons from the neutron source. The number of neutrons which actuate the detectors is a function of the concentration of the elements in solution and their neutron absorption cross sections. The pulses produced by the detectors responsive to each neutron passing therethrough are amplified and counted. The respective times required to accumulate a selected number of counts are measured by associated timing devices. The concentration of a particular element in solution may be determined by utilizing the following relation: T2/T1 = BCR, where B is a constant proportional to the absorption cross sections, T2 is the time of count collection for the unknown solution, T1 is the time of count collection for the pure solvent, R is the isotopic ratio, and C is the molar concentration of the element to be determined. Knowing the slope constant B for any element, when the chemical concentration is known, the isotopic concentration may be readily determined, and conversely, when the isotopic ratio is known, the chemical concentration may be determined. (AEC)
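
    Rearranging the stated relation T2/T1 = BCR gives either unknown directly once the slope constant B has been calibrated. A minimal Python sketch (names illustrative):

        def molar_concentration(t2, t1, b, r):
            """C from T2/T1 = B*C*R, given the isotopic ratio R."""
            return (t2 / t1) / (b * r)

        def isotopic_ratio(t2, t1, b, c):
            """R from the same relation when the concentration C is known."""
            return (t2 / t1) / (b * c)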

  16. Simulations for Complex Fluid Flow Problems from Berkeley Lab's Center for Computational Sciences and Engineering (CCSE)

    DOE Data Explorer

    The Center for Computational Sciences and Engineering (CCSE) develops and applies advanced computational methodologies to solve large-scale scientific and engineering problems arising in the Department of Energy (DOE) mission areas involving energy, environmental, and industrial technology. The primary focus is in the application of structured-grid finite difference methods on adaptive grid hierarchies for compressible, incompressible, and low Mach number flows. The diverse range of scientific applications that drive the research typically involve a large range of spatial and temporal scales (e.g. turbulent reacting flows) and require the use of extremely large computing hardware, such as the 153,000-core computer, Hopper, at NERSC. The CCSE approach to these problems centers on the development and application of advanced algorithms that exploit known separations in scale; for many of the application areas this results in algorithms that are several orders of magnitude more efficient than traditional simulation approaches.

  17. Studying PubMed usages in the field for complex problem solving: Implications for tool design.

    PubMed

    Mirel, Barbara; Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2013-05-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists' behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists' problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  18. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  19. Increased risk of preterm delivery in areas with cancer mortality problems from petrochemical complexes.

    PubMed

    Yang, Chun-Yuh; Chiu, Hui-Fen; Tsai, Shang-Shyue; Chang, Chih-Ching; Chuang, Hung-Yi

    2002-07-01

    The petrochemical and petroleum industries are the main sources of industrial air pollution in Taiwan. Data in this study concern outdoor air pollution and the health of individuals living in communities in close proximity to petrochemical industrial complexes. The prevalence of delivery of preterm birth infants was significantly higher in mothers living near petrochemical industrial complexes than in control mothers living elsewhere in Taiwan. After controlling for several possible confounders (including maternal age, season, marital status, maternal education, and infant sex), the adjusted odds ratio was 1.18 (95% CI=1.04-1.34) for delivery of preterm infants in the petrochemically polluted region. The data provide further support for the hypothesis that air pollution can affect the outcome of pregnancy. PMID:12176003

  20. Criteria for assessing problem solving and decision making in complex environments

    NASA Technical Reports Server (NTRS)

    Orasanu, Judith

    1993-01-01

    Training crews to cope with unanticipated problems in high-risk, high-stress environments requires models of effective problem solving and decision making. Existing decision theories use the criteria of logical consistency and mathematical optimality to evaluate decision quality. While these approaches are useful under some circumstances, the assumptions underlying these models frequently are not met in dynamic time-pressured operational environments. Also, applying formal decision models is both labor and time intensive, a luxury often lacking in operational environments. Alternate approaches and criteria are needed. Given that operational problem solving and decision making are embedded in ongoing tasks, evaluation criteria must address the relation between those activities and satisfaction of broader task goals. Effectiveness and efficiency become relevant for judging reasoning performance in operational environments. New questions must be addressed: What is the relation between the quality of decisions and overall performance by crews engaged in critical high risk tasks? Are different strategies most effective for different types of decisions? How can various decision types be characterized? A preliminary model of decision types found in air transport environments will be described along with a preliminary performance model based on an analysis of 30 flight crews. The performance analysis examined behaviors that distinguish more and less effective crews (based on performance errors). Implications for training and system design will be discussed.

  1. Outcomes, moderators, and mediators of empathic-emotion recognition training for complex conduct problems in childhood.

    PubMed

    Dadds, Mark Richard; Cauchi, Avril Jessica; Wimalaweera, Subodha; Hawes, David John; Brennan, John

    2012-10-30

    Impairments in emotion recognition skills are a trans-diagnostic indicator of early mental health problems and may be responsive to intervention. We report on a randomized controlled trial of "Emotion-recognition-training" (ERT) versus treatment-as-usual (TAU) with N=195 mixed diagnostic children (mean age 10.52 years) referred for behavioral/emotional problems measured at pre- and 6 months post-treatment. We tested overall outcomes plus moderation and mediation models, whereby diagnostic profile was tested as a moderator of change. ERT had no impact on the group as a whole. Diagnostic status of the child did not moderate outcomes; however, levels of callous-unemotional (CU) traits moderated outcomes such that children with high CU traits responded less well to TAU, while ERT produced significant improvements in affective empathy and conduct problems in these children. Emotion recognition training has potential as an adjunctive intervention specifically for clinically referred children with high CU traits, regardless of their diagnostic status. PMID:22703720

  2. Class II malocclusion with complex problems treated with a novel combination of lingual orthodontic appliances and lingual arches.

    PubMed

    Yanagita, Takeshi; Nakamura, Masahiro; Kawanabe, Noriaki; Yamashiro, Takashi

    2014-07-01

    This case report describes a novel method of combining lingual appliances and lingual arches to control horizontal problems. The patient, who was 25 years of age at her first visit to our hospital with a chief complaint of crooked anterior teeth, was diagnosed with skeletal Class II and Angle Class II malocclusion with anterior deep bite, lateral open bite, premolar crossbite, and severe crowding in both arches. She was treated with premolar extractions and temporary anchorage devices. Conventionally, it is ideal to use labial brackets simultaneously with appliances, such as a lingual arch, a quad-helix, or a rapid expansion appliance, in patients with complex problems requiring horizontal, anteroposterior, and vertical control; however, this patient strongly requested orthodontic treatment with lingual appliances. A limitation of lingual appliances is that they cannot be used with other conventional appliances. In this report, we present the successful orthodontic treatment of a complex problem using modified lingual appliances that enabled combined use of a conventional lingual arch. PMID:24975004

  3. Infinite-range exterior complex scaling as a perfect absorber in time-dependent problems

    SciTech Connect

    Scrinzi, Armin

    2010-05-15

    We introduce infinite range exterior complex scaling (irECS) which provides for complete absorption of outgoing flux in numerical solutions of the time-dependent Schroedinger equation with strong infrared fields. This is demonstrated by computing high harmonic spectra and wave-function overlaps with the exact solution for a one-dimensional model system and by three-dimensional calculations for the H atom and an Ne atom model. We lay out the key ingredients for correct implementation and identify criteria for efficient discretization.
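
    For orientation, exterior complex scaling leaves the coordinate unchanged inside a radius R0 and rotates it into the complex plane outside, which turns outgoing waves into exponentially damped ones; the infinite-range variant applies this on an unbounded exterior region with a dedicated basis. The standard transformation (context only; see the paper for the irECS discretization) is

        r \;\mapsto\;
        \begin{cases}
          r, & r \le R_0, \\
          R_0 + e^{i\theta}\,(r - R_0), & r > R_0,
        \end{cases}
        \qquad 0 < \theta < \pi/2 .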

  4. Efficient Three-Dimensional Direct Simulation Monte Carlo for Complex Geometry Problems

    NASA Technical Reports Server (NTRS)

    Rault, Didier F. G.

    1993-01-01

    The simulation of flowfields in the transition flow regime is notoriously difficult with high demands on computer resources (CPU time and storage) and user expertise/labor. This paper describes a new, efficient code which has been developed to simulate high Knudsen number flowfields in three dimensions about bodies of arbitrarily complex geometry. The algorithm has been tested over a wide range of conditions, from free molecular to near-continuum flow regimes, for slender and blunt bodies, for re-entry vehicles and spacecraft. A series of validation tests have been conducted using both wind-tunnel measurements and flight data.

  5. An unstructured-grid software system for solving complex aerodynamic problems

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Pirzadeh, Shahyar; Parikh, Paresh

    1995-01-01

    A coordinated effort has been underway over the past four years to elevate unstructured-grid methodology to a mature level. The goal of this endeavor is to provide a validated capability to non-expert users for performing rapid aerodynamic analysis and design of complex configurations. The Euler component of the system is well developed, and is impacting a broad spectrum of engineering needs with capabilities such as rapid grid generation and inviscid flow analysis, inverse design, interactive boundary layers, and propulsion effects. Progress is also being made in the more tenuous Navier-Stokes component of the system. A robust grid generator is under development for constructing quality thin-layer tetrahedral grids, along with a companion Navier-Stokes flow solver. This paper presents an overview of this effort, along with a perspective on the present and future status of the methodology.

  6. A HLL-Rankine-Hugoniot Riemann solver for complex non-linear hyperbolic problems

    NASA Astrophysics Data System (ADS)

    Guy, Capdeville

    2013-10-01

    We present a new HLL-type approximate Riemann solver that aims at capturing any isolated discontinuity without necessitating extensive characteristic analysis of the governing partial differential equations. This property is especially attractive for complex hyperbolic systems with more than two equations. Following Linde's (2002) approach [6], we introduce a generic middle wave into the classical two-state HLL solver. The property of this third wave is typified by way of a "strength indicator" that is derived from polynomial considerations. The polynomial that constitutes the basis of the procedure is made non-oscillatory by an adapted fourth-order WENO algorithm (CWENO4). This algorithm makes it possible to derive an expression for the strength indicator. According to the size of this parameter, the resulting solver (HLL-RH) either computes the multi-dimensional Rankine-Hugoniot equations, if an isolated discontinuity appears in the Riemann fan, or asymptotically tends towards the two-state HLL solver, if the solution is locally smooth. The asymptotic version of the HLL-RH solver is demonstrated to be positively conservative and entropy-satisfying in its first-order multi-dimensional form, provided that a relevant and not too restrictive CFL condition is considered; specific limitations of the conservative increments of the numerical solution and a suitable entropy condition make it possible to maintain these properties in its high-order version. With a monotonicity-preserving algorithm for the time integration, the numerical method so generated is third-order in time and fourth-order accurate in space for the smooth part of the solution; moreover, the scheme is stable and accurate when capturing a shock wave, whatever the complexity of the underlying differential system. Extensive numerical tests for the one- and two-dimensional Euler equations of gas dynamics and comparisons with classical Godunov-type methods help to point out the potentialities and insufficiencies
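
    For reference, the two-state HLL flux that the HLL-RH solver falls back to in smooth regions is, with S_L and S_R the fastest left- and right-going signal speeds,

        F^{\mathrm{HLL}} =
        \begin{cases}
          F_L, & 0 \le S_L, \\
          \dfrac{S_R F_L - S_L F_R + S_L S_R\,(U_R - U_L)}{S_R - S_L}, & S_L < 0 < S_R, \\
          F_R, & S_R \le 0,
        \end{cases}

    and the middle (Rankine-Hugoniot) wave is switched in only where the strength indicator flags an isolated discontinuity.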

  7. ATSDR evaluation of health effects of chemicals. IV. Polycyclic aromatic hydrocarbons (PAHs): understanding a complex problem.

    PubMed

    Mumtaz, M M; George, J D; Gold, K W; Cibulas, W; DeRosa, C T

    1996-01-01

    Polycyclic Aromatic Hydrocarbons (PAHs) are a group of chemicals that are formed during the incomplete burning of coal, oil, gas, wood, garbage, or other organic substances, such as tobacco and charbroiled meat. There are more than 100 PAHs. PAHs generally occur as complex mixtures (for example, as part of products such as soot), not as single compounds. PAHs are found throughout the environment in the air, water, and soil. As part of its mandate, the Agency for Toxic Substances and Disease Registry (ATSDR) prepares toxicological profiles on hazardous chemicals, including PAHs (ATSDR, 1995), found at facilities on the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) National Priorities List (NPL) and which pose the most significant potential threat to human health, as determined by ATSDR and the Environmental Protection Agency (EPA). These profiles include information on health effects of chemicals from different routes and durations of exposure, their potential for exposure, regulations and advisories, and the adequacy of the existing database. Assessing the health effects of PAHs is a major challenge because environmental exposures to these chemicals are usually to complex mixtures of PAHs with other chemicals. The biological consequences of human exposure to mixtures of PAHs depend on the toxicity, carcinogenic and noncarcinogenic, of the individual components of the mixture, the types of interactions among them, and confounding factors that are not thoroughly understood. Also identified are components of exposure and health effects research needed on PAHs that will allow estimation of realistic human health risks posed by exposures to PAHs. The exposure assessment component of research should focus on (1) development of reliable analytical methods for the determination of bioavailable PAHs following ingestion, (2) estimation of bioavailable PAHs from environmental media, particularly the determination of particle-bound PAHs, (3

  8. On the Parameterized Complexity of Some Optimization Problems Related to Multiple-Interval Graphs

    NASA Astrophysics Data System (ADS)

    Jiang, Minghui

    We show that for any constant t ≥ 2, k-Independent Set and k-Dominating Set in t-track interval graphs are W[1]-hard. This settles an open question recently raised by Fellows, Hermelin, Rosamond, and Vialette. We also give an FPT algorithm for k-Clique in t-interval graphs, parameterized by both k and t, with running time max{t^O(k), 2^O(k log k)} · poly(n), where n is the number of vertices in the graph. This slightly improves the previous FPT algorithm by Fellows, Hermelin, Rosamond, and Vialette. Finally, we use the W[1]-hardness of k-Independent Set in t-track interval graphs to obtain the first parameterized intractability result for a recent bioinformatics problem called Maximal Strip Recovery (MSR). We show that MSR-d is W[1]-hard for any constant d ≥ 4 when the parameter is either the total length of the strips, or the total number of adjacencies in the strips, or the number of strips in the optimal solution.

  9. Application of a low order panel method to complex three-dimensional internal flow problems

    NASA Technical Reports Server (NTRS)

    Ashby, D. L.; Sandlin, D. R.

    1986-01-01

    An evaluation of the ability of a low order panel method to predict complex three-dimensional internal flow fields was made. The computer code VSAERO was used as a basis for the evaluation. Guidelines for modeling internal flow geometries were determined, and the effects of varying the boundary conditions and of using numerical approximations on the solution's accuracy were studied. Several test cases were run and the results were compared with theoretical or experimental results. Modeling an internal flow geometry as a closed box with normal velocities specified on an inlet and exit face provided accurate results and gave the user control over the boundary conditions. The values of the boundary conditions greatly influenced the amount of leakage an internal flow geometry suffered and could be adjusted to eliminate leakage. The use of the far-field approximation to reduce computation time influenced the accuracy of a solution and was coupled with the values of the boundary conditions needed to eliminate leakage. The error induced in the influence coefficients by using the far-field approximation was found to be dependent on the type of influence coefficient, the far-field radius, and the aspect ratio of the panels.

  10. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  11. Combination Therapies for Lysosomal Storage Diseases: A Complex Answer to a Simple Problem.

    PubMed

    Macauley, Shannon L

    2016-06-01

    Lysosomal storage diseases (LSDs) are a group of 40-50 rare monogenic disorders that result in disrupted lysosomal function and subsequent lysosomal pathology. Depending on the protein or enzyme deficiency associated with each disease, LSDs affect an array of organ systems and elicit a complex set of secondary disease mechanisms that make many of these disorders difficult to fully treat. The etiology of most LSDs is known and the innate biology of lysosomal enzymes favors therapeutic intervention, yet most attempts at treating LSDs with enzyme replacement strategies fall short of being curative. Even with the advent of more sophisticated approaches, like substrate reduction therapy, pharmacologic chaperones, gene therapy or stem cell therapy, comprehensive treatments for LSDs have yet to be achieved. Given the limitations of individual therapies, recent research has focused on using a combination approach to treat LSDs. By coupling protein-, cell-, and gene-based therapies with small molecule drugs, researchers have found greater success in eradicating the clinical features of disease. This review seeks to discuss the positives and negatives of the singular therapies used to treat LSDs, and to discuss how, in combination, studies have demonstrated a more holistic benefit on pathological and functional parameters. By optimizing routes of delivery, therapeutic timing, and targeting secondary disease mechanisms, combination therapy represents the future for LSD treatment. PMID:27491211

  12. Analyzing Bilingual Education Costs.

    ERIC Educational Resources Information Center

    Bernal, Joe J.

    This paper examines the particular problems involved in analyzing the costs of bilingual education and suggests that cost analysis of bilingual education requires a fundamentally different approach than that followed in other recent school finance studies. Focus of the discussion is the Intercultural Development Research Association's (IDRA)…

  13. Social and ethical dimension of the natural sciences, complex problems of the age, interdisciplinarity, and the contribution of education

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2008-09-01

    In view of the complex problems of this age, the question of the socio-ethical dimension of science acquires particular importance. We approach this matter from a philosophical and sociological standpoint, looking at such focal concerns as the motivation, purposes and methods of scientific activity, the ambivalence of scientific research and the concomitant risks, and the conflict between research freedom and external socio-political intervention. We then point out the impediments to the effectiveness of cross-disciplinary or broader meetings for addressing these complex problems and managing the associated risks, given the difficulty in communication between experts in different fields and non-experts, difficulties that education is challenged to help resolve. We find that the social necessity of informed decision-making on the basis of cross-disciplinary collaboration is reflected in the newer curricula, such as that of Greece, in aims like the acquisition of cross-subject knowledge and skills, and the ability to make decisions on controversial issues involving value conflicts. The interest and the reflections of the science education community in these matters increase its—traditionally limited—contribution to the theoretical debate on education and, by extension, the value of science education in the education system.

  14. BioDMET: a physiologically based pharmacokinetic simulation tool for assessing proposed solutions to complex biological problems.

    PubMed

    Graf, John F; Scholz, Bernhard J; Zavodszky, Maria I

    2012-02-01

    We developed a detailed, whole-body physiologically based pharmacokinetic (PBPK) modeling tool for calculating the distribution of pharmaceutical agents in the various tissues and organs of a human or animal as a function of time. Ordinary differential equations (ODEs) represent the circulation of body fluids through organs and tissues at the macroscopic level, and the biological transport mechanisms and biotransformations within cells and their organelles at the molecular scale. Each major organ in the body is modeled as composed of one or more tissues. Tissues are made up of cells and fluid spaces. The model accounts for the circulation of arterial and venous blood as well as lymph. Since its development was fueled by the need to accurately predict the pharmacokinetic properties of imaging agents, BioDMET is more complex than most PBPK models. The anatomical details of the model are important for the imaging simulation endpoints. Model complexity has also been crucial for quickly adapting the tool to different problems without the need to generate a new model for every problem. When simpler models are preferred, the non-critical compartments can be dynamically collapsed to reduce unnecessary complexity. BioDMET has been used for imaging feasibility calculations in oncology, neurology, cardiology, and diabetes. For this purpose, the time-concentration data generated by the model are input into a physics-based image simulator to establish imageability criteria. These are then used to define agent and physiology property ranges required for successful imaging. BioDMET has lately been adapted to aid the development of antimicrobial therapeutics. Given a range of built-in features and its inherent flexibility for customization, the model can be used to study a variety of pharmacokinetic and pharmacodynamic problems such as the effects of inter-individual differences and disease states on drug pharmacokinetics and pharmacodynamics, dosing optimization, and inter
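
    BioDMET couples many organs across scales; as a much-reduced illustration of the ODE machinery such PBPK models rest on, here is a toy perfusion-limited blood/tissue pair in Python (all names and parameter values are illustrative, not taken from BioDMET):

        from scipy.integrate import solve_ivp

        def pbpk_rhs(t, y, Q=5.0, Vb=5.0, Vt=30.0, Kp=2.0, CL=1.0):
            """One blood and one tissue compartment, perfusion-limited.
            Q: tissue blood flow [L/h]; Vb, Vt: volumes [L];
            Kp: tissue/blood partition coefficient; CL: clearance [L/h]."""
            Cb, Ct = y
            dCb = (Q * (Ct / Kp - Cb) - CL * Cb) / Vb
            dCt = Q * (Cb - Ct / Kp) / Vt
            return [dCb, dCt]

        # Concentration-time curves for 24 h after a 10 mg/L bolus in blood
        sol = solve_ivp(pbpk_rhs, (0.0, 24.0), [10.0, 0.0], dense_output=True)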

  15. Improving and validating 3D models for the leaf energy balance in canopy-scale problems with complex geometry

    NASA Astrophysics Data System (ADS)

    Bailey, B.; Stoll, R., II; Miller, N. E.; Pardyjak, E.; Mahaffee, W.

    2014-12-01

    Plants cover the majority of Earth's land surface, and thus play a critical role in the surface energy balance. Within individual plant communities, the leaf energy balance is a fundamental component of most biophysical processes. Absorbed radiation drives the energy balance and provides the means by which plants produce food. Available energy is partitioned into sensible and latent heat fluxes to determine surface temperature, which strongly influences rates of metabolic activity and growth. The energy balance of an individual leaf is coupled with other leaves in the community through longwave radiation emission and advection through the air. This complex coupling can make scaling models from leaves to whole-canopies difficult, specifically in canopies with complex, heterogeneous geometries. We present a new three-dimensional canopy model that simultaneously resolves sub-tree to whole-canopy scales. The model provides spatially explicit predictions of net radiation exchange, boundary-layer and stomatal conductances, evapotranspiration rates, and ultimately leaf surface temperature. The radiation model includes complex physics such as anisotropic emission and scattering. Radiation calculations are accelerated by leveraging graphics processing unit (GPU) technology, which allows canopy-scale problems to be performed on a standard desktop workstation. Since validating the three-dimensional distribution of leaf temperature can be extremely challenging, we used several independent measurement techniques to quantify errors in measured and modeled values. When compared with measured leaf temperatures, the model gave a mean error of about 2°C, which was close to the estimated measurement uncertainty.
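
    At its core, each leaf's temperature is the root of an energy-balance residual: absorbed radiation minus sensible and latent heat loss. A single-leaf Python sketch (parameter values illustrative; the paper's model adds anisotropic radiation exchange and full canopy coupling):

        import math
        from scipy.optimize import brentq

        def esat(T):
            """Saturation vapor pressure [kPa] at T [deg C] (Tetens formula)."""
            return 0.6108 * math.exp(17.27 * T / (T + 237.3))

        def residual(T_leaf, T_air=25.0, R_abs=400.0, gH=0.1, gv=0.15,
                     ea=1.5, P=101.3):
            """R_abs - H - LE; longwave emission folded into R_abs for brevity.
            gH, gv: heat/vapor conductances [mol m-2 s-1]; P in kPa."""
            cp, lam = 29.3, 44000.0                       # J mol-1 K-1, J mol-1
            H = cp * gH * (T_leaf - T_air)                # sensible heat [W m-2]
            LE = lam * gv * (esat(T_leaf) - ea) / P       # latent heat [W m-2]
            return R_abs - H - LE

        T_leaf = brentq(residual, 0.0, 60.0)   # leaf temperature [deg C]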

  16. Eddy covariance measurements in complex terrain with a new fast response, closed-path analyzer: spectral characteristics and cross-system comparisons

    EPA Science Inventory

    In recent years, a new class of enclosed, closed-path gas analyzers suitable for eddy covariance applications has come to market, designed to combine the advantages of traditional closed-path systems (small density corrections, good performance in poor weather) and open-path syst...

  17. Main problems and perspectives of the synthesis of nanocomposite coatings on the surface of complex-shaped components

    NASA Astrophysics Data System (ADS)

    Brzhozovskii, B.; Martynov, V.; Zinina, E.; Brovkova, M.

    2016-02-01

    The main methods for composite coating synthesis in the surface layer of complex-shaped workpiece with dissipative properties have been investigated in this study. The problem of their application can be solved by using low-temperature plasma of combined discharge. The plasma is formed directly around the treated surface, which allows the use of the maximum amount of energy transmitted by its ions for the synthesis process itself. The main mechanism is the synthesis of three-body recombination, in which surface structures are heated to temperatures that change the phase of their constituent atoms. As a result, the original structure of the surface layer of the workpiece is transformed into a composite structure consisting of nanoclusters of ∼ 50 ... 100 nm in size and an amorphous layer of particles which acts as binder, and gradient transition from the coating to the base material filling the microvoids.

  18. Complex modeling: a strategy and software program for combining multiple information sources to solve ill posed structure and nanostructure inverse problems.

    PubMed

    Juhás, Pavol; Farrow, Christopher L; Yang, Xiaohao; Knox, Kevin R; Billinge, Simon J L

    2015-11-01

    A strategy is described for regularizing ill posed structure and nanostructure scattering inverse problems (i.e. structure solution) from complex material structures. This paper describes both the philosophy and strategy of the approach, and a software implementation, DiffPy Complex Modeling Infrastructure (DiffPy-CMI). PMID:26522405

  19. The Cauchy Problem in Local Spaces for the Complex Ginzburg-Landau Equation. II. Contraction Methods

    NASA Astrophysics Data System (ADS)

    Ginibre, J.; Velo, G.

    We continue the study of the initial value problem for the complex Ginzburg-Landau equation (with a > 0, b > 0, g ≥ 0) initiated in a previous paper [I]. We treat the case where the initial data and the solutions belong to local uniform spaces, more precisely to spaces of functions satisfying local regularity conditions and uniform bounds in local norms, but no decay conditions (or arbitrarily weak decay conditions) at infinity. In [I] we used compactness methods and an extended version of recent local estimates [3] and proved in particular the existence of solutions globally defined in time with local regularity of the initial data corresponding to the spaces L^r for r ≥ 2 or H^1. Here we treat the same problem by contraction methods. This allows us in particular to prove that the solutions obtained in [I] are unique under suitable subcriticality conditions, and to obtain for them additional regularity properties and uniform bounds. The method extends some of those previously applied to the nonlinear heat equation in global spaces to the framework of local uniform spaces.

  20. A system for measuring complex dielectric properties of thin films at submillimeter wavelengths using an open hemispherical cavity and a vector network analyzer

    NASA Astrophysics Data System (ADS)

    Rahman, Rezwanur; Taylor, P. C.; Scales, John A.

    2013-08-01

    Quasi-optical (QO) methods of dielectric spectroscopy are well established in the millimeter and submillimeter frequency bands. These methods exploit standing wave structure in the sample produced by a transmitted Gaussian beam to achieve accurate, low-noise measurement of the complex permittivity of the sample [e.g., J. A. Scales and M. Batzle, Appl. Phys. Lett. 88, 062906 (2006), 10.1063/1.2172403; R. N. Clarke and C. B. Rosenberg, J. Phys. E 15, 9 (1982), 10.1088/0022-3735/15/1/002; T. M. Hirovnen, P. Vainikainen, A. Lozowski, and A. V. Raisanen, IEEE Trans. Instrum. Meas. 45, 780 (1996), 10.1109/19.516996]. In effect the sample itself becomes a low-Q cavity. On the other hand, for optically thin samples (films of thickness much less than a wavelength) or extremely low loss samples (loss tangents below 10⁻⁵) the QO approach tends to break down due to loss of signal. In such a case it is useful to put the sample in a high-Q cavity and measure the perturbation of the cavity modes. Provided that the average mode frequency divided by the shift in mode frequency is less than the Q (quality factor) of the mode, the perturbation should be resolvable. Cavity perturbation techniques are not new, but there are technological difficulties in working in the millimeter/submillimeter wave region. In this paper we show applications of cavity perturbation to the dielectric characterization of semiconductor thin films of the type used in the manufacture of photovoltaics in the 100-350 GHz range. We measured the complex optical constants of a hot-wire chemical vapor deposition grown 1-μm thick amorphous silicon (a-Si:H) film on a borosilicate glass substrate. The real part of the refractive index and the dielectric constant of the glass substrate vary from frequency-independent to linearly frequency-dependent. We also see power-law behavior of the frequency-dependent optical conductivity from 316 GHz (9.48 cm⁻¹) down to 104 GHz (3.12 cm⁻¹).
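
    The textbook small-perturbation relations behind such measurements tie the mode shift and the Q degradation to the sample's complex permittivity ε = ε′ − iε″ through the fraction of electric-field energy stored in the sample volume V_s relative to the cavity volume V_c (shown for orientation; the paper treats the open hemispherical geometry in detail):

        \frac{\Delta f}{f_0} \;\approx\;
        -\,\frac{(\varepsilon' - 1)\int_{V_s} |E_0|^2\, dV}
                {2 \int_{V_c} |E_0|^2\, dV},
        \qquad
        \Delta\!\left(\frac{1}{Q}\right) \;\approx\;
        \frac{\varepsilon'' \int_{V_s} |E_0|^2\, dV}
             {\int_{V_c} |E_0|^2\, dV}.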

  1. Characterization and use of new monoclonal antibodies to CD11c, CD14, and CD163 to analyze the phenotypic complexity of ruminant monocyte subsets.

    PubMed

    Elnaggar, Mahmoud M; Abdellrazeq, Gaber S; Mack, Victoria; Fry, Lindsay M; Davis, William C; Park, Kun Taek

    2016-10-01

    The sequencing of the bovine genome and development of mass spectrometry, in conjunction with flow cytometry (FC), have afforded an opportunity to complete the characterization of the specificity of monoclonal antibodies (mAbs), only partially characterized during previous international workshops focused on antibody development for livestock (1991, Leukocyte Antigens in Cattle, Sheep, and Goats; 1993, Leukocyte Antigens of Cattle and Sheep; 1996, Third Workshop on Ruminant Leukocyte Antigens). The objective of this study was to complete the characterization of twelve mAbs incompletely characterized during the workshops that reacted with molecules predominantly expressed on bovine monocytes and use them to provide further information on the phenotypic complexity of monocyte subsets in ruminants. Analysis revealed that the mAbs could be grouped into three clusters that recognize three different molecules: CD11c, CD14, and CD163. Following characterization, comparison of the patterns of expression of CD14 and CD163 with expression of CD16, CD172a, and CD209 revealed the mononuclear cell population is comprised of multiple subsets with differential expression of these molecules. Further analysis revealed the epitopes recognized by mAbs to CD14 and CD163 are conserved on orthologues in sheep and goats. In contrast to CD14 that is also expressed on sheep and goat granulocytes, CD163 is a definitive marker for their monocytes. PMID:27496743

  2. How to deal with weak interactions in noncovalent complexes analyzed by electrospray mass spectrometry: cyclopeptidic inhibitors of the nuclear receptor coactivator 1-STAT6.

    PubMed

    Touboul, David; Maillard, Ludovic; Grässlin, Anja; Moumne, Roba; Seitz, Markus; Robinson, John; Zenobi, Renato

    2009-02-01

    Mass spectrometry, and especially electrospray ionization, is now an efficient tool to study noncovalent interactions between proteins and inhibitors. It is used here to study the interaction of some weak inhibitors with the NCoA-1/STAT6 protein, with K_D values in the μM range. High signal intensities corresponding to some nonspecific electrostatic interactions between NCoA-1 and the oppositely charged inhibitors were observed by nanoelectrospray mass spectrometry, due to the use of high ligand concentrations. Diverse strategies have already been developed to deal with nonspecific interactions, such as controlled dissociation in the gas phase, mathematical modeling, or the use of a reference protein to monitor the appearance of nonspecific complexes. We demonstrate here that this last methodology, validated only in the case of neutral sugar-protein interactions, i.e., where dipole-dipole interactions are crucial, is not relevant in the case of strong electrostatic interactions. Thus, we developed a novel strategy based on half-maximal inhibitory concentration (IC50) measurements in a competitive assay with readout by nanoelectrospray mass spectrometry. IC50 values determined by MS were finally converted into dissociation constants that showed very good agreement with values determined in the liquid phase using a fluorescence polarization assay. PMID:18996720
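
    One common way to convert a competition IC50 into a dissociation constant is the Cheng-Prusoff-type correction, using the probe ligand's concentration [L] and its own dissociation constant; it is given here as a generic example, since the authors' exact treatment may differ:

        K_D \;\approx\; \frac{\mathrm{IC}_{50}}{1 + [L]/K_d^{\mathrm{probe}}} .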

  3. Promoting Experimental Problem-Solving Ability in Sixth-Grade Students through Problem-Oriented Teaching of Ecology: Findings of an Intervention Study in a Complex Domain

    ERIC Educational Resources Information Center

    Roesch, Frank; Nerb, Josef; Riess, Werner

    2015-01-01

    Our study investigated whether problem-oriented designed ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of "experimental problem-solving ability" better than conventional lessons in science. We used a paper-and-pencil test to assess…

  4. Advancing our knowledge of the complexity and management of intimate partner violence and co-occurring mental health and substance abuse problems in women

    PubMed Central

    Du Mont, Janice

    2015-01-01

    Globally, intimate partner violence (IPV) is a pervasive and insidious human rights problem with significant adverse physical health outcomes for women. Intimate partner violence has also been closely associated with poor mental health and substance use problems. However, little is known about the relationship among these co-occurring problems and how to best intervene or manage them. Here, we present findings from recent systematic reviews and meta-analyses (where available) to highlight developments in understanding and managing the complex co-occurring problems of intimate partner violence and mental health and substance use in women. PMID:26097738

  5. Structural factoring approach for analyzing stochastic networks

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
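
    The abstract compares its factoring algorithm against complete enumeration. For scale, here is a minimal sketch of that brute-force baseline on a hypothetical four-arc network (arc lengths and probabilities invented for illustration); the conditional-factoring approach of the paper is designed to avoid exactly this exponential enumeration:

        # Exact shortest-path-length distribution by complete enumeration:
        # every joint realization of the random arc lengths is visited and
        # its probability accumulated on the resulting shortest path length.
        from itertools import product
        from collections import defaultdict

        # (tail, head) -> list of (length, probability) outcomes
        arcs = {
            ("s", "a"): [(1, 0.5), (3, 0.5)],
            ("s", "b"): [(2, 1.0)],
            ("a", "t"): [(2, 0.7), (5, 0.3)],
            ("b", "t"): [(2, 0.4), (4, 0.6)],
        }

        def shortest_path(lengths):
            # Bellman-Ford-style relaxation; adequate for this small network.
            dist = defaultdict(lambda: float("inf"), {"s": 0.0})
            for _ in range(len(lengths)):
                for (u, v), w in lengths.items():
                    dist[v] = min(dist[v], dist[u] + w)
            return dist["t"]

        distribution = defaultdict(float)
        keys = list(arcs)
        for combo in product(*(arcs[k] for k in keys)):
            prob = 1.0
            lengths = {}
            for k, (w, p) in zip(keys, combo):
                lengths[k] = w
                prob *= p
            distribution[shortest_path(lengths)] += prob

        print(sorted(distribution.items()))  # exact P(shortest length = L)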

  6. Fractional channel multichannel analyzer

    DOEpatents

    Brackenbush, L.W.; Anderson, G.A.

    1994-08-23

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse height-to-pulse width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board to achieve the effect of radically expanding the number of channels without changing the circuit board. 9 figs.

  7. Fractional channel multichannel analyzer

    DOEpatents

    Brackenbush, Larry W.; Anderson, Gordon A.

    1994-01-01

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse height-to-pulse width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board to achieve the effect of radically expanding the number of channels without changing the circuit board.

  8. Applications of dialectical behavior therapy to the treatment of complex trauma-related problems: when one case formulation does not fit all.

    PubMed

    Wagner, Amy W; Rizvi, Shireen L; Harned, Melanie S

    2007-08-01

    In this article, the authors take the perspective that effective treatment of complex trauma-related problems requires, in the absence of empirically supported treatments, a reliance on theory, idiographic assessment, and empirically supported principles of change. Dialectical behavior therapy (DBT; M. M. Linehan, 1993) is used to demonstrate the applicability of this approach to the treatment of multiproblem, heterogeneous populations in general. Two case studies are presented that highlight the utility of DBT principles to complex trauma-related problems specifically. PMID:17721961

  9. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1982-01-01

    FORTRAN Static Source Code Analyzer program (SAP) automatically gathers and reports statistics on occurrences of statements and structures within FORTRAN program. Provisions are made for weighting each statistic, providing user with overall figure of complexity. Statistics, as well as figures of complexity, are gathered on module-by-module basis. Overall summed statistics are accumulated for complete input source file.
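
    To make the "weighted statistics" idea concrete, here is a minimal sketch in the spirit of SAP; the keyword list and the weights are hypothetical, not those used by the actual program:

        # Count occurrences of a few statement types in a Fortran source
        # file and fold the counts into a single weighted complexity figure.
        import re

        WEIGHTS = {"IF": 2.0, "DO": 3.0, "GOTO": 4.0, "CALL": 1.5}

        def complexity(path: str) -> float:
            counts = {kw: 0 for kw in WEIGHTS}
            with open(path) as src:
                for line in src:
                    stmt = line.strip().upper()
                    for kw in WEIGHTS:
                        if re.match(rf"{kw}\b", stmt):
                            counts[kw] += 1
            return sum(WEIGHTS[kw] * n for kw, n in counts.items())

        # print(complexity("myprog.f"))  # hypothetical source file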

  10. Promoting Experimental Problem-solving Ability in Sixth-grade Students Through Problem-oriented Teaching of Ecology: Findings of an intervention study in a complex domain

    NASA Astrophysics Data System (ADS)

    Roesch, Frank; Nerb, Josef; Riess, Werner

    2015-03-01

    Our study investigated whether problem-oriented designed ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of experimental problem-solving ability better than conventional lessons in science. We used a paper-and-pencil test to assess students' abilities in a quasi-experimental intervention study utilizing a pretest/posttest control-group design (N = 340; average performing sixth-grade students). The treatment group received lessons on forest ecosystems consistent with the principle of education for sustainable development. This learning environment was expected to help students enhance their ecological knowledge and their theoretical and methodological experimental competencies. Two control groups received either the teachers' usual lessons on forest ecosystems or non-specific lessons on other science topics. We found that the treatment promoted specific components of experimental problem-solving ability (generating epistemic questions, planning two-factorial experiments, and identifying correct experimental controls). However, the observed effects were small, and awareness for aspects of higher ecological experimental validity was not promoted by the treatment.

  11. The Computer-Based Assessment of Complex Problem Solving and How It Is Influenced by Students' Information and Communication Technology Literacy

    ERIC Educational Resources Information Center

    Greiff, Samuel; Kretzschmar, André; Müller, Jonas C.; Spinath, Birgit; Martin, Romain

    2014-01-01

    The 21st-century work environment places strong emphasis on nonroutine transversal skills. In an educational context, complex problem solving (CPS) is generally considered an important transversal skill that includes knowledge acquisition and its application in new and interactive situations. The dynamic and interactive nature of CPS requires a…

  12. The Benefit of Being Naïve and Knowing It: The Unfavourable Impact of Perceived Context Familiarity on Learning in Complex Problem Solving Tasks

    ERIC Educational Resources Information Center

    Beckmann, Jens F.; Goode, Natassia

    2014-01-01

    Previous research has found that embedding a problem into a familiar context does not necessarily confer an advantage over a novel context in the acquisition of new knowledge about a complex, dynamic system. In fact, it has been shown that a semantically familiar context can be detrimental to knowledge acquisition. This has been described as the…

  13. The ESPAT tool: a general-purpose DSS shell for solving stochastic optimization problems in complex river-aquifer systems

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury

    2015-04-01

    Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles to their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, as those available are usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded using GAMS, and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results; therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPATR module, which performs stochastic optimization procedures in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization procedures in small-size surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments. Its size makes it possible to use either the SDP or
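
    As background for the SDP-based modules mentioned above, here is a minimal sketch of the standard stochastic dynamic programming recursion for a single reservoir. The storage grid, inflow scenarios, benefit function, and all numbers are hypothetical; ESPAT itself is built on GAMS and full hydro-economic inputs, not this toy:

        # Backward SDP over discretized storage with random stage inflows:
        # V[s] = expected maximal benefit-to-go from storage level s.
        import numpy as np

        S = np.arange(0, 101, 10)           # storage levels (hm3)
        Q = np.array([10, 30, 50])          # inflow scenarios (hm3/stage)
        P = np.array([0.3, 0.5, 0.2])       # inflow probabilities
        T = 12                              # stages (e.g. months)
        benefit = lambda release: 5.0 * np.sqrt(release)  # concave user benefit

        V = np.zeros(len(S))                # terminal value function
        for t in reversed(range(T)):
            V_new = np.empty_like(V)
            for i, s in enumerate(S):
                exp_val = 0.0
                for q, p in zip(Q, P):
                    best = -np.inf
                    for j, s_next in enumerate(S):   # decision: next storage
                        release = s + q - s_next     # excess over capacity spills
                        if release < 0:              # infeasible transition
                            continue
                        best = max(best, benefit(release) + V[j])
                    exp_val += p * best
                V_new[i] = exp_val
            V = V_new
        print(V)  # expected 12-stage benefit as a function of initial storage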

  14. Training Preschool Children to Use Visual Imagining as a Problem-Solving Strategy for Complex Categorization Tasks

    ERIC Educational Resources Information Center

    Kisamore, April N.; Carr, James E.; LeBlanc, Linda A.

    2011-01-01

    It has been suggested that verbally sophisticated individuals engage in a series of precurrent behaviors (e.g., covert intraverbal behavior, grouping stimuli, visual imagining) to solve problems such as answering questions (Palmer, 1991; Skinner, 1953). We examined the effects of one problem solving strategy--visual imagining--on increasing…

  15. Uniqueness of self-similar solutions to the Riemann problem for the Hopf equation with complex nonlinearity

    NASA Astrophysics Data System (ADS)

    Kulikovskii, A. G.; Chugainova, A. P.; Shargatov, V. A.

    2016-07-01

    Solutions of the Riemann problem for a generalized Hopf equation are studied. The solutions are constructed using a sequence of non-overturning Riemann waves and shock waves with stable stationary and nonstationary structures.

  16. Making Visible the Complexities of Problem Solving: An Ethnographic Study of a General Chemistry Course in a Studio Learning Environment

    NASA Astrophysics Data System (ADS)

    Kalainoff, Melinda Zapata

    Studio classrooms, designed such that laboratory and lecture functions can occur in the same physical space, have been recognized as a promising contributing factor in promoting collaborative learning in the sciences (NRC, 2011). Moreover, in designing for instruction, a critical goal, especially in the sciences and engineering, is to foster an environment where students have opportunities for learning problem solving practices (NRC, 2012a). However, few studies show how this type of innovative learning environment shapes opportunities for learning in the sciences, which is critical to informing future curricular and instructional designs for these environments. Even fewer studies show how studio environments shape opportunities to develop problem solving practices specifically. In order to make visible how the learning environment promotes problem solving practices, this study explores problem solving phenomena in the daily life of an undergraduate General Chemistry studio class using an ethnographic perspective. By exploring problem solving as a sociocultural process, this study shows how the instructor and students co-construct opportunities for learning in whole class and small group interactional spaces afforded in this studio environment and how the differential demands on students in doing problems requires re-conceptualizing what it means to "apply a concept".

  17. Can potentially useful dynamics to solve complex problems emerge from constrained chaos and/or chaotic itinerancy?

    NASA Astrophysics Data System (ADS)

    Nara, Shigetoshi

    2003-09-01

    Complex dynamics, including chaos, in systems with large but finite degrees of freedom are considered from the viewpoint that they play important roles in the complex functioning and control of biological systems, including the brain, as well as in complex structure formation in nature. As an example, computer experiments on the complex dynamics occurring in a recurrent neural network model are shown. Instabilities, itinerancies, and localization in state space are investigated by means of numerical analysis, for instance by calculating correlation functions between neurons, basin visiting measures of chaotic dynamics, etc. As an example of functional experiments using such complex dynamics, we show the results of executing a memory search task set in a typical ill-posed context. We call such useful dynamics "constrained chaos," which might be called "chaotic itinerancy" as well. These results indicate that constrained chaos could be useful in the complex functioning and control of systems with large but finite degrees of freedom, as typically observed in biological systems, and that it may work in a delicate balance between converging and diverging dynamics in high-dimensional state space, depending on the situation, environment, and context to be controlled or processed.

  18. Bourbaki's structure theory in the problem of complex systems simulation models synthesis and model-oriented programming

    NASA Astrophysics Data System (ADS)

    Brodsky, Yu. I.

    2015-01-01

    The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. An application of Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems: model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the concept of the model-component. A model-component is endowed with a more complicated structure than, for example, an object in object-oriented analysis. This structure provides the model-component with independent behavior: the ability to respond in a standard way to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of model-components into complexes. This fact allows one, firstly, to construct fractal models of any complexity and, secondly, to implement the computational process for such constructions uniformly, by a single universal program. In addition, the proposed paradigm allows one to exclude imperative programming and to generate computer code with a high degree of parallelism.
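
    A minimal sketch of the model-component idea as described above (the interface names are hypothetical; the point is only that a complex of components is itself a component, so one universal routine can drive models of any nesting depth):

        # Every component answers the same standard requests; a complex
        # built from components is itself a component (fractal models).
        class ModelComponent:
            def respond(self, request, env):
                """Standard response to a standard request from the environment."""
                raise NotImplementedError

        class Complex(ModelComponent):
            def __init__(self, parts):
                self.parts = parts
            def respond(self, request, env):
                # One universal program: delegate uniformly, independent of
                # whether each part is an atomic model or a nested complex.
                # Because the protocol is uniform, the parts could equally
                # well be stepped in parallel.
                return [p.respond(request, env) for p in self.parts]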

  19. Source-Code-Analyzing Program

    NASA Technical Reports Server (NTRS)

    Manteufel, Thomas; Jun, Linda

    1991-01-01

    FORTRAN Static Source Code Analyzer program, SAP, developed to gather statistics automatically on occurrences of statements and structures within FORTRAN program and provide for reporting of those statistics. Provisions made to weight each statistic and provide overall figure of complexity. Statistics, as well as figures of complexity, gathered on module-by-module basis. Overall summed statistics also accumulated for complete input source file. Written in FORTRAN IV.

  20. Money, case complexity, and wait lists: perspectives on problems and solutions at children's mental health centers in Ontario.

    PubMed

    Reid, Graham J; Brown, Judith Belle

    2008-07-01

    Senior managers of children's mental health centers across Ontario, Canada were interviewed regarding the challenges and solutions of access and delivery of care. The central challenges--funding, case complexity, waitlists, staffing, and system integration--revealed a complex interplay between the policies and financing of children's mental health services and the provision of clinical services at the agency level and within the community. The desire for integration and collaboration was countered by competition for funding and service demands. A need for policies that allow for local solutions while providing leadership for sustained improvements in the ease and timeliness of access to care and effective clinical services emerged. PMID:18512157

  1. Electron-atom resonances: The complex-scaled multiconfigurational spin-tensor electron propagator method for the ²P Be⁻ shape resonance problem

    NASA Astrophysics Data System (ADS)

    Tsednee, Tsogbayar; Liang, Liyuan; Yeager, Danny L.

    2015-02-01

    We propose and develop the complex-scaled multiconfigurational spin-tensor electron propagator (CMCSTEP) technique for theoretical determination of resonance parameters with electron-atom and electron-molecule systems including open-shell and highly correlated atoms and molecules. The multiconfigurational spin-tensor electron propagator (MCSTEP) method developed and implemented by Yeager and co-workers in real space gives very accurate and reliable ionization potentials and attachment energies. The CMCSTEP method uses a complex-scaled multiconfigurational self-consistent field (CMCSCF) state as an initial state along with a dilated Hamiltonian where all of the electronic coordinates are scaled by a complex factor. CMCSCF was developed and applied successfully to resonance problems earlier. We apply the CMCSTEP method to obtain ²P Be⁻ shape resonance parameters using 14s11p5d, 14s14p2d, and 14s14p5d basis sets with a 2s2p3d complete active space. The obtained values of the resonance parameters are compared to previous results. Here CMCSTEP has been developed and used for a resonance problem. It appears to be among the most accurate and reliable techniques. Vertical ionization potentials and attachment energies in real space are typically within ±0.2 eV or better of excellent experimental results and full configuration-interaction calculations with a good basis set. We expect the same sort of agreement in complex space.
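
    For readers unfamiliar with the dilation mentioned above, the textbook complex-scaling relations (standard theory, not specific to CMCSTEP) can be written as:

        % All electronic coordinates are rotated into the complex plane,
        % \vec{r} \to \vec{r}\, e^{i\theta}, so the kinetic and potential
        % terms of the Hamiltonian scale as
        \[
          H(\theta) = T\, e^{-2i\theta} + V\!\left(\vec{r}\, e^{i\theta}\right),
        \]
        % and a resonance appears as a complex eigenvalue
        \[
          E_{\mathrm{res}} = E_r - \tfrac{i}{2}\,\Gamma,
        \]
        % whose position E_r and width \Gamma are (ideally) stationary with
        % respect to the scaling angle \theta once the resonance has been
        % uncovered by the rotated continuum.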

  2. Tangled Narratives and Wicked Problems: A Complex Case of Positioning and Politics in a Diverse School Community

    ERIC Educational Resources Information Center

    Nguyen, Thu Suong Thi; Scribner, Samantha M. Paredes; Crow, Gary M.

    2012-01-01

    The case of Allen Elementary School presents tangled narratives and wicked problems describing the multidimensionality of school community work. Using multiple converging and diverging vignettes, the case points to the distinctiveness of individual experience in schools; the ways institutionalized organizational narratives become cultural…

  3. Advising a Bus Company on Number of Needed Buses: How High-School Physics Students Deal With a "Complex Problem"?

    ERIC Educational Resources Information Center

    Balukovic, Jasmina; Slisko, Josip; Hadzibegovic, Zalkida

    2011-01-01

    Since 2003, the international project PISA has evaluated 15-year-old students in solving problems that include "decision taking", "analysis and design of systems" and "trouble-shooting". This article presents the results of a pilot study conducted with 215 students from first to fourth grade of a high school in Sarajevo…

  4. Simple Solutions to Complex Problems: Moral Panic and the Fluid Shift from "Equity" to "Quality" in Education

    ERIC Educational Resources Information Center

    Mockler, Nicole

    2014-01-01

    Education is increasingly conceptualised by governments and policymakers in western democracies in terms of productivity and human capital, emphasising elements of individualism and competition over concerns around democracy and equity. More and more, solutions to intransigent educational problems related to equity are seen in terms of quality and…

  5. The Management of Cognitive Load During Complex Cognitive Skill Acquisition by Means of Computer-Simulated Problem Solving

    ERIC Educational Resources Information Center

    Kester, Liesbeth; Kirschner, Paul A.; van Merrienboer, Jeroen J.G.

    2005-01-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition…

  6. The Use of the Solihull Approach with Children with Complex Neurodevelopmental Difficulties and Sleep Problems: A Case Study

    ERIC Educational Resources Information Center

    Williams, Laura; Newell, Reetta

    2013-01-01

    The following article introduces the Solihull Approach, a structured framework for intervention work with families (Douglas, "Solihull resource pack; the first five years." Cambridge: Jill Rogers Associates, 2001) and aims to demonstrate the usefulness of this approach in working with school-age children with complex neurodevelopmental…

  7. The problem of the age and structural position of the Blyb metamorphic complex (Fore Range zone, Great Caucasus) granitoids.

    NASA Astrophysics Data System (ADS)

    Kamzolkin, Vladimir; Latyshev, Anton; Ivanov, Stanislav

    2016-04-01

    The Blyb metamorphic complex (BMC) of the Fore Range zone is one of the most high-grade metamorphosed elements of the Great Caucasus fold belt. Determination of the timing and mechanism of formation of the Fore Range fold-thrust structures is not possible without investigation of the BMC located at the basement of its section. At the same time, conceptions about its structure and age are outdated and need revision. Somin (2011) determined the age of the protolith and metamorphism of the Blyb complex as Late Devonian - Early Carboniferous. We have recently shown that the BMC has not a dome structure, as previously thought, but a nappe structure (Vidjapin, Kamzolkin, 2015), and is metamorphically coherent, with peak metamorphism pressures up to 22 kbar (Kamzolkin et al., 2015; Konilov et al., 2013). Considering the age and structure of the Blyb complex, it is necessary to revise the age of the granitoid intrusions and their relations with the gneisses and schists that constitute the main part of the section of the complex. Most authors (Gamkrelidze, Shengelia, 2007; Lavrischev, 2002; Baranov, 1967) adhere to an Early Paleozoic age of the intrusives, which is doubtful considering the younger age of the metamorphic rocks. We suppose that the intrusive bodies broke through the BMC nappe structure during the exhumation of the complex (Perchuk, 1991) at the Devonian - Carboniferous boundary. The massive monzodiorite body (Lavrischev, 2002), intruding garnet-muscovite schists and amphibolite gneisses of the Blyb complex and cut by the Main Caucasian fault (MCF), is seemingly younger. Given the timing of termination of MCF movement activity as the Middle Jurassic (Greater Caucasus..., 2005), its age should be in the Early Carboniferous - Middle Jurassic interval. At the same time, on the modern geological map (Lavrischev, 2002) the monzodiorite body is assigned to the Middle Paleozoic. The study of the BMC granitoids and monzodiorites will help in determining the mechanism and

  8. A Case-Based, Problem-Based Learning Approach to Prepare Master of Public Health Candidates for the Complexities of Global Health

    PubMed Central

    Winskell, Kate; McFarland, Deborah A.; del Rio, Carlos

    2015-01-01

    Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013–2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health–Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned. PMID:25706029

  9. A case-based, problem-based learning approach to prepare master of public health candidates for the complexities of global health.

    PubMed

    Leon, Juan S; Winskell, Kate; McFarland, Deborah A; del Rio, Carlos

    2015-03-01

    Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013-2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health-Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned. PMID:25706029

  10. Downhole Fluid Analyzer Development

    SciTech Connect

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  11. The ecological validity of the Rey-Osterrieth Complex Figure: predicting everyday problems in children with neuropsychological disorders.

    PubMed

    Davies, Simon R; Field, Alan R J; Andersen, Thorvald; Pestell, Carmela

    2011-08-01

    Despite its extensive use, the validity of the Rey Complex Figure in predicting everyday behavioral executive dysfunction displayed by children with various neuropsychological disorders is unknown. The copied figures of 263 children aged 6 to 17 years were rescored using an accuracy approach that measured visuospatial ability and three process approaches developed to measure executive functioning. Controlling for age and IQ, partial correlations between scores derived by all scoring methods and the parent ratings on an executive function inventory were zero. In contrast, all four scoring approaches were associated with parent ratings on questionnaires that indexed children's academic achievement, developmental status, and adaptive ability. The findings suggest that the ecological validity of the Rey Complex Figure for children with various central nervous system disorders is more associated with visual-motor integration skills than executive functioning. PMID:21957867

  12. Analyzing Software Piracy in Education.

    ERIC Educational Resources Information Center

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  13. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within FORTRAN program and provides reports of those statistics. Provisions made for weighting each statistic and providing an overall figure of complexity.

  14. TRAINING PRESCHOOL CHILDREN TO USE VISUAL IMAGINING AS A PROBLEM-SOLVING STRATEGY FOR COMPLEX CATEGORIZATION TASKS

    PubMed Central

    Kisamore, April N; Carr, James E; LeBlanc, Linda A

    2011-01-01

    It has been suggested that verbally sophisticated individuals engage in a series of precurrent behaviors (e.g., covert intraverbal behavior, grouping stimuli, visual imagining) to solve problems such as answering questions (Palmer, 1991; Skinner, 1953). We examined the effects of one problem solving strategy—visual imagining—on increasing responses to intraverbal categorization questions. Participants were 4 typically developing preschoolers between the ages of 4 and 5 years. Visual imagining training was insufficient to produce a substantial increase in target responses. It was not until the children were prompted to use the visual imagining strategy that a large and immediate increase in the number of target responses was observed. The number of prompts did not decrease until the children were given a rule describing the use of the visual imagining strategy. Within-session response patterns indicated that none of the children used visual imagining prior to being prompted to do so and that use of the strategy continued after introduction of the rule. These results were consistent for 3 of 4 children. Within-session response patterns suggested that the 4th child occasionally imagined when prompted to do so, but the gains were not maintained. The results are discussed in terms of Skinner's analysis of problem solving and the development of visual imagining. PMID:21709783

  15. Experiences with explicit finite-difference schemes for complex fluid dynamics problems on STAR-100 and CYBER-203 computers

    NASA Technical Reports Server (NTRS)

    Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.

    1982-01-01

    Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were then solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and data base structure for three-dimensional computer codes which will eliminate or improve on page faulting are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data in each step. As a result, in-core grid points were increased in number by 50% to 150,000, with a 10% execution time increase. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage limited than compute rate limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flow.
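
    As a reminder of what "explicit finite-difference algorithm" means in practice, here is a minimal sketch for the 1-D heat equation rather than the full Navier-Stokes system; the grid size, time step, and diffusivity are invented for illustration:

        # Explicit (FTCS) update: every point is advanced from the previous
        # time level only, which is what makes the scheme explicit and easy
        # to vectorize on machines like the STAR-100/CYBER-203.
        import numpy as np

        nx, dx, dt, nu = 101, 0.01, 2.0e-5, 1.0
        assert nu * dt / dx**2 <= 0.5      # explicit stability limit

        u = np.zeros(nx)
        u[nx // 2] = 1.0                   # initial hot spot
        for _ in range(1000):
            u[1:-1] = u[1:-1] + nu * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        print(u.max())                     # peak decays as heat diffuses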

  16. Megacities in the coastal zone: Using a driver-pressure-state-impact-response framework to address complex environmental problems

    NASA Astrophysics Data System (ADS)

    Sekovski, Ivan; Newton, Alice; Dennison, William C.

    2012-01-01

    The purpose of this study was to elaborate on the role of coastal megacities in environmental degradation and their contribution to global climate change. Although less than 4 percent of the world's population resides in coastal megacities, their impact on the environment is significant due to their rapid development, high population densities and the high consumption rate of their residents. This study was carried out by implementing a Drivers-Pressures-States-Impacts-Responses (DPSIR) framework. This analytical framework was chosen because of its potential to link the existing data, gathered from various previous studies, in causal relationships. In this text, coastal megacities have been defined as cities exceeding 10 million inhabitants, situated in the "near-coastal zone". Their high rates of consumption of food, water, space and energy were observed and linked to the high performance rates of related economic activities (industry, transportation, power generation, agriculture and water extraction). In many of the studied coastal megacities, deteriorated quality of air and water was perceived, which can, in combination with global warming, lead to health problems and economic and social disturbance among residents. The extent of problems varied between developing and developed countries, showing higher rates of population growth and certain harmful emissions in megacities of developing countries, as well as more problems regarding food and water shortages, sanitation, and health care support. Although certain projections predict a slowdown of growth in most coastal megacities, their future impact on the environment is still unclear due to the uncertainties regarding future climate change and trajectories of consumption patterns.

  17. Experiences with explicit finite-difference schemes for complex fluid dynamics problems on STAR-100 and CYBER-203 computers

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.

    1982-08-01

    Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were then solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and data base structure for three-dimensional computer codes which will eliminate or improve on page faulting are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data in each step. As a result, in-core grid points were increased in number by 50% to 150,000, with a 10% execution time increase. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage limited than compute rate limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flow.

  18. Self-adaptive difference method for the effective solution of computationally complex problems of boundary layer theory

    NASA Technical Reports Server (NTRS)

    Schoenauer, W.; Daeubler, H. G.; Glotz, G.; Gruening, J.

    1986-01-01

    An implicit difference procedure for the solution of equations for a chemically reacting hypersonic boundary layer is described. Difference forms of arbitrary error order in the x and y coordinate plane were used to derive estimates for discretization error. Computational complexity and time were minimized by the use of this difference method and the iteration of the nonlinear boundary layer equations was regulated by discretization error. Velocity and temperature profiles are presented for Mach 20.14 and Mach 18.5; variables are velocity profiles, temperature profiles, mass flow factor, Stanton number, and friction drag coefficient; three figures include numeric data.

  19. You Can't Get There From Here! Problems and Potential Solutions in Developing New Classes of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    The explosion of capabilities and new products within the sphere of Information Technology (IT) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions of quality and correctness of software. These assumptions are encouraged by software producers and vendors, who at this late date have not succeeded in finding a way to overcome the lack of an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts that involve large swarms of small spacecraft that will engage cooperatively to achieve science goals. Such missions entail levels of complexity that beg for new methods for system development far beyond today's methods, which are inadequate for ensuring correct behavior of large numbers of interacting intelligent mission elements. New system development techniques recently devised through NASA-led research will offer some innovative approaches to achieving correctness in complex system development, including autonomous swarm missions that exhibit emergent behavior, as well as general software products created by the computing industry.

  20. A quasi-optimal coarse problem and an augmented Krylov solver for the variational theory of complex rays

    NASA Astrophysics Data System (ADS)

    Kovalevsky, Louis; Gosselet, Pierre

    2016-09-01

    The Variational Theory of Complex Rays (VTCR) is an indirect Trefftz method designed to study systems governed by Helmholtz-like equations. It uses wave functions to represent the solution inside elements, which reduces the dispersion error compared to classical polynomial approaches but the resulting system is prone to be ill conditioned. This paper gives a simple and original presentation of the VTCR using the discontinuous Galerkin framework and it traces back the ill-conditioning to the accumulation of eigenvalues near zero for the formulation written in terms of wave amplitude. The core of this paper presents an efficient solving strategy that overcomes this issue. The key element is the construction of a search subspace where the condition number is controlled at the cost of a limited decrease of attainable precision. An augmented LSQR solver is then proposed to solve efficiently and accurately the complete system. The approach is successfully applied to different examples.
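
    The key idea above, a search subspace whose condition number is controlled at the cost of some attainable precision, can be illustrated with plain truncated SVD. This sketch is not the paper's augmented LSQR solver, and the cutoff and test matrix are invented:

        # Discard singular directions below s_max/kappa_max, then solve the
        # least-squares problem restricted to the retained subspace: the
        # effective condition number is bounded by kappa_max, while the
        # dropped directions limit the attainable precision.
        import numpy as np

        def solve_in_conditioned_subspace(A, b, kappa_max=1e8):
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            keep = s >= s[0] / kappa_max
            return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

        # tiny ill-conditioned example: a Vandermonde matrix
        A = np.vander(np.linspace(0.0, 1.0, 40), 12)
        x = solve_in_conditioned_subspace(A, A @ np.ones(12))
        print(np.linalg.norm(A @ x - A @ np.ones(12)))  # small residual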

  1. Attention-deficit hyperactivity disorder (ADHD), substance use disorders, and criminality: a difficult problem with complex solutions.

    PubMed

    Knecht, Carlos; de Alvaro, Raquel; Martinez-Raga, Jose; Balanza-Martinez, Vicent

    2015-05-01

    The association between attention-deficit hyperactivity disorder (ADHD) and criminality has been increasingly recognized as an important societal concern. Studies conducted in different settings have revealed high rates of ADHD among adolescent offenders. The risk for criminal behavior among individuals with ADHD is increased when there is psychiatric comorbidity, particularly conduct disorder and substance use disorder. The present report aims to systematically review the literature on the epidemiological, neurobiological, and other risk factors contributing to this association, as well as the key aspects of the assessment, diagnosis, and treatment of ADHD among offenders. A systematic literature search of electronic databases (PubMed, EMBASE, and PsycINFO) was conducted to identify potentially relevant studies published in English, in peer-reviewed journals. Studies conducted in various settings within the judicial system and in many different countries suggest that the rate of adolescent and adult inmates with ADHD far exceeds that reported in the general population; however, underdiagnosis is common. Similarly, follow-up studies of children with ADHD have revealed high rates of criminal behaviors, arrests, convictions, and imprisonment in adolescence and adulthood. Assessment of ADHD and comorbid conditions requires an ongoing and careful process. When treating offenders or inmates with ADHD, who commonly present with other comorbid psychiatric disorders, complex, comprehensive, and tailored interventions combining pharmacological and psychosocial strategies are likely to be needed. PMID:25411986

  2. Imbalance problem in community detection

    NASA Astrophysics Data System (ADS)

    Sun, Peng Gang

    2016-09-01

    Community detection gives us a simple way to understand complex networks' structures. However, there is an imbalance problem in community detection. This paper first introduces the imbalance problem and then proposes a new measure to alleviate the imbalance problem. In addition, we study two variants of the measure and further analyze the resolution scale of community detection. Finally, we compare our approach with some state of the art methods on random networks as well as real-world networks for community detection. Both the theoretical analysis and the experimental results show that our approach achieves better performance for community detection. We also find that our approach tends to separate densely connected subgroups preferentially.

  3. Collab-Analyzer: An Environment for Conducting Web-Based Collaborative Learning Activities and Analyzing Students' Information-Searching Behaviors

    ERIC Educational Resources Information Center

    Wu, Chih-Hsiang; Hwang, Gwo-Jen; Kuo, Fan-Ray

    2014-01-01

    Researchers have found that students might get lost or feel frustrated while searching for information on the Internet to deal with complex problems without real-time guidance or supports. To address this issue, a web-based collaborative learning system, Collab-Analyzer, is proposed in this paper. It is not only equipped with a collaborative…

  4. Explorations of the Concept of Local Capacity for Problem Solving: An Introduction to a Series of Papers Analyzing Nine School Improvement Projects. Draft. Documentation and Technical Assistance in Urban Schools.

    ERIC Educational Resources Information Center

    Wilson, Stephen H.

    A model for enhancing the local capacity of urban schools for solving problems by restructuring school settings is the subject of this paper. In identifying the strength and weaknesses of such a concept, the paper reviews data from nine sites studied by the Documentation and Technical Assistance (DTA) project for their applicability to other…

  5. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence

    PubMed Central

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H.

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students’ CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence. PMID:26283992

  6. Using a MODFLOW grid, generated with GMS, to solve a transport problem with TOUGH2 in complex geological environments: The intertidal deposits of the Venetian Lagoon

    NASA Astrophysics Data System (ADS)

    Borgia, A.; Cattaneo, L.; Marconi, D.; Delcroix, C.; Rossi, E. L.; Clemente, G.; Amoroso, C. G.; Lo Re, F.; Tozzato, E.

    2011-06-01

    The tides of the Venetian Lagoon generally vary between -0.5 and +0.7 m asl. Occasionally, they may reach a maximum of 1.5 m (acqua alta) and a minimum of -0.8 m asl (acqua bassa). Intertidal areas, called "barene," exist all along the coast of the Lagoon. These areas are characterized by canals that concentrate the flow of water (and the deposition of sands) during the rising and waning of the tides, and that inundate and drain the vegetated areas found between canals (where organic-rich clays are deposited). Therefore, since the area is subject to subsidence, in time, sand dykes (the original canals) become juxtaposed to clayey dykes (the original vegetated areas). In addition, the sands form a continuous hydrogeologic network within the clays, very similar to that of a vascular system that effectively drains the whole "barena" deposits. In order to be effective, measures for monitoring, confining, or remediating the transport of pollutants through this kind of environment must explicitly take into account the geologic complexity. The same complexity must be included in the numerical models that support remediation efforts. At the moment, there appears to be no off-the-shelf graphical interface that is able to manage such complexity for TOUGH2. To attempt to solve this problem we have used a calibrated USGS-MODFLOW model, of the barena of "Passo a Campalto" in the Venetian Lagoon, developed with the GMS graphical interface. The model is made of 42 layers, which, apart from the first layer, are 0.5 m thick; the first layer has the thickness distribution of a dump found on top of the barena deposit at Passo a Campalto. Each layer consists of 100×60 square cells, for a total of 252,000 cells, only about half of which are active. Using a FORTRAN routine, we translate this grid, with all the hydrogeologic boundary conditions, into a TOUGH2 input file, and we provide the additional necessary information for running a TOUGH2 simulation. The results are promising, in
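
    A minimal sketch of the grid-translation step described above: a routine walking a structured grid and emitting TOUGH2-style ELEME and CONNE records. The field layout here is deliberately simplified for illustration (real MESH records use fixed-width columns and more fields), and the original routine was FORTRAN rather than Python:

        # Emit one ELEME record per cell (name, volume) and one CONNE record
        # per cell-to-cell face (names, nodal distances, interface area).
        def write_mesh(nx, ny, dx, dy, dz, path="MESH"):
            name = lambda i, j: f"A{i:02d}{j:02d}"      # 5-char element names
            with open(path, "w") as f:
                f.write("ELEME\n")
                for i in range(nx):
                    for j in range(ny):
                        f.write(f"{name(i, j)}  {dx * dy * dz:10.4e}\n")
                f.write("\nCONNE\n")
                for i in range(nx):
                    for j in range(ny):
                        if i + 1 < nx:   # x-direction neighbour
                            f.write(f"{name(i, j)}{name(i + 1, j)} "
                                    f"{dx / 2:8.2e} {dx / 2:8.2e} {dy * dz:8.2e}\n")
                        if j + 1 < ny:   # y-direction neighbour
                            f.write(f"{name(i, j)}{name(i, j + 1)} "
                                    f"{dy / 2:8.2e} {dy / 2:8.2e} {dx * dz:8.2e}\n")

        # write_mesh(100, 60, 10.0, 10.0, 0.5)  # one layer of a 100x60 grid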

  7. Portable automatic blood analyzer

    NASA Technical Reports Server (NTRS)

    Coleman, R. L.

    1975-01-01

    Analyzer employs chemical-sensing electrodes for determination of blood, gas, and ion concentrations. It is rugged, easily serviced, and comparatively simple to operate. System can analyze up to eight parameters and can be modified to measure other blood constituents including nonionic species, such as urea, glucose, and oxygen.

  8. Analyzing Peace Pedagogies

    ERIC Educational Resources Information Center

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  9. Analyzing Costs of Services.

    ERIC Educational Resources Information Center

    Cox, James O.; Black, Talbot

    A simplified method to gather and analyze cost data is presented for administrators of Handicapped Children's Early Education Programs, and specifically for members of the Technical Assistance Development System, North Carolina. After identifying benefits and liabilities associated with analyzing program costs, attention is focused on the internal…

  10. Analyzing water resources

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Report on water resources discusses problems in water measurement, demand, use, and availability. Also discussed are sensing accuracies, parameter monitoring, and status of forecasting, modeling, and future measurement techniques.

  11. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input

  12. Software Design Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows for expression of a design as a picture of the program.

  13. Automatic amino acid analyzer

    NASA Technical Reports Server (NTRS)

    Berdahl, B. J.; Carle, G. C.; Oyama, V. I.

    1971-01-01

    Analyzer operates unattended or up to 15 hours. It has an automatic sample injection system and can be programmed. All fluid-flow valve switching is accomplished pneumatically from miniature three-way solenoid pilot valves.

  14. Generating and Analyzing Data.

    ERIC Educational Resources Information Center

    Stevens, Jill

    1993-01-01

    Presents activities in which students develop and analyze scatterplots on graphing calculators to model corn growth, decay, a box of maximum volume, and weather prediction. Provides reproducible worksheets. (MDH)

  15. Soil Rock Analyzer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A redesigned version of a soil/rock analyzer developed by Martin Marietta under a Langley Research Center contract is being marketed by Aurora Tech, Inc. Known as the Aurora ATX-100, it has self-contained power, an oscilloscope, a liquid crystal readout, and a multichannel spectrum analyzer. It measures energy emissions to determine what elements in what percentages a sample contains. It is lightweight and may be used for mineral exploration, pollution monitoring, etc.

  16. Managing healthcare information: analyzing trust.

    PubMed

    Söderström, Eva; Eriksson, Nomie; Åhlfeldt, Rose-Mharie

    2016-08-01

    Purpose - The purpose of this paper is to analyze two case studies with a trust matrix tool, to identify trust issues related to electronic health records. Design/methodology/approach - A qualitative research approach is applied using two case studies. The data analysis of these studies generated a problem list, which was mapped to a trust matrix. Findings - Results demonstrate flaws in current practices and point to achieving balance between organizational, person and technology trust perspectives. The analysis revealed three challenge areas, to: achieve higher trust in patient-focussed healthcare; improve communication between patients and healthcare professionals; and establish clear terminology. By taking trust into account, a more holistic perspective on healthcare can be achieved, where trust can be obtained and optimized. Research limitations/implications - A trust matrix is tested and shown to identify trust problems on different levels and relating to trusting beliefs. Future research should elaborate and more fully address issues within three identified challenge areas. Practical implications - The trust matrix's usefulness as a tool for organizations to analyze trust problems and issues is demonstrated. Originality/value - Healthcare trust issues are captured to a greater extent and from previously unchartered perspectives. PMID:27477934

  17. Analyzing Mode Confusion via Model Checking

    NASA Technical Reports Server (NTRS)

    Luettgen, Gerald; Carreno, Victor

    1999-01-01

    Mode confusion is one of the most serious problems in aviation safety. Today's complex digital flight decks make it difficult for pilots to maintain awareness of the actual states, or modes, of the flight deck automation. NASA Langley leads an initiative to explore how formal techniques can be used to discover possible sources of mode confusion. As part of this initiative, a flight guidance system was previously specified as a finite Mealy automaton, and the theorem prover PVS was used to reason about it. The objective of the present paper is to investigate whether state-exploration techniques, especially model checking, are better able to achieve this task than theorem proving and also to compare several verification tools for the specific application. The flight guidance system is modeled and analyzed in Murphi, SMV, and Spin. The tools are compared regarding their system description language, their practicality for analyzing mode confusion, and their capabilities for error tracing and for animating diagnostic information. It turns out that their strengths are complementary.
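
    To show the flavor of the state-exploration approach discussed above, here is a minimal sketch: a toy Mealy-style mode machine (entirely hypothetical, and far simpler than the flight guidance system analyzed in the paper) is searched exhaustively for a reachable state in which the actual mode and the displayed mode disagree:

        # Breadth-first exploration of all reachable (actual, displayed)
        # mode pairs; a pair whose components differ is a mode-confusion
        # counterexample, returned together with the event trace.
        from collections import deque

        def step(state, event):
            actual, shown = state
            if event == "engage_vs":
                return ("VS", "VS")
            if event == "capture_alt":
                return ("ALT_HOLD", shown)   # display not updated: the bug
            if event == "sync":
                return (actual, actual)
            return state

        def find_confusion(init=("PITCH", "PITCH"),
                           events=("engage_vs", "capture_alt", "sync")):
            seen, queue = {init}, deque([(init, [])])
            while queue:
                state, trace = queue.popleft()
                if state[0] != state[1]:
                    return state, trace      # counterexample trace
                for e in events:
                    nxt = step(state, e)
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, trace + [e]))
            return None

        print(find_confusion())  # a confused state and the events reaching it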

  18. Fiber optic multiple blood gas analyzer

    NASA Astrophysics Data System (ADS)

    Rademaker, Diane M.; Zimmerman, Donald E.; James, Kenneth A.; Quick, William H.

    1994-07-01

    Blood gas analysis has been shown to be the most critical factor in determining patient survivability in a trauma care environment. Present techniques of non-invasive measurement of blood gases in the trauma care unit such as optical pulse oximetry and transcutaneous electrodes are inadequate due to complexity and inaccuracy. The crux of the solution to this problem is the application of a recent, DOD/NASA developed micro-optic spectrophotometer to perform blood gas analysis via fiber optic transmission. The newly developed blood gas analyzer described here will not only overcome the aforementioned drawbacks but also be highly accurate, durable, and safe in hazardous environments: e.g., oxygen rich environments. This spectrophotometer is driven by a microprocessor based "Kalman filter" algorithm which not only controls the monitoring of all the patients in the care center but also separates the patient's superimposed blood gas spectra into its individual components to allow a number of gases critical for trauma care to be analyzed simultaneously.
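
    The instrument's recursive Kalman-filter algorithm is not published in this abstract, so the sketch below substitutes plain batch least squares, which solves the same separation problem for a static mixture: given known reference spectra, unmix a superimposed measurement into per-gas concentrations. All spectra and names here are synthetic stand-ins, not the device's calibration data.

      import numpy as np

      # Illustrative reference spectra for three gases, one column each.
      # Real references would come from instrument calibration; Gaussian
      # bands are synthetic stand-ins.
      wl = np.linspace(0.0, 1.0, 64)

      def band(center, width):
          return np.exp(-(((wl - center) / width) ** 2))

      A = np.column_stack([band(0.25, 0.05), band(0.50, 0.08), band(0.75, 0.04)])

      # Synthetic "patient" spectrum: a hidden mixture of the gases plus noise.
      true_conc = np.array([0.6, 1.2, 0.3])
      rng = np.random.default_rng(1)
      measured = A @ true_conc + 0.01 * rng.normal(size=wl.size)

      # Separate the superimposed spectrum into its individual components.
      # (A Kalman filter does this recursively as samples stream in; ordinary
      # least squares is the batch equivalent for a static mixture.)
      conc, *_ = np.linalg.lstsq(A, measured, rcond=None)
      print(conc)   # approximately [0.6, 1.2, 0.3]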

  19. The "Performance of Rotavirus and Oral Polio Vaccines in Developing Countries" (PROVIDE) study: description of methods of an interventional study designed to explore complex biologic problems.

    PubMed

    Kirkpatrick, Beth D; Colgate, E Ross; Mychaleckyj, Josyf C; Haque, Rashidul; Dickson, Dorothy M; Carmolli, Marya P; Nayak, Uma; Taniuchi, Mami; Naylor, Caitlin; Qadri, Firdausi; Ma, Jennie Z; Alam, Masud; Walsh, Mary Claire; Diehl, Sean A; Petri, William A

    2015-04-01

    Oral vaccines appear less effective in children in the developing world. Proposed biologic reasons include concurrent enteric infections, malnutrition, breast milk interference, and environmental enteropathy (EE). Rigorous study design and careful data management are essential to begin to understand this complex problem while assuring research subject safety. Herein, we describe the methodology and lessons learned in the PROVIDE study (Dhaka, Bangladesh). A randomized clinical trial platform evaluated the efficacy of delayed-dose oral rotavirus vaccine as well as the benefit of an injectable polio vaccine replacing one dose of oral polio vaccine. This rigorous infrastructure supported the additional examination of hypotheses of vaccine underperformance. Primary and secondary efficacy and immunogenicity measures for rotavirus and polio vaccines were measured, as well as the impact of EE and additional exploratory variables. Methods for the enrollment and 2-year follow-up of a 700-child birth cohort are described, including core laboratory, safety, regulatory, and data management practices. Intense efforts to standardize clinical, laboratory, and data management procedures in a developing world setting provide clinical trials rigor to all outcomes. Although this study infrastructure requires extensive time and effort, it allows optimized safety and confidence in the validity of data gathered in complex, developing country settings. PMID:25711607

  20. Total organic carbon analyzer

    NASA Technical Reports Server (NTRS)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    1991-01-01

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  1. Using Networks to Visualize and Analyze Process Data for Educational Assessment

    ERIC Educational Resources Information Center

    Zhu, Mengxiao; Shu, Zhan; von Davier, Alina A.

    2016-01-01

    New technology enables interactive and adaptive scenario-based tasks (SBTs) to be adopted in educational measurement. At the same time, it is a challenging problem to build appropriate psychometric models to analyze data collected from these tasks, due to the complexity of the data. This study focuses on process data collected from SBTs. We…
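
    The abstract is truncated here, so the construction below is only an assumption about what a process-data network might look like, not the authors' method: logged actions become nodes, and a directed edge (a, b) is weighted by how often action b immediately follows action a.

      from collections import Counter

      # Hypothetical logs: the ordered action sequence of each student
      # inside a scenario-based task (action names are invented).
      logs = [
          ["open_email", "read_email", "search_web", "reply"],
          ["open_email", "search_web", "read_email", "reply"],
          ["open_email", "read_email", "reply"],
      ]

      # Count action-to-action transitions across all students.
      edges = Counter()
      for log in logs:
          edges.update(zip(log, log[1:]))

      for (src, dst), weight in edges.most_common():
          print(f"{src} -> {dst}: {weight}")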

  2. HITCAN: High temperature composite analyzer

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.; Lackney, Joseph J.; Chamis, Christos C.; Murthy, Pappu L. N.

    1990-01-01

    A computer code, HITCAN (High Temperature Composite Analyzer), was developed to analyze/design metal matrix composite structures. HITCAN is based on composite mechanics theories and computer codes developed at NASA LeRC over the last two decades. It is a general purpose code for predicting the global structural and local stress-strain response of multilayered (arbitrarily oriented) metal matrix structures at both the constituent (fiber, matrix, and interphase) and structure levels, including fabrication process effects. The thermomechanical properties of the constituents are considered to be nonlinearly dependent on several parameters, including temperature, stress, and stress rate. The computational procedure employs an incremental iterative nonlinear approach utilizing a multifactor-interaction material behavior model. HITCAN's features and analysis capabilities (static, load stepping, modal, and buckling) are demonstrated through typical example problems.

  3. Electronic sleep analyzer

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.

    1970-01-01

    Electronic instrument automatically monitors the stages of sleep of a human subject. The analyzer provides a series of discrete voltage steps with each step corresponding to a clinical assessment of level of consciousness. It is based on the operation of an EEG and requires very little telemetry bandwidth or time.

  4. List mode multichannel analyzer

    DOEpatents

    Archer, Daniel E.; Luke, S. John; Mauger, G. Joseph; Riot, Vincent J.; Knapp, David A.

    2007-08-07

    A digital list mode multichannel analyzer (MCA) built around a programmable FPGA device for onboard data analysis and on-the-fly modification of system detection/operating parameters, and capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or in list mode as a list mode MCA.

  5. Ultrasonic Transducer Analyzer

    NASA Technical Reports Server (NTRS)

    Grounds, M. K.

    1982-01-01

    Ultrasonic transducer-beam-intensity distributions are determined by analyzing echoes from a spherical ball. Computers control equipment and process data. Important beam characteristics, such as location of best beam focus and beam diameter at focus, can be determined quickly from extensive set of plots generated by apparatus.

  6. Micro acoustic spectrum analyzer

    DOEpatents

    Schubert, W. Kent; Butler, Michael A.; Adkins, Douglas R.; Anderson, Larry F.

    2004-11-23

    A micro acoustic spectrum analyzer for determining the frequency components of a fluctuating sound signal comprises a microphone to pick up the fluctuating sound signal and produce an alternating current electrical signal; at least one microfabricated resonator, each resonator having a different resonant frequency, that vibrates in response to the alternating current electrical signal; and at least one detector to detect the vibration of the microfabricated resonators. The micro acoustic spectrum analyzer can further comprise a mixer to mix a reference signal with the alternating current electrical signal from the microphone to shift the frequency spectrum to a frequency range that is better matched to the resonant frequencies of the microfabricated resonators. The micro acoustic spectrum analyzer can be designed specifically for portability, size, cost, accuracy, speed, power requirements, and use in a harsh environment. The micro acoustic spectrum analyzer is particularly suited for applications where size, accessibility, and power requirements are limited, such as the monitoring of industrial equipment and processes, detection of security intrusions, or evaluation of military threats.

  7. PULSE AMPLITUDE ANALYZER

    DOEpatents

    Greenblatt, M.H.

    1958-03-25

    This patent pertains to pulse amplitude analyzers for sorting and counting a series of pulses, and specifically discloses an analyzer which is simple in construction and presents the pulse height distribution visually on an oscilloscope screen. According to the invention, the pulses are applied to the vertical deflection plates of an oscilloscope and trigger the horizontal sweep. Each pulse starts at the same point on the screen and has a maximum amplitude substantially along the same vertical line. A mask is placed over the screen except for a slot running along the line where the maximum amplitudes of the pulses appear. After the slot has been scanned by a photocell in combination with a slotted rotating disk, the photocell signal is displayed on an auxiliary oscilloscope as vertical deflection along a horizontal time base to portray the pulse amplitude distribution.

  8. Magnetoresistive Emulsion Analyzer

    PubMed Central

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G.

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening. PMID:23989504

  10. PULSE AMPLITUDE ANALYZER

    DOEpatents

    Gray, G.W.; Jensen, A.S.

    1957-10-22

    A pulse-height analyzer system of improved design for sorting and counting a series of pulses, such as provided by a scintillation detector in nuclear radiation measurements, is described. The analyzer comprises a main transmission line, a cathode-ray tube for each section of the line with its deflection plates acting as the line capacitance; means to bias the respective cathode ray tubes so that the beam strikes a target only when a prearranged pulse amplitude is applied, with each tube progressively biased to respond to smaller amplitudes; pulse generating and counting means associated with each tube to respond when the beam is deflected; a control transmission line having the same time constant as the first line per section with pulse generating means for each tube for initiating a pulse on the second transmission line when a pulse triggers the tube of corresponding amplitude response, the former pulse acting to prevent successive tubes from responding to the pulse under test. This arrangement permits greater deflection sensitivity in the cathode ray tube and overcomes many of the disadvantages of prior art pulse-height analyzer circuits.

  11. Portable Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The Michromonitor M500 universal gas analyzer contains a series of miniature modules, each of which is a complete gas chromatograph, an instrument which separates a gaseous mixture into its components and measures the concentrations of each gas in the mixture. The system is manufactured by Microsensor Technology, and is used for environmental analysis, monitoring for gas leaks and chemical spills, compliance with pollution laws, etc. The technology is based on a Viking attempt to detect life on Mars. Ames/Stanford miniaturized the system and NIOSH funded further development. Three Stanford researchers commercialized the technology, which can be operated by unskilled personnel.

  12. Mineral/Water Analyzer

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An x-ray fluorescence spectrometer developed for the Viking Landers by Martin Marietta was modified for geological exploration, water quality monitoring, and aircraft engine maintenance. The aerospace system was highly miniaturized and used very little power. It irradiates the sample causing it to emit x-rays at various energies, then measures the energy levels for sample composition analysis. It was used in oceanographic applications and modified to identify element concentrations in ore samples, on site. The instrument can also analyze the chemical content of water, and detect the sudden development of excessive engine wear.

  13. Fluorescence analyzer for lignin

    DOEpatents

    Berthold, John W.; Malito, Michael L.; Jeffers, Larry

    1993-01-01

    A method and apparatus for measuring lignin concentration in a sample of wood pulp or black liquor comprises a light emitting arrangement for emitting an excitation light through optical fiber bundles into a probe which has an undiluted sensing end facing the sample. The excitation light causes the lignin concentration to produce fluorescent emission light, which is then conveyed through the probe to analyzing equipment that measures the intensity of the emission light. This invention was made with Government support under Contract Number DE-FC05-90CE40905 awarded by the Department of Energy (DOE). The Government has certain rights in this invention.

  14. RELAPS desktop analyzer

    SciTech Connect

    Beelman, R.J.; Grush, W.H.; Mortensen, G.A.; Snider, D.M.; Wagner, K.L.

    1989-01-01

    The previously mainframe-bound RELAP5 reactor safety computer code has been installed on a microcomputer. A simple color-graphic display driver has been developed to enable the user to view the code results as the calculation advances. In order to facilitate future interactive desktop applications, the Nuclear Plant Analyzer (NPA), also previously mainframe-bound, is being redesigned to encompass workstation applications. The marriage of RELAP5 simulation capabilities with NPA interactive graphics on a desktop workstation promises to revolutionize reactor safety analysis methodology. 8 refs.

  15. Design Problems for Secondary Students

    ERIC Educational Resources Information Center

    Jonassen, David H.

    2011-01-01

    Are there different kinds of design problems? Jonassen (2011) argued that problems vary in terms of structuredness, complexity, and context. On the structuredness and complexity continua, design problems tend to be the most ill-structured and complex. Brown and Chandrasekaran suggest that design problems may vary along a continuum from…

  16. Gauge cooling for the singular-drift problem in the complex Langevin method — a test in Random Matrix Theory for finite density QCD

    NASA Astrophysics Data System (ADS)

    Nagata, Keitaro; Nishimura, Jun; Shimasaki, Shinji

    2016-07-01

    Recently, the complex Langevin method has been applied successfully to finite density QCD either in the deconfinement phase or in the heavy dense limit with the aid of a new technique called the gauge cooling. In the confinement phase with light quarks, however, convergence to wrong limits occurs due to the singularity in the drift term caused by small eigenvalues of the Dirac operator including the mass term. We propose that this singular-drift problem should also be overcome by the gauge cooling with different criteria for choosing the complexified gauge transformation. The idea is tested in chiral Random Matrix Theory for finite density QCD, where exact results are reproduced at zero temperature with light quarks. It is shown that the gauge cooling indeed changes drastically the eigenvalue distribution of the Dirac operator measured during the Langevin process. Despite its non-holomorphic nature, this eigenvalue distribution has a universal diverging behavior at the origin in the chiral limit due to a generalized Banks-Casher relation as we confirm explicitly.

  17. A study of the complex action problem in a simple model for dynamical compactification in superstring theory using the factorization method.

    NASA Astrophysics Data System (ADS)

    Anagnostopoulos, K.; Azuma, T.; Nishimura, J.

    The IIB matrix model proposes a mechanism for dynamically generating four-dimensional space-time in string theory by spontaneous breaking of the ten-dimensional rotational symmetry SO(10). Calculations using the Gaussian expansion method (GEM) lend support to this conjecture. We study a simple SO(4) invariant matrix model using Monte Carlo simulations and we confirm that its rotational symmetry breaks down, showing that lower dimensional configurations dominate the path integral. The model has a strong complex action problem, and the calculations were made possible by the use of the factorization method on the density of states ρ_n(x) of properly normalized eigenvalues λ̃_n of the space-time moment of inertia tensor. We study scaling properties of the factorized terms of ρ_n(x) and we find them in agreement with simple scaling arguments. These can be used in the finite size scaling extrapolation and in the study of the region of configuration space obscured by the large fluctuations of the phase. The computed values of λ̃_n are in reasonable agreement with GEM calculations, and a numerical method for comparing the free energy of the corresponding ansätze is proposed and tested.

  18. Ring Image Analyzer

    NASA Technical Reports Server (NTRS)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software has been developed for the needs of an R&TD-funded project and has become an important asset for the future research proposal to NASA as well as other agencies.
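
    The abstract does not disclose the recognition algorithm itself, but the ellipse-parameter step it describes (centroid, axis ratio, tilt) can be sketched with an ordinary least-squares conic fit; the function below is illustrative, not the NASA code.

      import numpy as np

      def fit_ellipse(x, y):
          # Least-squares fit of the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1.
          D = np.column_stack([x * x, x * y, y * y, x, y])
          a, b, c, d, e = np.linalg.lstsq(D, np.ones_like(x), rcond=None)[0]

          # Centroid: where both partial derivatives of the conic vanish.
          x0, y0 = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])

          # Eigen-decomposition of the quadratic form: the smaller eigenvalue
          # belongs to the long axis (axis lengths go as 1/sqrt(eigenvalue)).
          evals, evecs = np.linalg.eigh(np.array([[a, b / 2], [b / 2, c]]))
          axis_ratio = np.sqrt(evals[0] / evals[1])      # short / long
          tilt = np.arctan2(evecs[1, 0], evecs[0, 0])    # long-axis angle
          return (x0, y0), axis_ratio, tilt

      # Quick check on a noisy synthetic ellipse (center (0.5, -0.2), ratio 1/3).
      t = np.linspace(0.0, 2.0 * np.pi, 200)
      rng = np.random.default_rng(0)
      x = 3.0 * np.cos(t) + 0.5 + 0.01 * rng.normal(size=t.size)
      y = 1.0 * np.sin(t) - 0.2 + 0.01 * rng.normal(size=t.size)
      print(fit_ellipse(x, y))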

  19. Analyzing Aeroelasticity in Turbomachines

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Srivastava, R.

    2003-01-01

    ASTROP2-LE is a computer program that predicts flutter and forced responses of blades, vanes, and other components of such turbomachines as fans, compressors, and turbines. ASTROP2-LE is based on the ASTROP2 program, developed previously for analysis of stability of turbomachinery components. In developing ASTROP2- LE, ASTROP2 was modified to include a capability for modeling forced responses. The program was also modified to add a capability for analysis of aeroelasticity with mistuning and unsteady aerodynamic solutions from another program, LINFLX2D, that solves the linearized Euler equations of unsteady two-dimensional flow. Using LINFLX2D to calculate unsteady aerodynamic loads, it is possible to analyze effects of transonic flow on flutter and forced response. ASTROP2-LE can be used to analyze subsonic, transonic, and supersonic aerodynamics and structural mistuning for rotors with blades of differing structural properties. It calculates the aerodynamic damping of a blade system operating in airflow so that stability can be assessed. The code also predicts the magnitudes and frequencies of the unsteady aerodynamic forces on the airfoils of a blade row from incoming wakes. This information can be used in high-cycle fatigue analysis to predict the fatigue lives of the blades.

  20. Plutonium solution analyzer

    SciTech Connect

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  1. Multiple capillary biochemical analyzer

    DOEpatents

    Dovichi, Norman J.; Zhang, Jian Z.

    1995-01-01

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibres to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands.

  2. Field Deployable DNA analyzer

    SciTech Connect

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step to make an autonomous deployable instrument. We perform sample clean up and concentration in a flow through packed bed. For small initial samples, whole genome amplification is performed in the packed bed resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we set out to learn whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof-of-principle assay.

  3. Multiple capillary biochemical analyzer

    DOEpatents

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  4. Exploiting phase transitions for fusion optimization problems

    NASA Astrophysics Data System (ADS)

    Svenson, Pontus

    2005-05-01

    Many optimization problems that arise in multi-target tracking and fusion applications are known to be NP-complete, i.e., believed to have worst-case complexities that are exponential in problem size. Recently, many such NP-complete problems have been shown to display threshold phenomena: it is possible to define a parameter such that the probability of a random problem instance having a solution jumps from 1 to 0 at a specific value of the parameter. It is also found that the amount of resources needed to solve the problem instance peaks at the transition point. Among the problems found to display this behavior are graph coloring (a.k.a. clustering, relevant for multi-target tracking), satisfiability (which occurs in resource allocation and planning problems), and the travelling salesperson problem. Physicists studying these problems have found intriguing similarities to phase transitions in spin models of statistical mechanics. Many methods previously used to analyze spin glasses have been used to explain some of the properties of the behavior at the transition point. It turns out that the transition happens because the fitness landscape of the problem changes as the parameter is varied. Some algorithms have been introduced that exploit this knowledge of the structure of the fitness landscape. In this paper, we review some of the experimental and theoretical work on threshold phenomena in optimization problems and indicate how optimization problems from tracking and sensor resource allocation could be analyzed using these results.
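
    The satisfiability threshold described above is easy to reproduce at toy scale. The brute-force sketch below estimates the probability that a random 3-SAT instance is solvable as the clause-to-variable ratio sweeps through the well-known critical value near 4.27; at this tiny size the transition is visible, though smoothed by finite-size effects.

      import random
      from itertools import product

      def random_3sat(n_vars, n_clauses, rng):
          # Random 3-SAT: each clause has 3 distinct variables, random signs.
          return [[v if rng.random() < 0.5 else -v
                   for v in rng.sample(range(1, n_vars + 1), 3)]
                  for _ in range(n_clauses)]

      def satisfiable(clauses, n_vars):
          # Brute force over all assignments; fine for the tiny n used here.
          return any(
              all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
                  for clause in clauses)
              for bits in product([False, True], repeat=n_vars))

      rng, n, trials = random.Random(0), 10, 40
      for ratio in (2.0, 3.0, 4.0, 4.27, 5.0, 6.0):
          m = round(ratio * n)
          sat = sum(satisfiable(random_3sat(n, m, rng), n) for _ in range(trials))
          print(f"clauses/vars = {ratio:4.2f}  P(sat) ~ {sat / trials:.2f}")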

  5. Analyzing crime scene videos

    NASA Astrophysics Data System (ADS)

    Cunningham, Cindy C.; Peloquin, Tracy D.

    1999-02-01

    Since late 1996 the Forensic Identification Services Section of the Ontario Provincial Police has been actively involved in state-of-the-art image capture and the processing of video images extracted from crime scene videos. The benefits and problems of this technology for video analysis are discussed. All analysis is being conducted on SUN Microsystems UNIX computers, networked to a digital disk recorder that is used for video capture. The primary advantage of this system over traditional frame grabber technology is reviewed. Examples from actual cases are presented and the successes and limitations of this approach are explored. Suggestions to companies implementing security technology plans for various organizations (banks, stores, restaurants, etc.) will be made. Future directions for this work and new technologies are also discussed.

  6. Analyzing geographic clustered response

    SciTech Connect

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and to refine the algorithm. 21 refs., 15 figs., 2 tabs.

  7. PULSE AMPLITUDE ANALYZERS

    DOEpatents

    Gray, G.W.; Jensen, A.S.

    1958-06-01

    An analyzer system incorporating a cathode-ray tube and linearly spaced targets masked by a plate having slits at points corresponding to the location of the targets is described. The advantages of the system include a reduction in the required amplifier bandwidth and also a reduction in possible double counting of a pulse by striking two targets. The system comprises integrating means for each pulse, the signal from which is applied to a pair of deflection plates, and a control circuit for turning on the electron beam when the pulse has almost reached its maximum value. The mask prevents the beam from overlapping on a target adjacent to the proper one, while a control circuit responsive to the target output signals acts to cut off the beam immediately after the beam strikes a target to permit the beam to impinge on only one target.

  8. Moving particle composition analyzer

    NASA Technical Reports Server (NTRS)

    Auer, S. O. (Inventor)

    1976-01-01

    A mass spectrometry apparatus for analyzing the composition of moving microscopic particles is introduced. The apparatus includes a capacitor with a front electrode upon which the particles impinge, a back electrode, and a solid dielectric sandwiched between the front and back electrodes. In one embodiment, the electrodes and dielectric are arcuately shaped as concentric peripheral segments of different spheres having a common center and different radii. The front electrode and dielectric together have a thickness such that an impinging particle can penetrate them. In a second embodiment, the capacitor has planar, parallel electrodes, in which case the ejected positive ions are deflected downstream of a planar grid by a pair of spaced, arcuate capacitor plates having a region between them through which the ejected ions travel.

  9. Residual gas analyzer calibration

    NASA Technical Reports Server (NTRS)

    Lilienkamp, R. H.

    1972-01-01

    A technique which employs known gas mixtures to calibrate the residual gas analyzer (RGA) is described. The mass spectra from the RGA are recorded for each gas mixture. This mass spectra data and the mixture composition data each form a matrix. From the two matrices the calibration matrix may be computed. The matrix mathematics requires the number of calibration gas mixtures to be equal to or greater than the number of gases included in the calibration. This technique was evaluated using a mathematical model of an RGA to generate the mass spectra. This model included shot noise errors in the mass spectra. Errors in the gas concentrations were also included in the evaluation. The effects of these errors were studied by varying their magnitudes and comparing the resulting calibrations. Several methods of evaluating an actual calibration are presented. The effects of the number of gases in the calibration, the composition of the calibration mixtures, and the number of mixtures used are discussed.
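
    The abstract does not spell out the matrix formulation, so the sketch below adopts one consistent reading: a linear response model S = C K, where rows of C are mixture compositions and rows of S the corresponding spectra, solved by least squares and then inverted for an unknown sample.

      import numpy as np

      # Rows of C: gas concentrations in each calibration mixture.
      # Rows of S: the RGA spectrum recorded for that mixture (3 mass channels).
      C = np.array([[1.0, 0.0],        # pure gas 1
                    [0.0, 1.0],        # pure gas 2
                    [0.5, 0.5]])       # 50/50 blend
      S = np.array([[9.8, 1.1, 0.2],
                    [0.3, 4.9, 7.1],
                    [5.1, 3.0, 3.6]])  # simulated spectra

      # Calibration matrix K from S = C K; least squares handles the
      # overdetermined case (at least as many mixtures as gases, as required).
      K, *_ = np.linalg.lstsq(C, S, rcond=None)

      # Invert the calibration: recover an unknown mixture from its spectrum.
      unknown_spectrum = np.array([7.4, 2.0, 1.9])
      composition, *_ = np.linalg.lstsq(K.T, unknown_spectrum, rcond=None)
      print(composition)               # approximate gas concentrations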

  10. Analyzing Next to Nothing

    NASA Astrophysics Data System (ADS)

    Taylor, G. J.

    2000-04-01

    Analytical techniques have advanced so far that it is possible to slice up a sample only 10 micrometers across (with a mass of only a billionth of a gram) so that a dozen microanalytical techniques can be used to extract fascinating, crucial information about the sample's history. This astonishing ability is useful in analyzing interplanetary dust collected in the stratosphere, tiny interstellar grains in meteorites, sparse and wispy weathering products in Martian meteorites, and samples to be collected and returned to Earth by current and future sample return missions from comets, asteroids, Martian moons, and Mars. The importance of the array of techniques available to cosmochemists has been documented by Michael Zolensky (Johnson Space Center), Carle Pieters (Brown University), Benton Clark (Lockheed Martin Astronautics, Denver), and James Papike (University of New Mexico), with special attention to sample-return missions.

  11. Analyzing a Cometary 'Sneeze'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Figure 1: Analyzing a Cometary 'Sneeze'

    This display shows highly processed images of the outburst of comet Tempel 1 between June 22 and 23, 2005. The pictures were taken by Deep Impact's medium-resolution camera. An average image of the comet has been subtracted from each picture to provide an enhanced view of the outburst. The intensity has also been stretched to show the faintest parts. This processing enables measurement of the outflow speed and the details of the dissipation of the outburst. The left image was taken when the comet was very close to its normal, non-bursting state, so almost nothing is visible.

  12. Analyzing Water's Optical Absorption

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc.

  13. Motion detector and analyzer

    DOEpatents

    Unruh, W.P.

    1987-03-23

    Method and apparatus are provided for deriving positive and negative Doppler spectra to enable analysis of objects in motion, and particularly, objects having rotary motion. First and second returned radar signals are mixed with internal signals to obtain an in-phase process signal and a quadrature process signal. A broad-band phase shifter shifts the quadrature signal through 90 degrees relative to the in-phase signal over a predetermined frequency range. A pair of signals is output from the broad-band phase shifter which are then combined to provide a first side band signal which is functionally related to a negative Doppler shift spectrum. The distinct positive and negative Doppler spectra may then be analyzed for the motion characteristics of the object being examined.

  14. ROBOT TASK SCENE ANALYZER

    SciTech Connect

    William R. Hamel; Steven Everett

    2000-08-01

    Environmental restoration and waste management (ER and WM) challenges in the United States Department of Energy (DOE), and around the world, involve radiation or other hazards which will necessitate the use of remote operations to protect human workers from dangerous exposures. Remote operations carry the implication of greater costs since remote work systems are inherently less productive than contact human work due to the inefficiencies/complexities of teleoperation. To reduce costs and improve quality, much attention has been focused on methods to improve the productivity of combined human operator/remote equipment systems; the achievements to date are modest at best. The most promising avenue in the near term is to supplement conventional remote work systems with robotic planning and control techniques borrowed from manufacturing and other domains where robotic automation has been used. Practical combinations of teleoperation and robotic control will yield telerobotic work systems that outperform currently available remote equipment. It is believed that practical telerobotic systems may increase remote work efficiencies significantly. Increases of 30% to 50% have been conservatively estimated for typical remote operations. It is important to recognize that the basic hardware and software features of most modern remote manipulation systems can readily accommodate the functionality required for telerobotics. Further, several of the additional system ingredients necessary to implement telerobotic control--machine vision, 3D object and workspace modeling, automatic tool path generation and collision-free trajectory planning--already exist.

  15. Monte Carlo techniques for analyzing deep-penetration problems

    SciTech Connect

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1986-02-01

    Current methods and difficulties in Monte Carlo deep-penetration calculations are reviewed, including statistical uncertainty and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multigroup Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications.

  16. Monte Carlo techniques for analyzing deep penetration problems

    SciTech Connect

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1985-01-01

    A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications. 29 refs.
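
    Two of the variance-reduction devices named in these reviews, implicit capture and Russian roulette, fit in a few lines. The slab-transmission sketch below is generic textbook Monte Carlo, not code from any of the systems reviewed.

      import math
      import random

      def transmission(tau, p_scatter=0.5, histories=100_000, w_cut=0.1, seed=1):
          # Estimate transmission through a 1-D slab of optical thickness tau,
          # using implicit capture (weight *= scatter probability per collision)
          # and Russian roulette on low-weight histories.
          rng = random.Random(seed)
          transmitted = 0.0
          for _ in range(histories):
              x, mu, w = 0.0, 1.0, 1.0      # position, direction cosine, weight
              while True:
                  x += mu * -math.log(1.0 - rng.random())  # sample free path
                  if x >= tau:              # escaped through the far face
                      transmitted += w
                      break
                  if x < 0.0:               # leaked back out the near face
                      break
                  w *= p_scatter            # implicit capture (no analog kill)
                  mu = rng.uniform(-1.0, 1.0)   # isotropic re-direction
                  if w < w_cut:             # Russian roulette on weak histories
                      if rng.random() < 0.5:
                          break             # killed
                      w *= 2.0              # survivor weight doubled (unbiased)
          return transmitted / histories

      print(transmission(tau=5.0))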

  17. Analyzing Human Communication Networks in Organizations: Applications to Management Problems.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Danowski, James A.

    Investigating the networks of communication in organizations leads to an understanding of efficient and inefficient information dissemination as practiced in large systems. Most important in organizational communication is the role of the "liaison person"--the coordinator of intercommunication. When functioning efficiently, coordinators maintain…

  18. Use of a genetic algorithm to analyze robust stability problems

    SciTech Connect

    Murdock, T.M.; Schmitendorf, W.E.; Forrest, S.

    1990-01-01

    This note presents a genetic algorithm technique for testing the stability of a characteristic polynomial whose coefficients are functions of unknown but bounded parameters. This technique is fast and can handle a large number of parametric uncertainties. We also use this method to determine robust stability margins for uncertain polynomials. Several benchmark examples are included to illustrate the two uses of the algorithm. 27 refs., 4 figs.
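
    A minimal version of the idea, with an invented polynomial family and parameter box: let a genetic algorithm search the bounded parameters for the most destabilizing coefficients; if the rightmost root found ever reaches the right half plane, robust stability fails.

      import numpy as np

      rng = np.random.default_rng(0)

      def rightmost_root(q):
          # Characteristic polynomial s^3 + q1*s^2 + (2 + q2)*s + 1; the
          # coefficient functions and bounds are invented for illustration.
          q1, q2 = q
          return np.roots([1.0, q1, 2.0 + q2, 1.0]).real.max()

      lo, hi = np.array([0.5, -1.5]), np.array([3.0, 1.5])   # parameter box
      # (Routh: stable iff q1*(2 + q2) > 1, so this box does contain
      # destabilizing corners the GA should discover.)

      pop = rng.uniform(lo, hi, size=(40, 2))
      best = -np.inf
      for gen in range(60):
          fitness = np.array([rightmost_root(q) for q in pop])
          best = max(best, fitness.max())
          parents = pop[np.argsort(-fitness)[:10]]           # most destabilizing
          i, j = rng.integers(0, 10, (2, 40))
          pop = (parents[i] + parents[j]) / 2                # crossover: averaging
          pop = np.clip(pop + rng.normal(0, 0.1, (40, 2)), lo, hi)  # mutation

      print("worst-case rightmost real part:", best)
      print("robustly stable" if best < 0 else "destabilizing parameters found")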

  19. Nonlinear Single Spin Spectrum Analyzer

    NASA Astrophysics Data System (ADS)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2014-03-01

    Qubits have been used as linear spectrum analyzers of their environments, through the use of decoherence spectroscopy. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compare the performance of equidistant vs. Uhrig modulation schemes for spectral analysis. Phys. Rev. Lett. 110, 110503 (2013). Synopsis at http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.110.110503

  20. Gradients in analyzability.

    PubMed

    Grotstein, J S

    A discussion of "Some Communicative Properties of the Bipersonal Field" by Robert Langs, M.D. In response to Dr. Langs' delineation of the bipersonal field concept and his clinical elaboration of a triad of disorders graded into classifications of descending analyzability (Types A, B, and C fields), I confirm his thesis and endeavor to demonstrate some underlying foundations of his categorical assumptions: namely, the conception of projective identification, the intactness of the background object of primary identification, the conception of a dual-track theory of infantile development (in order to delineate the parallel between the separated self and the continuation of primary identification), and the postulation of manic and schizoid types of narcissistic character disorders (Types B and C respectively). All of these conceptions are vicissitudes of the varying ways in which patients confront the depressive position of separation-individuation with rapprochement and, thereby, conform to a gradient in which symbolization interpretations can be utilized in analytic treatment. PMID:738806

  1. Analyzing Atmospheric Neutrino Oscillations

    SciTech Connect

    Escamilla, J.; Ernst, D. J.; Latimer, D. C.

    2007-10-26

    We provide a pedagogic derivation of the formula needed to analyze atmospheric data and then derive, for the subset of the data that are fully-contained events, an analysis tool that is quantitative and numerically efficient. Results for the full set of neutrino oscillation data are then presented. We find the following preliminary results: 1.) the sub-dominant approximation provides reasonable values for the best fit parameters for δ_32, θ_23, and θ_13 but does not quantitatively provide the errors for these three parameters; 2.) the size of the MSW effect is suppressed in the sub-dominant approximation; 3.) the MSW effect reduces somewhat the extracted error for δ_32, more so for θ_23 and θ_13; 4.) atmospheric data alone constrain the allowed values of θ_13 only in the sub-dominant approximation; the full three-neutrino calculation requires CHOOZ to get a clean constraint; 5.) the linear-in-θ_13 terms are not negligible; and 6.) the minimum value of θ_13 is found to be negative, but at a statistically insignificant level.
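
    For orientation, the dominant behavior behind such atmospheric fits is the standard two-flavor survival probability, P = 1 - sin^2(2θ) sin^2(1.27 Δm^2 L/E), with Δm^2 in eV^2, L in km, and E in GeV. The helper below uses typical textbook parameter values, not this analysis's fit results.

      import numpy as np

      def survival_probability(L_km, E_GeV, delta_m2_eV2=2.4e-3, sin2_2theta=1.0):
          # Two-flavor muon-neutrino survival probability:
          # P = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E).
          return 1.0 - sin2_2theta * np.sin(1.27 * delta_m2_eV2 * L_km / E_GeV) ** 2

      # Down-going vs. up-going atmospheric neutrinos at 1 GeV:
      print(survival_probability(L_km=15.0, E_GeV=1.0))      # ~1: barely oscillated
      print(survival_probability(L_km=12700.0, E_GeV=1.0))   # strongly oscillated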

  2. PULSE HEIGHT ANALYZER

    DOEpatents

    Johnstone, C.W.

    1958-01-21

    An anticoincidence device is described for a pair of adjacent channels of a multi-channel pulse height analyzer for preventing the lower channel from generating a count pulse in response to an input pulse when the input pulse has sufficient magnitude to reach the upper level channel. The anticoincidence circuit comprises a window amplifier, upper and lower level discriminators, and a biased-off amplifier. The output of the window amplifier is coupled to the inputs of the discriminators, the output of the upper level discriminator is connected to the resistance end of a series R-C network, the output of the lower level discriminator is coupled to the capacitance end of the R-C network, and the grid of the biased-off amplifier is coupled to the junction of the R-C network. In operation each discriminator produces a negative pulse output when the input pulse traverses its voltage setting. As a result of the connections to the R-C network, a trigger pulse will be sent to the biased-off amplifier when the incoming pulse level is sufficient to trigger only the lower level discriminator.

  3. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
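
    MMTAT's internal computational models are not described here; as a generic stand-in, the sketch below does the textbook link-budget arithmetic at the heart of any telecommunication link analysis: EIRP minus free-space path loss and miscellaneous losses, compared against the Eb/N0 the modulation and coding require. All numbers are illustrative.

      import math

      def link_margin_db(tx_power_dbw, tx_gain_dbi, rx_gain_dbi, freq_hz,
                         range_m, data_rate_bps, system_noise_temp_k=200.0,
                         required_ebn0_db=1.0, misc_losses_db=2.0):
          c = 299_792_458.0                                  # speed of light, m/s
          fspl_db = 20.0 * math.log10(4.0 * math.pi * range_m * freq_hz / c)
          eirp_dbw = tx_power_dbw + tx_gain_dbi
          rx_power_dbw = eirp_dbw - fspl_db - misc_losses_db + rx_gain_dbi
          # Noise spectral density N0 = k * T_sys, in dBW/Hz.
          n0_dbw_hz = 10.0 * math.log10(1.380649e-23 * system_noise_temp_k)
          ebn0_db = rx_power_dbw - 10.0 * math.log10(data_rate_bps) - n0_dbw_hz
          return ebn0_db - required_ebn0_db

      # Illustrative X-band downlink, 15 W transmitter, near-Mars range:
      margin = link_margin_db(tx_power_dbw=10 * math.log10(15.0),
                              tx_gain_dbi=25.0, rx_gain_dbi=68.0,
                              freq_hz=8.4e9, range_m=5.5e10,
                              data_rate_bps=10_000)
      print(f"link margin: {margin:.1f} dB")   # positive => link closes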

  4. Pseudostupidity and analyzability.

    PubMed

    Cohn, L S

    1989-01-01

    This paper seeks to heighten awareness of pseudostupidity and the potential analyzability of patients who manifest it by defining and explicating it, reviewing the literature, and presenting in detail the psychoanalytic treatment of a pseudostupid patient. Pseudostupidity is caused by an inhibition of the integration and synthesis of thoughts resulting in a discrepancy between intellectual capacity and apparent intellect. The patient's pseudostupidity was determined in part by his need to prevent his being more successful than father, i.e., defeating his oedipal rival. Knowing and learning were instinctualized. The patient libidinally and defensively identified with father's passive, masochistic position. He needed to frustrate the analyst as he had felt excited and frustrated by his parents' nudity and thwarted by his inhibitions. He wanted to cause the analyst to feel as helpless as he, the patient, felt. Countertransference frustration was relevant and clinically useful in the analysis. Interpretation of evolving relevant issues led to more anxiety and guilt, less pseudostupidity, a heightened alliance, and eventual working through. Negative therapeutic reactions followed the resolution of pseudostupidity. PMID:2708771

  5. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. To reduce the time each TEAMS modeler spends manually preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model, and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  6. Lorentz force particle analyzer

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Thess, André; Moreau, René; Tan, Yanqing; Dai, Shangjun; Tao, Zhen; Yang, Wenzhi; Wang, Bo

    2016-07-01

    A new contactless technique is presented for the detection of micron-sized insulating particles in the flow of an electrically conducting fluid. A transverse magnetic field brakes this flow and tends to become entrained in the flow direction by a Lorentz force, whose reaction force on the magnetic-field-generating system can be measured. The presence of insulating particles suspended in the fluid produces changes in this Lorentz force, generating pulses in it; these pulses enable the particles to be counted and sized. A two-dimensional numerical model that employs a moving mesh method demonstrates the measurement principle when such a particle is present. Two prototypes and a three-dimensional numerical model are used to demonstrate the feasibility of a Lorentz force particle analyzer (LFPA). The findings of this study show that such an LFPA, which offers contactless and on-line quantitative measurements, can be applied to an extensive range of applications. These include measurements of the cleanliness of high-temperature and aggressive molten metal, such as aluminum and steel alloys, and the clean manufacturing of semiconductors.

  7. PULSE HEIGHT ANALYZER

    DOEpatents

    Goldsworthy, W.W.

    1958-06-01

    A differential pulse-height discriminator circuit is described which is readily adaptable for operation in a single-channel pulse-height analyzer. The novel aspect of the circuit lies in the specific arrangement of the differential pulse-height discriminator, which includes two pulse-height discriminators having a common input and an anticoincidence circuit having two interconnected vacuum tubes with a common cathode resistor. Pulses from the output of one discriminator circuit are delayed and coupled to the grid of one of the anticoincidence tubes by a resistor. The output pulses from the other discriminator circuit are coupled through a cathode follower circuit, which has a cathode resistor of such value as to provide a long time constant with the interelectrode capacitance of the tube, to lengthen the output pulses. The pulses are then fed to the grid of the other anticoincidence tube. With such connections of the circuits, only when the incoming pulse has a peak value between the operating levels of the two discriminators does an output pulse occur from the anticoincidence circuit.

  8. Analyzing Visibility Configurations.

    PubMed

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications. PMID:20498504
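
    A pixel-level stand-in for the idea (the paper's version operates on clusters of triangular surfaces, not pixels): the co-occurrence matrix of a binary visibility mask distinguishes configurations that have the same visible fraction but different structure, which plain degree-of-visibility measures cannot.

      import numpy as np

      def cooccurrence(mask, dx=1, dy=0):
          # 2x2 co-occurrence matrix of a binary mask: entry (i, j) is the
          # fraction of pixel pairs at offset (dx, dy) with visibilities (i, j).
          a = mask[:mask.shape[0] - dy, :mask.shape[1] - dx]
          b = mask[dy:, dx:]
          m = np.array([[np.sum((a == i) & (b == j)) for j in (0, 1)]
                        for i in (0, 1)], dtype=float)
          return m / m.sum()

      # Two masks with the same visible fraction but different structure:
      banded = np.tile([0, 1], (8, 4))                          # fine stripes
      block = np.repeat([[0, 1]], 8, axis=0).repeat(4, axis=1)  # two solid halves
      for mask in (banded, block):
          m = cooccurrence(mask)
          # Off-diagonal mass = how often visibility flips between neighbors.
          print(m[0, 1] + m[1, 0])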

  9. Analyzing large biological datasets with association networks

    SciTech Connect

    Karpinets, T. V.; Park, B. H.; Uberbacher, E. C.

    2012-05-25

    Due to advances in high throughput biotechnologies, biological information is being collected in databases at an amazing rate, requiring novel computational approaches for timely processing of the collected data into new knowledge. In this study we address this problem by developing a new approach for discovering modular structure, relationships and regularities in complex data. These goals are achieved by converting records of biological annotations of an object (such as an organism, gene, chemical, or sequence) into networks (Anets) and rules (Arules) of the associated annotations. Anets are based on similarity of annotation profiles of objects and can be further analyzed and visualized, providing a compact bird's-eye view of the most significant relationships in the collected data and a way of clustering and classifying them. Arules are generated by Apriori, considering each record of annotations as a transaction and augmenting each annotation item by its type. Arules provide a way to validate relationships discovered by Anets, producing comprehensive statistics on frequently associated annotations and specific confident relationships among them. A combination of Anets and Arules represents condensed information on associations among the collected data, helping to discover new knowledge and generate hypotheses. As an example we have applied the approach to analyze bacterial metadata from the Genomes OnLine Database. The analysis allowed us to produce a map of sequenced bacterial and archaeal organisms based on their genomic, metabolic and physiological characteristics, with three major clusters of metadata representing bacterial pathogens, environmental isolates, and plant symbionts. A signature profile of clustered annotations of environmental bacteria, when compared with that of pathogens, linked aerobic respiration, high GC content, and large genome size to the diversity of metabolic activities and physiological features of these organisms.
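
    A toy version of the Arules side, under the assumption of invented "type:value" annotations in the spirit described above: count single-item and pair supports (the first Apriori pass) and keep rules whose support and confidence clear a threshold.

      from collections import Counter
      from itertools import combinations

      # Hypothetical typed annotations for a few organism records.
      records = [
          {"respiration:aerobic", "gc:high", "genome:large", "habitat:soil"},
          {"respiration:aerobic", "gc:high", "genome:large", "habitat:marine"},
          {"respiration:anaerobic", "gc:low", "genome:small", "habitat:host"},
          {"respiration:anaerobic", "gc:low", "genome:small", "habitat:host"},
      ]

      n = len(records)
      item_counts = Counter(item for r in records for item in r)
      pair_counts = Counter(p for r in records for p in combinations(sorted(r), 2))

      # Keep rules a -> b with enough support and confidence (one orientation).
      for (a, b), ab in pair_counts.items():
          support, confidence = ab / n, ab / item_counts[a]
          if support >= 0.5 and confidence >= 0.9:
              print(f"{a} -> {b}  (support {support:.2f}, conf {confidence:.2f})")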

  10. Nonlinear dynamical systems analyzer

    NASA Astrophysics Data System (ADS)

    Coffey, Adrian S.; Johnson, Martin; Jones, Robin

    1994-10-01

    Computationally intensive algorithms are an ever more common requirement of modern signal processing. Following the work of Gentleman and Kung, McWhirter, Shepherd and Proudler suggested that certain matrix-orientated algorithms can be mapped onto systolic array architectures for adaptive linear signal processing. This has been extended by Broomhead et al. to the calculation of nonlinear predictive models and applied by Jones et al. to target identification and recognition. We shall show that predictive models are extremely sharp discriminators. Our chosen problem, if implemented as a systolic array, would require 3403 processors, which would result in a high throughput rate at excessive cost. We are developing an efficient sub-optimally implemented systolic array; one processor servicing more than one systolic node. We describe a prototype Heuristic Processor which computes a multi-dimensional, nonlinear, predictive model. It consists of a Radial Basis Function Network and a least squares optimizer using QR decomposition. The optimized solution of a set of simultaneous equations in 81 unknowns is calculated in 150 μs. The QR section emulates a triangular systolic array by the novel use of an array of 40 mature silicon DSP chips costing under $100 each. The DSP chips operate in synchronism at a 50 MHz clock rate, passing data to each other through multi-port memories on a dead-letter-box principle; there are no memory access conflicts and only two-port and three-port memories are required. The processor provides 1 GFlop of computing power per cubic foot of electronics for a component cost of approximately $15,000.
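
    The computation the Heuristic Processor performs can be sketched numerically (a toy 1-D stand-in, not the hardware's data path): fit a Radial Basis Function Network by least squares, with the weights obtained through the QR decomposition that the triangular systolic array emulates.

        import numpy as np

        rng = np.random.default_rng(0)

        # Training data for a 1-D nonlinear predictive model.
        x = rng.uniform(-1, 1, 200)
        y = np.sin(3 * x) + 0.05 * rng.standard_normal(200)

        # Gaussian RBF design matrix (the real processor solved 81 unknowns).
        centers = np.linspace(-1, 1, 15)
        Phi = np.exp(-((x[:, None] - centers[None, :]) / 0.3) ** 2)

        # Least-squares weights via QR decomposition.
        Q, R = np.linalg.qr(Phi)
        w = np.linalg.solve(R, Q.T @ y)

        x_test = np.linspace(-1, 1, 5)
        Phi_test = np.exp(-((x_test[:, None] - centers[None, :]) / 0.3) ** 2)
        print(np.c_[np.sin(3 * x_test), Phi_test @ w])  # truth vs. model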

  11. Digital Microfluidics Sample Analyzer

    NASA Technical Reports Server (NTRS)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  12. Predicting and Analyzing Cellular Networks

    NASA Astrophysics Data System (ADS)

    Singh, Mona

    High-throughput experimental technologies, along with computational predictions, have resulted in large-scale biological networks for numerous organisms. Global analyses of biological networks provide new opportunities for revealing protein functions and pathways, and for uncovering cellular organization principles. In my talk, I will discuss a number of approaches we have developed over the years for the complementary problems of predicting interactions and analyzing interaction networks. First, I will describe a genomic approach for uncovering high-confidence regulatory interactions, and show how it can be effectively combined with a framework for predicting regulatory interactions for proteins with known structural domains but unknown binding specificity. Next, I will describe algorithms for analyzing protein interaction networks in order to uncover protein function and functional modules, and demonstrate the importance of considering the topological structure of interaction networks in order to make high quality predictions. Finally, I will present a framework for explicitly incorporating known attributes of individual proteins into the analysis of biological networks, and utilize it to discover recurring network patterns underlying a range of biological processes.

  13. Some easily analyzable convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.

    1989-01-01

    Convolutional codes have played and will play a key role in the downlink telemetry systems on many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is the notorious difficulty of analyzing them. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms in its power series expansion. This step is quite hard, and for many codes of relatively short constraint lengths, it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths, these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.
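
    For the first analysis step named above, a direct computation is easy to sketch (not code from the report; exponential in the constraint length, as noted): run Dijkstra's algorithm over the code trellis, from the first departure out of the all-zero state back to it, accumulating output weight.

        import heapq

        def free_distance(gens, K):
            # Free distance of a rate-1/n binary convolutional code.
            # gens: generator polynomials as integers (MSB = current input);
            # K: constraint length; a state holds the previous K-1 inputs.
            def step(state, bit):
                reg = (bit << (K - 1)) | state  # shift-register contents
                out = sum(bin(reg & g).count("1") & 1 for g in gens)
                return reg >> 1, out            # next state, output weight

            start, w0 = step(0, 1)              # forced departure with a 1
            dist, heap = {start: w0}, [(w0, start)]
            while heap:
                d, s = heapq.heappop(heap)
                if s == 0:
                    return d                    # first return to the zero state
                if d > dist.get(s, float("inf")):
                    continue
                for bit in (0, 1):
                    ns, w = step(s, bit)
                    if d + w < dist.get(ns, float("inf")):
                        dist[ns] = d + w
                        heapq.heappush(heap, (d + w, ns))

        print(free_distance([0b111, 0b101], K=3))  # classic (7,5) code -> 5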

  14. Soft Decision Analyzer

    NASA Technical Reports Server (NTRS)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform realtime closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths of 1 to 12 bits over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the
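
    A toy illustration of the correlation technique (invented data, not the SDA's firmware): correlate the received soft decisions against the antipodal reference stream over the ±4-bit window mentioned above and take the peak as the detected slip.

        import numpy as np

        rng = np.random.default_rng(1)

        # Transmitted bits; the receiver output has slipped +2 bits from
        # the nominal alignment and carries noise on its soft decisions.
        bits = rng.integers(0, 2, 512)
        nominal, actual = 100, 102
        soft = (2.0 * bits[actual:actual + 256] - 1.0) \
               + 0.5 * rng.standard_normal(256)

        ref = 2.0 * bits - 1.0  # antipodal (+/-1) reference stream

        # Correlate over the +/-4-bit slip window.
        offsets = np.arange(-4, 5)
        scores = [float(soft @ ref[nominal + k: nominal + k + soft.size])
                  for k in offsets]
        print("detected slip:", offsets[int(np.argmax(scores))])  # -> 2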

  15. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  16. Crew Activity Analyzer

    NASA Technical Reports Server (NTRS)

    Murray, James; Kirillov, Alexander

    2008-01-01

    The crew activity analyzer (CAA) is a system of electronic hardware and software for automatically identifying patterns of group activity among crew members working together in an office, cockpit, workshop, laboratory, or other enclosed space. The CAA synchronously records multiple streams of data from digital video cameras, wireless microphones, and position sensors, then plays back and processes the data to identify activity patterns specified by human analysts. The processing greatly reduces the amount of time that the analysts must spend in examining large amounts of data, enabling the analysts to concentrate on subsets of data that represent activities of interest. The CAA has potential for use in a variety of governmental and commercial applications, including planning for crews for future long space flights, designing facilities wherein humans must work in proximity for long times, improving crew training and measuring crew performance in military settings, human-factors and safety assessment, development of team procedures, and behavioral and ethnographic research. The data-acquisition hardware of the CAA (see figure) includes two video cameras: an overhead one aimed upward at a paraboloidal mirror on the ceiling and one mounted on a wall aimed in a downward slant toward the crew area. As many as four wireless microphones can be worn by crew members. The audio signals received from the microphones are digitized, then compressed in preparation for storage. Approximate locations of as many as four crew members are measured by use of a Cricket indoor location system. [The Cricket indoor location system includes ultrasonic/radio beacon and listener units. A Cricket beacon (in this case, worn by a crew member) simultaneously transmits a pulse of ultrasound and a radio signal that contains identifying information. Each Cricket listener unit measures the difference between the times of reception of the ultrasound and radio signals from an identified beacon

  17. Communication complexity and information complexity

    NASA Astrophysics Data System (ADS)

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute the exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result not only strengthens the lower bound on the communication complexity of disjointness by making it exact, but also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information

  18. Modular thermal analyzer routine, volume 1

    NASA Technical Reports Server (NTRS)

    Oren, J. A.; Phillips, M. A.; Williams, D. R.

    1972-01-01

    The Modular Thermal Analyzer Routine (MOTAR) is a general thermal analysis routine with strong capabilities for performing thermal analysis of systems containing flowing fluids, fluid system controls (valves, heat exchangers, etc.), life support systems, and thermal radiation situations. Its modular organization permits the analysis of a very wide range of thermal problems, from simple problems containing a few conduction nodes to those requiring complicated flow and radiation analysis, with each problem type being analyzed with peak computational efficiency and maximum ease of use. The organization and programming methods applied to MOTAR achieved a high degree of computer utilization efficiency in terms of execution time and the storage space required for a given problem. The computer time required to run a given problem on MOTAR is approximately 40 to 50 percent of that required by the currently existing, widely used routines. The computer storage requirement for MOTAR is approximately 25 percent more than that of the most commonly used routines for the simplest problems, but the data storage techniques for the more complicated options should save a considerable amount of space.

  19. Classroom Learning and Achievement: How the Complexity of Classroom Interaction Impacts Students' Learning

    ERIC Educational Resources Information Center

    Podschuweit, Sören; Bernholt, Sascha; Brückmann, Maja

    2016-01-01

    Background: Complexity models have provided a suitable framework in various domains to assess students' educational achievement. Complexity is often used as the analytical focus when regarding learning outcomes, i.e. when analyzing written tests or problem-centered interviews. Numerous studies reveal negative correlations between the complexity of…

  20. Cognitive Complexity of Mathematics Instructional Tasks in a Taiwanese Classroom: An Examination of Task Sources

    ERIC Educational Resources Information Center

    Hsu, Hui-Yu; Silver, Edward A.

    2014-01-01

    We examined geometric calculation with number tasks used within a unit of geometry instruction in a Taiwanese classroom, identifying the source of each task used in classroom instruction and analyzing the cognitive complexity of each task with respect to 2 distinct features: diagram complexity and problem-solving complexity. We found that…

  1. The Universal Traveler, A Soft-Systems Guide to: Creativity, Problem Solving, and the Process of Reaching Goals. [Revised Edition].

    ERIC Educational Resources Information Center

    Koberg, Don; Bagnall, Jim

    This publication provides an organizational scheme for a creative problem solving process. The authors indicate that all problems can benefit from the same logical and orderly process now employed to solve many complex problems. The principles remain constant; only specific methods change. Chapter 1 analyzes the development of creativity and fear…

  2. PROBLEM OF COMPLEX EIGENSYSTEMS IN THE SEMIANALYTICAL SOLUTION FOR ADVANCEMENT OF TIME IN SOLUTE TRANSPORT SIMULATIONS: A NEW METHOD USING REAL ARITHMETIC.

    USGS Publications Warehouse

    Umari, Amjad M. J.; Gorelick, Steven M.

    1986-01-01

    In the numerical modeling of groundwater solute transport, explicit solutions may be obtained for the concentration field at any future time without computing concentrations at intermediate times. The spatial variables are discretized and time is left continuous in the governing differential equation. These semianalytical solutions have been presented in the literature and involve the eigensystem of a coefficient matrix. This eigensystem may be complex (i.e., have imaginary components) due to the asymmetry created by the advection term in the governing advection-dispersion equation. It is shown here that the error due to ignoring the imaginary components of complex eigenvalues is large for small dispersivity values. A new algorithm that represents the complex eigensystem by converting it to a real eigensystem is presented. The method requires only real arithmetic.
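
    The paper's algorithm is not reproduced here, but the underlying fact, that conjugate eigenpairs can be advanced entirely in real arithmetic, is easy to demonstrate with the real Schur form, which packs each pair a ± ib into a real 2x2 block (a generic sketch with an invented coefficient matrix):

        import numpy as np
        from scipy.linalg import expm, schur

        # Toy nonsymmetric coefficient matrix with a complex eigensystem.
        A = np.array([[-2.0,  1.5,  0.0],
                      [-1.5, -2.0,  1.0],
                      [ 0.0,  0.0, -1.0]])
        c0 = np.array([1.0, 0.0, 0.0])  # initial concentrations
        t = 0.7                         # advance directly to this time

        # Reference: semianalytical advance via the complex eigensystem.
        lam, V = np.linalg.eig(A)
        c_cplx = (V @ np.diag(np.exp(lam * t)) @ np.linalg.inv(V) @ c0).real

        # Real arithmetic only: A = Z T Z^T with T real quasi-triangular,
        # conjugate pairs appearing as 2x2 diagonal blocks of T.
        T, Z = schur(A, output="real")
        c_real = Z @ expm(T * t) @ Z.T @ c0

        print(np.allclose(c_cplx, c_real))  # True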

  3. Analyzes Data from Semiconductor Wafers

    Energy Science and Technology Software Center (ESTSC)

    2002-07-23

    This program analyzes reflectance data from semiconductor wafers taken during the deposition or evolution of a thin film, typically via chemical vapor deposition (CVD) or molecular beam epitaxy (MBE). It is used to determine the growth rate and optical constants of the deposited thin films using a virtual interface concept. Determination of growth rates and optical constants of multiple-layer structures is possible by selecting appropriate sections in the reflectance-versus-time waveform. No prior information or estimates of growth rates and material properties are required if an absolute reflectance waveform is used. If the optical constants of a thin film are known, then the growth rate may be extracted from a relative reflectance data set. The analysis is valid for either s- or p-polarized light at any incidence angle and wavelength. The analysis package is contained within an easy-to-use graphical user interface. The program is based on the algorithm described in the following two publications: W.G. Breiland and K.P. Killeen, J. Appl. Phys. 78 (1995) 6726, and W.G. Breiland, H.Q. Hou, B.E. Hammons, and J.F. Klem, Proc. XXVIII SOTAPOCS Symp. Electrochem. Soc., San Diego, May 3-8, 1998. It relies on the fact that any multiple-layer system has a reflectance spectrum that is mathematically equivalent to a single-layer thin film on a virtual substrate. The program fits the thin-film reflectance with five adjustable parameters: (1) growth rate, (2) real part of the complex refractive index, (3) imaginary part of the refractive index, (4) amplitude of the virtual interface reflectance, and (5) phase of the virtual interface reflectance.
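
    The five-parameter fit can be sketched as follows (a hedged illustration with an assumed probe wavelength and synthetic data, following the standard single-film-on-virtual-substrate formula rather than the program's exact code):

        import numpy as np
        from scipy.optimize import curve_fit

        WAVELENGTH = 633e-9  # probe wavelength in meters (assumed)

        def reflectance(t, g, n, k, r_amp, r_phase):
            # Normal-incidence reflectance of a film of thickness g*t on
            # a virtual substrate; the five arguments after t are the five
            # adjustable parameters listed in the abstract.
            N = n + 1j * k
            r1 = (1 - N) / (1 + N)             # ambient/film interface
            rv = r_amp * np.exp(1j * r_phase)  # virtual interface
            ph = np.exp(4j * np.pi * N * g * t / WAVELENGTH)
            r = (r1 + rv * ph) / (1 + r1 * rv * ph)
            return np.abs(r) ** 2

        # Synthesize a waveform, then recover the parameters by least squares.
        t = np.linspace(0.0, 600.0, 400)       # a 10-minute deposition
        true = (0.3e-9, 3.5, 0.05, 0.4, 1.0)   # g (m/s), n, k, amplitude, phase
        rng = np.random.default_rng(2)
        data = reflectance(t, *true) + 1e-4 * rng.normal(size=t.size)

        popt, _ = curve_fit(reflectance, t, data,
                            p0=(0.25e-9, 3.3, 0.03, 0.35, 0.8))
        print(popt)  # approximately the true parameter values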

  4. Method for analyzing signaling networks in complex cellular systems.

    PubMed

    Plavec, Ivan; Sirenko, Oksana; Privat, Sylvie; Wang, Yuker; Dajee, Maya; Melrose, Jennifer; Nakao, Brian; Hytopoulos, Evangelos; Berg, Ellen L; Butcher, Eugene C

    2004-02-01

    Now that the human genome has been sequenced, the challenge of assigning function to human genes has become acute. Existing approaches using microarrays or proteomics frequently generate very large volumes of data not directly related to biological function, making interpretation difficult. Here, we describe a technique for integrative systems biology in which: (i) primary cells are cultured under biologically meaningful conditions; (ii) a limited number of biologically meaningful readouts are measured; and (iii) the results obtained under several different conditions are combined for analysis. Studies of human endothelial cells overexpressing different signaling molecules under multiple inflammatory conditions show that this system can capture a remarkable range of functions by a relatively small number of simple measurements. In particular, measurement of seven different protein levels by ELISA under four different conditions is capable of reconstructing pathway associations of 25 different proteins representing four known signaling pathways, implicating additional participants in the NF-kappaB or RAS/mitogen-activated protein kinase pathways and defining additional interactions between these pathways. PMID:14745015

  5. Method for analyzing signaling networks in complex cellular systems

    PubMed Central

    Plavec, Ivan; Sirenko, Oksana; Privat, Sylvie; Wang, Yuker; Dajee, Maya; Melrose, Jennifer; Nakao, Brian; Hytopoulos, Evangelos; Berg, Ellen L.; Butcher, Eugene C.

    2004-01-01

    Now that the human genome has been sequenced, the challenge of assigning function to human genes has become acute. Existing approaches using microarrays or proteomics frequently generate very large volumes of data not directly related to biological function, making interpretation difficult. Here, we describe a technique for integrative systems biology in which: (i) primary cells are cultured under biologically meaningful conditions; (ii) a limited number of biologically meaningful readouts are measured; and (iii) the results obtained under several different conditions are combined for analysis. Studies of human endothelial cells overexpressing different signaling molecules under multiple inflammatory conditions show that this system can capture a remarkable range of functions by a relatively small number of simple measurements. In particular, measurement of seven different protein levels by ELISA under four different conditions is capable of reconstructing pathway associations of 25 different proteins representing four known signaling pathways, implicating additional participants in the NF-κB or RAS/mitogen-activated protein kinase pathways and defining additional interactions between these pathways. PMID:14745015

  6. TECHNIQUES FOR ANALYZING COMPLEX MIXTURES OF DRINKING WATER DBPS

    EPA Science Inventory

    Although chlorine has been used to disinfect drinking water for approximately 100 years, there have been concerns raised over its use, due to the formation of potentially hazardous by-products. Trihalomethanes (THMs) were the first disinfection by-products (DBPs) identified and ...

  7. An approach to 1,3,4-dioxaphospholane complexes through an acid-induced ring expansion of an oxaphosphirane complex: the problem of construction and deconstruction of O,P-heterocycles.

    PubMed

    Pérez, Janaina Marinas; Helten, Holger; Schnakenburg, Gregor; Streubel, Rainer

    2011-06-01

    Treatment of oxaphosphirane complex 1, triflic acid (TfOH), and various aldehydes yielded 1,3,4-dioxaphospholane complexes 5a,b-7a,b after deprotonation with NEt(3). In addition to NMR spectroscopy, IR spectroscopy, and MS data, the X-ray structures of complexes 5a and 7a were determined. (31)P NMR spectroscopic monitoring and DFT calculations provided insight into the reaction course and revealed the transient TfOH 1,3,4-dioxaphospholanium association complex TfOH-5a,b and/or TfOH-5a,b' as key reactive intermediates. Furthermore, it was observed that the five-membered ring system was cleaved upon warming and yielded side-on (E,Z)-methylenephosphonium complexes 8a,b if deprotonation did not occur at low temperature. Overall, a novel temperature- and acid-dependent construction and deconstruction process of the 1,3,4-dioxaphospholane ring system is described. PMID:21433300

  8. Droplet actuator analyzer with cartridge

    NASA Technical Reports Server (NTRS)

    Smith, Gregory F. (Inventor); Sturmer, Ryan A. (Inventor); Paik, Philip Y. (Inventor); Srinivasan, Vijay (Inventor); Pollack, Michael G. (Inventor); Pamula, Vamsee K. (Inventor); Brafford, Keith R. (Inventor); West, Richard M. (Inventor)

    2011-01-01

    A droplet actuator with cartridge is provided. According to one embodiment, a sample analyzer is provided that includes an analyzer unit comprising electronic or optical receiving means and a cartridge comprising self-contained droplet handling capabilities, wherein the cartridge is coupled to the analyzer unit by a means which aligns electronic and/or optical outputs from the cartridge with the electronic or optical receiving means on the analyzer unit. According to another embodiment, a sample analyzer is provided that includes a cartridge coupled thereto and a means of electrical interface and/or optical interface between the cartridge and the analyzer, whereby electrical signals and/or optical signals may be transmitted from the cartridge to the analyzer.

  9. Soft Decision Analyzer and Method

    NASA Technical Reports Server (NTRS)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2015-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  10. Detector verifier for circuit analyzers

    NASA Technical Reports Server (NTRS)

    Pope, D. L.; Wooters, R. L.

    1980-01-01

    Economical tool checks operation of automatic circuit analyzer. Each loop is addressed directly from the analyzer console by switching the internal analyzer bridge to a resistance equal to that of the connecting cable plus the specified limiting test value. Procedure verifies whether detected faults in the circuit under test are actually due to analyzer malfunction. Standard-length universal test cables make it possible to shift the detector tool from cable to cable without resistance compensation.

  11. L.E.A.D.: A Framework for Evidence Gathering and Use for the Prevention of Obesity and Other Complex Public Health Problems

    ERIC Educational Resources Information Center

    Chatterji, Madhabi; Green, Lawrence W.; Kumanyika, Shiriki

    2014-01-01

    This article summarizes a comprehensive, systems-oriented framework designed to improve the use of a wide variety of evidence sources to address population-wide obesity problems. The L.E.A.D. framework (for "Locate" the evidence, "Evaluate" the evidence, "Assemble" the evidence, and inform "Decisions"),…

  12. Tourette Syndrome: Overview and Classroom Interventions. A Complex Neurobehavioral Disorder Which May Involve Learning Problems, Attention Deficit Hyperactivity Disorder, Obsessive Compulsive Symptoms, and Stereotypical Behaviors.

    ERIC Educational Resources Information Center

    Fisher, Ramona A.; Collins, Edward C.

    Tourette Syndrome is conceptualized as a neurobehavioral disorder, with behavioral aspects that are sometimes difficult for teachers to understand and deal with. The disorder has five layers of complexity: (1) observable multiple motor, vocal, and cognitive tics and sensory involvement; (2) Attention Deficit Hyperactivity Disorder; (3)…

  13. Developing an Approach for Analyzing and Verifying System Communication

    NASA Technical Reports Server (NTRS)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach to analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks. Such systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  14. Six-month outcomes following an emergency hospital admission for older adults with co-morbid mental health problems indicate complexity of care needs

    PubMed Central

    Bradshaw, Lucy E.; Goldberg, Sarah E.; Lewis, Sarah A.; Whittamore, Kathy; Gladman, John R. F.; Jones, Rob G.; Harwood, Rowan H.

    2013-01-01

    Background: two-thirds of older patients admitted as an emergency to a general hospital have co-existing mental health problems including delirium, dementia and depression. This study describes the outcomes of older adults with co-morbid mental health problems after an acute hospital admission. Methods: a follow-up study of 250 patients aged over 70 admitted to 1 of 12 wards (geriatric, medical or orthopaedic) of an English acute general hospital with a co-morbid mental health problem and followed up at 180 days. Results: twenty-seven per cent did not return to their original place of residence after the hospital admission. After 180 days 31% had died, 42% had been readmitted and 24% of community residents had moved to a care home. Only 31% survived without being readmitted or moving to a care home. However, 16% spent >170 of the 180 days at home. Significant predictors for poor outcomes were co-morbidity, nutrition, cognitive function, reduction in activities of daily living ability prior to admission, behavioural and psychiatric problems and depression. Only 42% of survivors recovered to their pre-acute illness level of function. Clinically significant behavioural and psychiatric symptoms were present at follow-up in 71% of survivors with baseline cognitive impairment, and new symptoms developed frequently in this group. Conclusions: the variable, but often adverse, outcomes in this group implies a wide range of health and social care needs. Community and acute services to meet these needs should be anticipated and provided for. PMID:23800454

  15. Problem Periods

    MedlinePlus

    It’s common to have cramps or feel ... Some common period problems ... Signs of period problems ... One way to know if you may ...

  16. Balance Problems

    MedlinePlus

    ... it could be a sign of a balance problem. Balance problems can make you feel unsteady or as if ... related injuries, such as hip fracture. Some balance problems are due to problems in the inner ear. ...

  17. Balance Problems

    MedlinePlus

    ... often, it could be a sign of a balance problem. Balance problems can make you feel unsteady or as ... fall-related injuries, such as hip fracture. Some balance problems are due to problems in the inner ...

  18. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information. Part 2—Application to crosshole GPR tomography

    NASA Astrophysics Data System (ADS)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline; Mosegaard, Klaus

    2013-03-01

    We present an application of the SIPPI Matlab toolbox, to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray based forward models that rely on the high frequency approximation of the wave-equation and 'fat' ray based forward models relying on finite frequency theory. In order to sample the a posteriori probability density function we make use of both least squares based inversion, for linear Gaussian inverse problems, and the extended Metropolis sampler, for non-linear non-Gaussian inverse problems. To illustrate the applicability of the SIPPI toolbox to a tomographic field data set we use a cross-borehole traveltime data set from Arrenæs, Denmark. Both the computer code and the data are released in the public domain using open source and open data licenses. The code has been developed to facilitate inversion of 2D and 3D travel time tomographic data using a wide range of possible a priori models and choices of forward models.
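
    SIPPI itself is a Matlab toolbox; as a language-neutral sketch of the sampling machinery it wraps (an invented toy problem, not the toolbox API), the fragment below runs a Metropolis chain on a linear-Gaussian problem, using a prior-preserving proposal in the spirit of the extended Metropolis sampler, where acceptance depends only on the likelihood ratio.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy linear "tomography": d = G m + noise, Gaussian N(0, I) prior on m.
        n_m, n_d, sigma = 20, 15, 0.1
        G = rng.normal(size=(n_d, n_m))
        m_true = rng.normal(size=n_m)
        d_obs = G @ m_true + sigma * rng.normal(size=n_d)

        def log_likelihood(m):
            r = d_obs - G @ m
            return -0.5 * float(r @ r) / sigma**2

        m, ll, step, samples = np.zeros(n_m), log_likelihood(np.zeros(n_m)), 0.1, []
        for it in range(20000):
            # Proposal that leaves the prior invariant, so the acceptance
            # test needs the likelihood ratio only.
            prop = np.sqrt(1 - step**2) * m + step * rng.normal(size=n_m)
            ll_prop = log_likelihood(prop)
            if np.log(rng.uniform()) < ll_prop - ll:
                m, ll = prop, ll_prop
            if it % 100 == 0:
                samples.append(m.copy())

        post = np.array(samples[50:])  # drop burn-in
        print(np.c_[post.mean(axis=0)[:3], m_true[:3]])  # posterior mean vs. truth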

  19. Interpolation Errors in Spectrum Analyzers

    NASA Technical Reports Server (NTRS)

    Martin, J. L.

    1996-01-01

    To obtain the proper measurement amplitude with a spectrum analyzer, the correct frequency-dependent transducer factor must be added to the voltage measured by the transducer. This report examines how entering transducer factors into a spectrum analyzer can cause significant errors in field amplitude due to the misunderstanding of the analyzer's interpolation methods. It also discusses how to reduce these errors to obtain a more accurate field amplitude reading.
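
    The size of such an error is easy to reproduce (invented calibration numbers): a transducer factor interpolated linearly on a linear frequency axis can differ by about 1 dB from the same factor interpolated over log frequency, which is exactly the kind of discrepancy the report describes.

        import numpy as np

        # Transducer factor known at calibration points, in dB (illustrative).
        f_cal = np.array([30e6, 100e6, 300e6, 1e9])  # Hz
        af_db = np.array([10.0, 16.0, 21.0, 28.0])   # dB

        f = 550e6  # measurement frequency between calibration points

        # Linear interpolation on a linear frequency axis...
        lin = np.interp(f, f_cal, af_db)
        # ...vs. linear-in-dB over log frequency.
        log = np.interp(np.log10(f), np.log10(f_cal), af_db)

        print(f"linear axis: {lin:.2f} dB, log axis: {log:.2f} dB, "
              f"difference: {lin - log:.2f} dB")  # about 1 dB apart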

  20. Going beyond the hero in leadership development: the place of healthcare context, complexity and relationships: Comment on "Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?".

    PubMed

    Ford, Jackie

    2015-04-01

    There remains a conviction that the torrent of publications and the financial outlay on leadership development will create managers with the skills and characters of perfect leaders, capable of guiding healthcare organisations through the challenges and crises of the 21st century. The focus of much attention continues to be the search for the (illusory) core set of heroic qualities, abilities or competencies that will enable the development of leaders to achieve levels of supreme leadership and organisational performance. This brief commentary adds support to McDonald's (1) call for recognition of the complexity of the undertaking. PMID:25844391

  1. Problems of Advocacy.

    ERIC Educational Resources Information Center

    Ross, Robert

    This paper focuses on problems involved in adopting the newer modes of social science and professionalism. The complexities which precede these problems are recognized. It is mentioned that many recent entrants to these fields have made value commitments in their professionalism without very long detention at the portals of objectivity or…

  2. Problem-Based Learning

    ERIC Educational Resources Information Center

    Allen, Deborah E.; Donham, Richard S.; Bernhardt, Stephen A.

    2011-01-01

    In problem-based learning (PBL), students working in collaborative groups learn by resolving complex, realistic problems under the guidance of faculty. There is some evidence of PBL effectiveness in medical school settings where it began, and there are numerous accounts of PBL implementation in various undergraduate contexts, replete with…

  3. The problems associated with the monitoring of complex workplace radiation fields at European high-energy accelerators and thermonuclear fusion facilities.

    PubMed

    Bilski, P; Blomgren, J; d'Errico, F; Esposito, A; Fehrenbacher, G; Fernàndez, F; Fuchs, A; Golnik, N; Lacoste, V; Leuschner, A; Sandri, S; Silari, M; Spurny, F; Wiegel, B; Wright, P

    2007-01-01

    The European Commission is funding within its Sixth Framework Programme a three-year project (2005-2007) called CONRAD, COordinated Network for RAdiation Dosimetry. The organisational framework for this project is provided by the European Radiation Dosimetry Group EURADOS. One task within the CONRAD project, Work Package 6 (WP6), was to provide a report outlining research needs and research activities within Europe to develop new and improved methods and techniques for the characterisation of complex radiation fields at workplaces around high-energy accelerators, but also at the next generation of thermonuclear fusion facilities. The paper provides an overview of the report, which will be available as a CERN Yellow Report. PMID:17496292

  4. The possibility of generating focal regions of complex configurations in application to the problems of stimulation of human receptor structures by focused ultrasound

    NASA Astrophysics Data System (ADS)

    Gavrilov, L. R.

    2008-03-01

    Studies of the stimulating effect of ultrasound on human receptor structures have recently become more intensive in connection with the development of promising robotic techniques and systems, sensors, and automated control systems, as well as with the use of taction in the design of a human-machine interface. One of the promising fields of research is the development of tactile displays for transmission of sensory data to a human by an acoustic method based on the effect of radiation pressure. In this case, it is necessary to generate rapidly changing patterns on a display (symbols, letters, digits, etc.), which may often have a complex shape. It is demonstrated that such patterns can be created by the generation of multiple-focus ultrasonic fields with the help of two-dimensional phased arrays whose elements are randomly positioned on the surface. The parameters for such an array are presented. It is shown that the arrays make it possible to form the regions of action by focused ultrasound with various necessary shapes and the sidelobe (or other secondary peak) intensity level acceptable for practical purposes. Using these arrays, it is possible to move the set of foci off the array axis to a distance of at least ±5 mm, which corresponds to the display dimensions. It is possible, on the screen of a tactile display, to generate the regions of action with a very complex shape, for example, Latin letters. This opportunity may be of interest, for example, for the development of systems that enable a blind person to perceive the displayed text information by using the sense of touch.

  5. Low cost methodologies to analyze and correct abnormal production decline in stripper gas wells

    SciTech Connect

    James, J.; Huck, G.; Knobloch, T.

    2000-07-01

    The goal of this research program is to develop and deliver a procedure guide of low cost methodologies to analyze and correct problems with stripper wells experiencing abnormal production declines. A study group of wells will provide data to determine the historic frequency of the problem of abnormal production declines in stripper gas wells and the historic frequency of the causes of the production problems. Once the most frequently occurring causes of the production problems are determined, data collection forms and decision trees will be designed to cost-effectively diagnose these problems and suggest corrective action. Finally, economic techniques to solve the most frequently occurring problems will be researched and implemented. These systematic methodologies and techniques will increase the efficiency of problem assessment and implementation of solutions for stripper gas wells. This third quarterly technical report was to describe the data reduction and methodologies to develop decision trees, identify cost effective techniques to solve the most frequently experienced problems and then apply the methodology to a group of wells where recent problems have developed. Further, this third quarterly technical report was to describe the data reduction and methodologies to select the two wells with the greatest potential for increase and also having the most frequently occurring problem, and evaluate the results of the methodology and the implemented procedure. However, preparation and analysis of the decision trees is more complex than initially anticipated due to the combination of problems rather than identifiable individual problems. Therefore, this portion of the study is still in progress. We have requested and been granted verbal approval for a six month no cost extension to allow more time to thoroughly investigate this portion of the study. The delivery of the decision trees will be included in future technical reports. Work on the other tasks to be

  6. Nuclear fuel microsphere gamma analyzer

    DOEpatents

    Valentine, Kenneth H.; Long, Jr., Ernest L.; Willey, Melvin G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample at one station; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or sorted according to other properties.

  7. Problem-Based Learning Tools

    ERIC Educational Resources Information Center

    Chin, Christine; Chia, Li-Gek

    2008-01-01

    One way of implementing project-based science (PBS) is to use problem-based learning (PBL), in which students formulate their own problems. These problems are often ill-structured, mirroring complex real-life problems where data are often messy and inconclusive. In this article, the authors describe how they used PBL in a ninth-grade biology class in…

  8. Problem Solving in the Professions.

    ERIC Educational Resources Information Center

    Jackling, Noel; And Others

    1990-01-01

    It is proposed that algorithms and heuristics are useful in improving professional problem-solving abilities when contextualized within the academic discipline. A basic algorithm applied to problem solving in undergraduate engineering education and a similar algorithm applicable to legal problems are used as examples. Problem complexity and…

  9. Market study: Whole blood analyzer

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A market survey was conducted to develop findings relative to the commercialization potential and key market factors of the whole blood analyzer which is being developed in conjunction with NASA's Space Shuttle Medical System.

  10. Molecular wake shield gas analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, J. H.

    1980-01-01

    Techniques for measuring and characterizing the ultrahigh vacuum in the wake of an orbiting spacecraft are studied. A high sensitivity mass spectrometer that contains a double mass analyzer consisting of an open source miniature magnetic sector field neutral gas analyzer and an identical ion analyzer is proposed. These are configured to detect and identify gas and ion species of hydrogen, helium, nitrogen, oxygen, nitric oxide, and carbon dioxide, and any other gas or ion species in the 1 to 46 amu mass range. This range covers the normal atmospheric constituents. The sensitivity of the instrument is sufficient to measure ambient gases and ions with a particle density of the order of one per cc. A chemical pump, or getter, is mounted near the entrance aperture of the neutral gas analyzer which integrates the absorption of ambient gases for a selectable period of time for subsequent release and analysis. The sensitivity is realizable for all but rare gases using this technique.

  11. A Stochastic Employment Problem

    ERIC Educational Resources Information Center

    Wu, Teng

    2013-01-01

    The Stochastic Employment Problem (SEP) is a variation of the Stochastic Assignment Problem which analyzes the scenario in which one assigns balls to boxes. Balls arrive sequentially, each with a binary vector X = (X_1, X_2, ..., X_n) attached, with the interpretation being that if X_i = 1 the ball…

  12. Childbirth Problems

    MedlinePlus

    ... labor starts before 37 completed weeks of pregnancy Problems with the umbilical cord Problems with the position of the baby, such as ... feet first Birth injuries For some of these problems, the baby may need to be delivered surgically ...

  13. Balance Problems

    MedlinePlus

    Have you ever felt dizzy, lightheaded, or ... dizziness problem during the past year. Why Good Balance is Important: Having good balance means being able ...

  14. Non starch polysaccharide hydrolyzing enzymes as feed additives: detection of enzyme activities and problems encountered with quantitative determination in complex samples.

    PubMed

    Vahjen, W; Gläser, K; Froeck, M; Simon, O

    1997-01-01

    Chromogenic substrates, an agar diffusion assay and viscosity reduction were used to estimate beta-glucanase and xylanase activities in water soluble extracts of different feedstuffs and digesta supernatants. The dinitrosalicylic acid reducing sugar method was employed to calibrate results from different methods based on international units (IU, glucose equivalents). The detection of dye release from chromogenic substrates was a suitable method, allowing the detection of 0.05 IU of enzyme activity per ml of extract, although measurements in digesta supernatants were limited in linearity (0.1-0.5 IU/ml supernatant). With the agar diffusion assay the detection of enzyme activity was possible over a wider concentration range (extracts: 0.05-1 IU/ml, digesta supernatants: 0.1-1 IU/ml), but visual evaluation led to inaccurate measurement. Accuracy can be improved by computer-based evaluation of digital images. The use of viscosity reduction produced linear standard curves from 0.01 to 0.5 IU/ml in feed extracts, but reliability of measurements depended on modification of substrates. Quantification of enzyme activities was influenced by matrix effects of complex samples. Cereal-dependent differences were found in various extracts of feed mixtures and cereal extracts. Digesta supernatants partly inhibited enzyme activity, depending on the origin of the sample. Interaction of substrates with digesta components varied between methods. The sensitivity of the methods is comparable; however, all methods require specific calibrations to account for matrix- and enzyme-specific effects. PMID:9345597

  15. Traumatic Brain Injury and Aging: Is a Combination of Progesterone and Vitamin D Hormone a Simple Solution to a Complex Problem?

    PubMed Central

    Cekic, Milos; Stein, Donald G.

    2010-01-01

    Summary Although progress is being made in the development of new clinical treatments for traumatic brain injury (TBI), little is known about whether such treatments are effective in older patients, in whom frailty, prior medical conditions, altered metabolism, and changing sensitivity to medications all can affect outcomes following a brain injury. In this review we consider TBI to be a complex, highly variable, and systemic disorder that may require a new pharmacotherapeutic approach, one using combinations or cocktails of drugs to treat the many components of the injury cascade. We review some recent research on the role of vitamin D hormone and vitamin D deficiency in older subjects, and on the interactions of these factors with progesterone, the only treatment for TBI that has shown clinical effectiveness. Progesterone is now in phase III multicenter trial testing in the United States. We also discuss some of the potential mechanisms and pathways through which the combination of hormones may work, singly and in synergy, to enhance survival and recovery after TBI. PMID:20129500

  16. On-Demand Urine Analyzer

    NASA Technical Reports Server (NTRS)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  17. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, N.E.

    1984-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  18. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, Norman E.

    1985-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  19. Structural qualia: a solution to the hard problem of consciousness

    PubMed Central

    Loorits, Kristjan

    2014-01-01

    The hard problem of consciousness has often been claimed to be unsolvable by the methods of traditional empirical sciences. It has been argued that all the objects of empirical sciences can be fully analyzed in structural terms but that consciousness is (or has) something over and above its structure. However, modern neuroscience has introduced a theoretical framework in which also the apparently non-structural aspects of consciousness, namely the so-called qualia or qualitative properties, can be analyzed in structural terms. That framework allows us to see qualia as something compositional with internal structures that fully determine their qualitative nature. Moreover, those internal structures can be identified with certain neural patterns. Thus consciousness as a whole can be seen as a complex neural pattern that misperceives some of its own highly complex structural properties as monadic and qualitative. Such a neural pattern is analyzable in fully structural terms, and thereby the hard problem is solved. PMID:24672510

  20. Thermal Radiation Analyzer System (TRASYS)

    NASA Technical Reports Server (NTRS)

    Vogt, R. A.

    1993-01-01

    Working alone or with SINDA '85/FLUINT, TRASYS solves radiation components of thermal analysis problems. Calculates both internode radiation exchange and incident and absorbed heat rate due to sunlight. Used in satellite design, program handles situations where one surface wholly or partially shades another from direct sunlight.

  1. An Astronomical Data Analyzing Monitor

    NASA Astrophysics Data System (ADS)

    Teuber, D.

    The need for exchange of programmes and data between astronomical facilities is generally recognized, but practicable concepts concerning its realization are rare. Standardization of data formats through FITS is widely accepted; for (interactive) programs, however, identical hardware configurations seem to be the favoured solution. As an alternative, a software approach to the problem is presented.

  2. Analyzing Media: Metaphors as Methodologies.

    ERIC Educational Resources Information Center

    Meyrowitz, Joshua

    Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…

  3. Systems improved numerical differencing analyzer

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Program solves physical problems governed by diffusion-type equations, provided that equations can be modeled by lumped-parameter representation. Program is used for thermal analysis, and could be adapted to solve Fourier, Poisson, and Laplace differential equations. Program is in FORTRAN IV and Assembler for execution on UNIVAC 1100-series or CYBER 175.
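
    The lumped-parameter representation such routines expect is easy to sketch (a minimal explicit-Euler toy, not SINDA code): each node obeys C_i dT_i/dt = sum_j G_ij (T_j - T_i) + Q_i, with boundary nodes held at fixed temperature.

        import numpy as np

        # Three-node thermal network: node 0 is heated, node 2 is a boundary.
        C = np.array([50.0, 80.0, 120.0])  # capacitances (J/K)
        G = np.array([[0.0, 2.0, 0.0],     # conductances G[i][j] (W/K)
                      [2.0, 0.0, 1.5],
                      [0.0, 1.5, 0.0]])
        Q = np.array([10.0, 0.0, 0.0])     # heat load (W)
        T = np.full(3, 293.15)             # initial temperatures (K)

        dt = 1.0  # s; explicit stability needs dt < min_i C_i / sum_j G_ij
        for _ in range(36000):               # ten hours of simulated time
            net = G @ T - G.sum(axis=1) * T  # sum_j G_ij (T_j - T_i)
            T = T + dt * (net + Q) / C
            T[2] = 293.15                    # boundary node held constant

        print(T)  # steady state: about [304.8, 299.8, 293.15] K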

  4. Real time infrared aerosol analyzer

    DOEpatents

    Johnson, Stanley A.; Reedy, Gerald T.; Kumar, Romesh

    1990-01-01

    Apparatus for analyzing aerosols in essentially real time includes a virtual impactor which separates coarse particles from fine and ultrafine particles in an aerosol sample. The coarse and ultrafine particles are captured in PTFE filters, and the fine particles impact onto an internal light reflection element. The composition and quantity of the particles on the PTFE filter and on the internal reflection element are measured by alternately passing infrared light through the filter and the internal light reflection element, and analyzing the light through infrared spectrophotometry to identify the particles in the sample.

  5. Analyzing epithelial and endothelial kisses in Merida

    PubMed Central

    Nusrat, Asma; Quiros, Miguel; González-Mariscal, Lorenza

    2013-01-01

    Last November a group of principal investigators, postdoctoral fellows and PhD students from around the world got together in the city of Merida in southeastern Mexico for a state-of-the-art meeting on the "Molecular structure and function of the apical junctional complex in epithelia and endothelia." They analyzed diverse tissue barriers, including those in the gastrointestinal tract, the blood-brain barrier, and the blood-neural and blood-retinal barriers. The talks revealed exciting new findings in the field, novel technical approaches and unpublished data, and highlighted the importance of studying junctional complexes to better understand the pathogenesis of several diseases and to develop therapeutic approaches that can be utilized for drug delivery. This meeting report has the purpose of highlighting the results and advances discussed by the speakers at the Merida meeting.

  6. Six Questions on Complex Systems

    NASA Astrophysics Data System (ADS)

    Symons, John F.; Sanayei, Ali

    2011-09-01

    This paper includes an interview with John F. Symons regarding some important questions in "complex systems" and "complexity". In addition, he states some important open problems concerning complex systems in his research area from a philosophical point of view.

  7. Nonlinear single-spin spectrum analyzer.

    PubMed

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compare the performance of equidistant vs Uhrig modulation schemes for spectral analysis. PMID:25166519

  8. Using SCR methods to analyze requirements documentation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic application of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the OC (Operations Concept) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.
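
    The kind of defect such mode-chart modelling exposes can be illustrated with a small, hypothetical check (the modes, events and transitions below are invented, not from the ECS documents): flag nondeterministic transitions and mode/event pairs with no transition defined.

        # Hypothetical SCR-style mode-chart consistency check.
        from collections import defaultdict

        MODES = {"Standby", "Acquiring", "Processing"}
        EVENTS = {"start", "data_ready", "reset"}

        transitions = [                  # (current mode, event, next mode)
            ("Standby",    "start",      "Acquiring"),
            ("Acquiring",  "data_ready", "Processing"),
            ("Acquiring",  "reset",      "Standby"),
            ("Processing", "reset",      "Standby"),
            ("Processing", "reset",      "Acquiring"),   # deliberate conflict
        ]

        table = defaultdict(list)
        for mode, event, nxt in transitions:
            table[(mode, event)].append(nxt)

        for key, targets in table.items():               # nondeterminism check
            if len(targets) > 1:
                print("nondeterministic:", key, "->", targets)
        for mode in MODES:                               # completeness check
            for event in EVENTS:
                if (mode, event) not in table:
                    print("no transition defined for", (mode, event))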

  9. Nonlinear Single-Spin Spectrum Analyzer

    NASA Astrophysics Data System (ADS)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-01

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compare the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  10. Pollution Analyzing and Monitoring Instruments.

    ERIC Educational Resources Information Center

    1972

    Compiled in this book is basic, technical information useful in a systems approach to pollution control. Descriptions and specifications are given of what is available in ready made, on-the-line commercial equipment for sampling, monitoring, measuring and continuously analyzing the multitudinous types of pollutants found in the air, water, soil,…

  11. Therapy Talk: Analyzing Therapeutic Discourse

    ERIC Educational Resources Information Center

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  12. Software-Design-Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  13. Strategies for Analyzing Tone Languages

    ERIC Educational Resources Information Center

    Coupe, Alexander R.

    2014-01-01

    This paper outlines a method of auditory and acoustic analysis for determining the tonemes of a language starting from scratch, drawing on the author's experience of recording and analyzing tone languages of north-east India. The methodology is applied to a preliminary analysis of tone in the Thang dialect of Khiamniungan, a virtually undocumented…

  14. Microcomputer to Multichannel Analyzer Interface.

    ERIC Educational Resources Information Center

    Metz, Roger N.

    1982-01-01

    Describes a microcomputer-based multichannel analyzer (MCA) in which the front end is connected to a microcomputer through a custom interface. Thus an MCA System of 1024 channel resolution, programmable in Basic rather than in machine language and having moderate cost, is achieved. (Author/SK)

  15. Helping Students Analyze Business Documents.

    ERIC Educational Resources Information Center

    Devet, Bonnie

    2001-01-01

    Notes that student writers gain greater insight into the importance of audience by analyzing business documents. Discusses how business writing teachers can help students understand the rhetorical refinements of writing to an audience. Presents an assignment designed to lead writers systematically through an analysis of two advertisements. (SG)

  16. Development of a platform for computing complex chemical equilibria and its adaptation to electrochemical and constrained-equilibrium problems

    NASA Astrophysics Data System (ADS)

    Neron, Alex

    With the environment now a global issue, energy efficiency is taking an increasingly important place for companies, both economically and for the company's image. As a result, energy technology is a research area in which the number of ongoing projects keeps multiplying. One problem that frequently arises in some companies is measuring the composition of materials under conditions that are difficult to access. This is the case, for example, of aluminum electrolysis, which takes place at very high temperatures. To work around this problem, mathematical models must be created and validated to compute the equilibrium composition and properties of the chemical system. The overall objective of the research project is thus to develop a tool for computing complex chemical equilibria (several reactions and several phases) and to adapt it to electrochemical and constrained-equilibrium problems. More specifically, the computation platform must take into account the temperature change due to a gain or loss of energy by the system. It must also consider the limitation of the equilibrium due to a reaction rate and, finally, solve electrochemical equilibrium problems. To achieve this, thermodynamic properties such as the Gibbs free energy, fugacity and activity are first studied in order to better understand the molecular interactions that govern chemical equilibria. Next, an energy balance is added to the computation platform, which makes it possible to compute the temperature at which the system is most stable given an initial temperature and an amount of energy exchanged. Then, a kinetic constraint is added to the system in order to compute pseudo-stationary equilibria evolving over time. In addition, the…
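
    The core computation described, equilibrium composition by constrained minimization of the Gibbs free energy, can be sketched for an ideal-gas mixture at fixed temperature and reference pressure; the species, potentials and element balances below are invented placeholders, not data from the thesis.

        # Gibbs-minimization sketch: minimize G(n) subject to element balance A n = b.
        import numpy as np
        from scipy.optimize import minimize

        R, T = 8.314, 1000.0                     # J/(mol K), K
        mu0 = np.array([-50e3, -100e3, -150e3])  # standard chemical potentials [J/mol] (assumed)
        A = np.array([[1, 0, 1],                 # element-balance matrix (assumed)
                      [0, 1, 2]])
        b = np.array([1.0, 2.0])                 # total moles of each element (assumed)

        def gibbs(n):
            n = np.maximum(n, 1e-12)             # keep the logarithms defined
            # Ideal gas at the reference pressure, so only mole-fraction terms remain
            return np.sum(n * (mu0 + R * T * np.log(n / n.sum())))

        res = minimize(gibbs, x0=np.array([0.3, 0.3, 0.3]), method="SLSQP",
                       bounds=[(1e-12, None)] * 3,
                       constraints={"type": "eq", "fun": lambda n: A @ n - b})
        print("equilibrium mole numbers:", res.x.round(4))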

  17. The Statistical Loop Analyzer (SLA)

    NASA Technical Reports Server (NTRS)

    Lindsey, W. C.

    1985-01-01

    The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements, the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal drop-out tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.
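
    One of the SLA's measurements, the cycle slip rate, can be emulated on a sampled phase-error record: a slip appears as a jump of about 2*pi between samples. A hedged sketch with a synthetic record (the instrument itself is hardware; everything here is invented):

        # Count cycle slips in a phase-error record and report a slip rate.
        import numpy as np

        fs = 1000.0                                    # sample rate [Hz] (assumed)
        t = np.arange(0, 10.0, 1.0 / fs)
        phase_err = 0.2 * np.random.randn(t.size)      # tracking jitter [rad]
        phase_err[3000:] += 2 * np.pi                  # injected slip at t = 3 s
        phase_err[7000:] -= 2 * np.pi                  # injected slip at t = 7 s

        jumps = np.diff(phase_err)
        slips = np.abs(jumps) > np.pi                  # a 2*pi step exceeds this threshold
        print("cycle slips detected:", int(slips.sum()))
        print("slip rate: %.2f slips/s" % (slips.sum() / t[-1]))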

  18. DEEP WATER ISOTOPIC CURRENT ANALYZER

    DOEpatents

    Johnston, W.H.

    1964-04-21

    A deepwater isotopic current analyzer, which employs radioactive isotopes for measurement of ocean currents at various levels beneath the sea, is described. The apparatus, which can determine the direction and velocity of liquid currents, comprises a shaft having a plurality of radiation detectors extending equidistant radially therefrom, means for releasing radioactive isotopes from the shaft, and means for determining the time required for the isotope to reach a particular detector. (AEC)
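
    The arithmetic behind the patent's measurement is simple: the detector that responds gives the current's direction, and the transit time over the known shaft-to-detector distance gives its speed. A toy illustration with invented numbers:

        # Direction from the responding detector, speed from distance over time.
        detector_bearing_deg = 135.0    # bearing of the detector that fired (assumed)
        radial_distance_m = 2.0         # shaft-to-detector distance (assumed)
        transit_time_s = 40.0           # release-to-detection time (assumed)

        speed = radial_distance_m / transit_time_s
        print("current: %.0f deg at %.3f m/s" % (detector_bearing_deg, speed))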

  19. Computational complexity in entanglement transformations

    NASA Astrophysics Data System (ADS)

    Chitambar, Eric A.

    In physics, systems having three parts are typically much more difficult to analyze than those having just two. Even in classical mechanics, predicting the motion of three interacting celestial bodies remains an insurmountable challenge while the analogous two-body problem has an elementary solution. It is as if just by adding a third party, a fundamental change occurs in the structure of the problem that renders it unsolvable. In this thesis, we demonstrate how such an effect is likewise present in the theory of quantum entanglement. In fact, the complexity differences between two-party and three-party entanglement become quite conspicuous when comparing the difficulty in deciding what state changes are possible for these systems when no additional entanglement is consumed in the transformation process. We examine this entanglement transformation question and its variants in the language of computational complexity theory, a powerful subject that formalizes the concept of problem difficulty. Since deciding feasibility of a specified bipartite transformation is relatively easy, this task belongs to the complexity class P. On the other hand, for tripartite systems, we find the problem to be NP-hard, meaning that its solution is at least as hard as the solution to some of the most difficult problems humans have encountered. One can then rigorously defend the assertion that a fundamental complexity difference exists between bipartite and tripartite entanglement since unlike the former, the full range of forms realizable by the latter is incalculable (assuming P≠NP). However, similar to the three-body celestial problem, when one examines a special subclass of the problem (invertible transformations on systems having at least one qubit subsystem), we prove that the problem can be solved efficiently. As a hybrid of the two questions, we find that the question of tripartite to bipartite transformations can be solved by an efficient randomized algorithm. Our results are…

  20. Analyzing ion distributions around DNA.

    PubMed

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach is presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882
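
    The binning-to-molarity step can be sketched in a simplified geometry, idealizing the helical axis as a straight z-axis rather than using the paper's curvilinear helicoidal coordinates; the ion coordinates below are fake stand-ins for simulation snapshots.

        # Map ion positions to cylindrical shells around the axis and convert
        # average occupancy to molarity (1 cubic Angstrom = 1e-27 L).
        import numpy as np

        AVOGADRO = 6.022e23
        n_frames = 1000                                         # snapshots pooled (assumed)
        ions = np.random.uniform(-20, 20, size=(100 * n_frames, 3))  # fake positions [A]

        r = np.hypot(ions[:, 0], ions[:, 1])                    # radial distance from axis [A]
        z_span = 40.0                                           # axial length analyzed [A]
        edges = np.arange(0.0, 20.0, 2.0)                       # radial shell edges [A]
        counts, _ = np.histogram(r, bins=edges)

        vol_L = np.pi * (edges[1:]**2 - edges[:-1]**2) * z_span * 1e-27
        molarity = counts / n_frames / AVOGADRO / vol_L         # avg ions per frame -> mol/L
        for r1, r2, m in zip(edges[:-1], edges[1:], molarity):
            print("r = %4.1f-%4.1f A: %6.3f M" % (r1, r2, m))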

  1. Remote Laser Diffraction PSD Analyzer

    SciTech Connect

    T. A. Batcheller; G. M. Huestis; S. M. Bolton

    2000-06-01

    Particle size distribution (PSD) analysis of radioactive slurry samples was performed using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not achievable with the traditional methods used previously, making this technology far superior to them. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  2. Remote Laser Diffraction PSD Analyzer

    SciTech Connect

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analysis of radioactive slurry samples was performed using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not achievable with the traditional methods used previously, making this technology far superior to them. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  3. Infrared analyzers for process measurements

    NASA Astrophysics Data System (ADS)

    Hyvarinen, Timo S.; Lammasniemi, Jorma; Malinen, Jouko; Niemela, Pentti; Tenhunen, Jussi

    1993-01-01

    Optical analysis techniques, with infrared spectroscopy at the front end, are rapidly finding new applications in process control. This progress is accelerated by the development of more rugged instrument constructions. This paper describes two analyzer techniques developed especially for use in demanding environments. First, the integrated multichannel detector technique is suitable for applications where the measurement can be accomplished using 2 to 4 wavelengths. This technique has been used to construct several compact, portable and battery-operated IR analyzers, as well as process analyzers that measure at each wavelength exactly simultaneously, resulting in very high tolerance against rapid changes and flow of the process stream. Secondly, a miniaturized Fourier transform infrared (FTIR) spectrometer is being developed for use as an OEM module in specific process and laboratory instruments. Special attention has been paid to increasing the resistance of the FTIR technique to ambient vibrations. The module contains integrated digital signal processing electronics for intelligent control of the spectrometer and for fast real-time spectral data treatment. Application studies include on-line measurement of the concentrations of dissolved and colloidal organic detrimental substances, especially pitch components, in the circulating waters of the paper machine wet end.
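
    The few-wavelength measurement principle can be sketched with the Beer-Lambert law: absorbances at a handful of wavelengths are a linear mix of component concentrations, recoverable by least squares. All coefficients below are invented, not instrument data.

        # A = E @ c  =>  solve for concentrations c from measured absorbances A.
        import numpy as np

        E = np.array([[0.90, 0.10],     # absorptivity * path length for 2 components
                      [0.60, 0.40],     # at 4 wavelengths (assumed values)
                      [0.20, 0.80],
                      [0.05, 0.95]])
        c_true = np.array([1.2, 0.4])
        A = E @ c_true + 0.005 * np.random.randn(4)    # measurement with noise

        c_est, *_ = np.linalg.lstsq(E, A, rcond=None)
        print("estimated concentrations:", c_est.round(3))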

  4. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    PubMed

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists, and probably the most crucial one, is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  5. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    PubMed Central

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
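
    The study demonstrates the procedure in R; a hedged Python transliteration of the same split/analyze/meta-analyze idea, using fixed-effect (inverse-variance) pooling of Fisher-z correlations, looks like this:

        # Split a large dataset, analyze each split, then meta-analyze the results.
        import numpy as np

        rng = np.random.default_rng(0)
        N, k = 1_000_000, 10                       # total rows, number of splits
        x = rng.normal(size=N)
        y = 0.3 * x + rng.normal(size=N)           # population correlation ~ 0.29

        zs, ws = [], []
        for xs, ys in zip(np.array_split(x, k), np.array_split(y, k)):
            r = np.corrcoef(xs, ys)[0, 1]          # analyze: correlation per split
            zs.append(np.arctanh(r))               # Fisher z transform
            ws.append(len(xs) - 3)                 # weight = inverse variance = n - 3

        z_pooled = np.average(zs, weights=ws)      # meta-analyze: weighted mean
        print("pooled correlation: %.4f" % np.tanh(z_pooled))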

  6. Walking Problems

    MedlinePlus

    ... daily activities, get around, and exercise. Having a problem with walking can make daily life more difficult. ... walk is called your gait. A variety of problems can cause an abnormal gait and lead to ...

  7. Breathing Problems

    MedlinePlus

    ... re not getting enough air. Sometimes mild breathing problems are from a stuffy nose or hard exercise. ... emphysema or pneumonia cause breathing difficulties. So can problems with your trachea or bronchi, which are part ...

  8. Erection problems

    MedlinePlus

    ... cord injury In some cases, your emotions or relationship problems can lead to ED, such as: Poor ... you stressed, depressed, or anxious? Are you having relationship problems? You may have a number of different ...

  9. Joint Problems

    MedlinePlus

    ... ankles and toes. Other types of arthritis include gout or pseudogout. Sometimes, there is a mechanical problem ... for more information on osteoarthritis, rheumatoid arthritis and gout. How Common are Joint Problems? Osteoarthritis, which affects ...

  10. ITK and ANALYZE: a synergistic integration

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Robb, Richard A.

    2004-05-01

    The Insight Toolkit (ITK) is a C++ open-source software toolkit developed under sponsorship of the National Library of Medicine. It provides advanced algorithms for performing image registration and segmentation, but does not provide support for visualization and analysis, nor does it offer any graphical user interface (GUI). The purpose of this integration project is to make ITK readily accessible to end-users with little or no programming skills, and provide interactive processing, visualization and measurement capabilities. This is achieved through the integration of ITK with ANALYZE, a multi-dimension image visualization/analysis application installed in over 300 institutions around the world, with a user-base in excess of 4000. This integration is carried out at both the software foundation and GUI levels. The foundation technology upon which ANALYZE is built is a comprehensive C-function library called AVW. A new set of AVW-ITK functions have been developed and integrated into the AVW library, and four new ITK modules have been added to the ANALYZE interface. Since ITK is a software developer's toolkit, the only way to access its intrinsic power is to write programs that incorporate it. Integrating ITK with ANALYZE opens the ITK algorithms to end-users who otherwise might never be able to take advantage of the toolkit's advanced functionality. In addition, this integration provides end-to-end interactive problem solving capabilities which allow all users, including programmers, an integrated system to readily display and quantitatively evaluate the results from the segmentation and registration routines in ITK, regardless of the type or format of input images, which are comprehensively supported in ANALYZE.

  11. Analyzing Ramp Compression Wave Experiments

    NASA Astrophysics Data System (ADS)

    Hayes, D. B.

    2007-12-01

    Isentropic compression of a solid to hundreds of GPa by a ramped, planar compression wave allows measurement of material properties at high strain and at modest temperature. Introduction of a measurement plane disturbs the flow, requiring special analysis techniques. If the measurement interface is windowed, the unsteady nature of the wave in the window requires special treatment. When the flow is hyperbolic, the equations of motion can be integrated backward in space in the sample to a region undisturbed by the interface interactions, fully accounting for these unwanted interactions. For more complex materials like hysteretic elastic/plastic solids or phase changing material, hybrid analysis techniques are required.

  12. The complex structured singular value

    NASA Technical Reports Server (NTRS)

    Packard, A.; Doyle, J.

    1993-01-01

    A tutorial introduction to the complex structured singular value (mu) is presented, with an emphasis on the mathematical aspects of mu. The mu-based methods discussed here have been useful for analyzing the performance and robustness properties of linear feedback systems. Several tests for robust stability and performance with computable bounds for transfer functions and their state space realizations are compared, and a simple synthesis problem is studied. Uncertain systems are represented using linear fractional transformations which naturally unify the frequency-domain and state space methods.
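
    For reference, the standard definition underlying such analyses (stated here from the general robust-control literature, not quoted from this paper) is, in LaTeX notation:

        \mu_{\mathbf{\Delta}}(M) \;=\; \Big( \min \{\, \bar{\sigma}(\Delta) \;:\; \Delta \in \mathbf{\Delta},\ \det(I - M\Delta) = 0 \,\} \Big)^{-1}

    with \mu_{\mathbf{\Delta}}(M) = 0 if no \Delta \in \mathbf{\Delta} makes I - M\Delta singular; robust stability then corresponds to keeping \mu below 1 over frequency.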

  13. Complex Langevin method: When can it be trusted?

    SciTech Connect

    Aarts, Gert; Seiler, Erhard; Stamatescu, Ion-Olimpiu

    2010-03-01

    We analyze to what extent the complex Langevin method, which is in principle capable of solving the so-called sign problem, can be considered as reliable. We give a formal derivation of the correctness and then point out various mathematical loopholes. The detailed study of some simple examples leads to practical suggestions about the application of the method.
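
    The method's behavior is easy to exhibit on a toy model where the exact answer is known. For the Gaussian action S(x) = sigma x^2 / 2 with complex sigma (Re sigma > 0), <x^2> = 1/sigma, and complex Langevin reproduces it; this sketch (an illustration, not an example from the paper) complexifies the variable and drives it with real noise:

        # Complex Langevin for S(x) = 0.5*sigma*x^2; drift = -dS/dz = -sigma*z.
        import numpy as np

        sigma = 1.0 + 1.0j                  # complex coupling: a toy "sign problem"
        eps, n_steps, burn = 1e-3, 500_000, 50_000
        rng = np.random.default_rng(1)

        z, acc = 0.0 + 0.0j, 0.0 + 0.0j
        for step in range(n_steps):
            z = z - eps * sigma * z + np.sqrt(2 * eps) * rng.normal()
            if step >= burn:
                acc += z * z

        print("CL estimate of <x^2>:", acc / (n_steps - burn))
        print("exact 1/sigma       :", 1 / sigma)

    Comparing the sampled moment against the exact value is the kind of reliability check the paper puts on a formal footing.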

  14. The Aqueduct Global Flood Analyzer

    NASA Astrophysics Data System (ADS)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1-in-100-year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  15. Truck acoustic data analyzer system

    DOEpatents

    Haynes, Howard D.; Akerman, Alfred; Ayers, Curtis W.

    2006-07-04

    A passive vehicle acoustic data analyzer system having at least one microphone disposed in the acoustic field of a moving vehicle and a computer in electronic communication with the microphone(s). The computer detects and measures the frequency shift in the acoustic signature emitted by the vehicle as it approaches and passes the microphone(s). The acoustic signature of a truck driving by a microphone can provide enough information to estimate the truck speed in miles-per-hour (mph), engine speed in revolutions-per-minute (RPM), turbocharger speed in RPM, and vehicle weight.
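
    The speed estimate reduces to the classic Doppler relation: a tone at source frequency f0 is heard as f0*c/(c - v) on approach and f0*c/(c + v) on recession, so the two observed frequencies alone determine v. A sketch with invented numbers:

        # v = c * (f_approach - f_recede) / (f_approach + f_recede)
        c = 343.0                      # speed of sound in air [m/s] at ~20 C
        f_approach = 118.6             # observed tone while approaching [Hz] (assumed)
        f_recede = 102.5               # same tone while receding [Hz] (assumed)

        v = c * (f_approach - f_recede) / (f_approach + f_recede)
        print("estimated speed: %.1f m/s (%.1f mph)" % (v, v * 2.23694))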

  16. Trace Gas Analyzer (TGA) program

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The design, fabrication, and test of a breadboard trace gas analyzer (TGA) is documented. The TGA is a gas chromatograph/mass spectrometer system. The gas chromatograph subsystem employs a recirculating hydrogen carrier gas. The recirculation feature minimizes the requirement for transport and storage of large volumes of carrier gas during a mission. The silver-palladium hydrogen separator which permits the removal of the carrier gas and its reuse also decreases vacuum requirements for the mass spectrometer since the mass spectrometer vacuum system need handle only the very low sample pressure, not sample plus carrier. System performance was evaluated with a representative group of compounds.

  17. Charged particle mobility refrigerant analyzer

    DOEpatents

    Allman, Steve L.; Chen, Chung-Hsuan; Chen, Fang C.

    1993-01-01

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.
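
    Numerically, the mobility measured here is drift velocity per unit field, obtained from the swarm's arrival time over a known drift distance. A toy calculation with invented values:

        # mobility = (distance / time) / field   [m^2 / (V s)]
        drift_distance_m = 0.05        # drift length in the chamber (assumed)
        arrival_time_s = 2.5e-3        # measured swarm arrival time (assumed)
        field_V_per_m = 1.0e4          # applied field strength (assumed)

        velocity = drift_distance_m / arrival_time_s
        mobility = velocity / field_V_per_m
        print("drift velocity: %.1f m/s, mobility: %.2e m^2/(V s)" % (velocity, mobility))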

  18. Charged particle mobility refrigerant analyzer

    DOEpatents

    Allman, S.L.; Chunghsuan Chen; Chen, F.C.

    1993-02-02

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.

  19. The OpenSHMEM Analyzer

    SciTech Connect

    Hernandez, Oscar

    2014-07-30

    The OpenSHMEM Analyzer is a compiler-based tool that can help users detect errors and provide useful analyses about their OpenSHMEM applications. The tool is built on top of the OpenUH compiler (a branch of Open64 compiler) and presents OpenSHMEM information as feedback to the user. Some of the analyses it provides include checks for correct usage of symmetric variables in OpenSHMEM calls, out-of-bounds checks for symmetric data, checks for the correct initialization of pointers to symmetric data, and symmetric data alias information.

  20. Method for analyzing microbial communities

    DOEpatents

    Zhou, Jizhong [Oak Ridge, TN; Wu, Liyou [Oak Ridge, TN

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.