Science.gov

Sample records for analyzing complex problems

  1. Analyzing the many skills involved in solving complex physics problems

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Wieman, Carl E.

    2015-05-01

    We have empirically identified over 40 distinct sub-skills that affect a person's ability to solve complex problems in many different contexts. The identification of so many sub-skills explains why it has been so difficult to teach or assess problem solving as a single skill. The existence of these sub-skills is supported by several studies comparing a wide range of individuals' strengths and weaknesses in these sub-skills, their "problem solving fingerprint," while solving different types of problems including a classical mechanics problem, quantum mechanics problems, and a complex trip-planning problem with no physics. We see clear differences in the problem solving fingerprint of physics and engineering majors compared to the elementary education majors that we tested. The implications of these findings for guiding the teaching and assessing of problem solving in physics instruction are discussed.

  2. The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems

    ERIC Educational Resources Information Center

    Andrews, Paul W.; Thomson, J. Anderson, Jr.

    2009-01-01

    Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis, and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…

  3. The bright side of being blue: depression as an adaptation for analyzing complex problems.

    PubMed

    Andrews, Paul W; Thomson, J Anderson

    2009-07-01

Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis, and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem solving. The analytical rumination hypothesis proposes that depression is an evolved response to complex problems, whose function is to minimize disruption and sustain analysis of those problems by (a) giving the triggering problem prioritized access to processing resources, (b) reducing the desire to engage in distracting activities (anhedonia), and (c) producing psychomotor changes that reduce exposure to distracting stimuli. As processing resources are limited, sustained analysis of the triggering problem reduces the ability to concentrate on other things. The hypothesis is supported by evidence from many levels: genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition, behavior, and efficacy of treatments. In addition, the hypothesis provides explanations for puzzling findings in the depression literature, challenges the belief that serotonin transmission is low in depression, and has implications for treatment.

  4. The bright side of being blue: Depression as an adaptation for analyzing complex problems

    PubMed Central

    Andrews, Paul W.; Thomson, J. Anderson

    2009-01-01

    Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression. PMID:19618990

  5. Analyzing and Solving Productivity Problems.

    ERIC Educational Resources Information Center

    Walsh, David S.; Johnson, Thomas J.

    1980-01-01

    The authors discuss ways to define a company's position on productivity, and explain productivity concepts. They describe a problem cause/solution set matrix with which to identify accurately the most probable cause of productivity problems. (SK)

  6. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

According to Complexity Theory, complexity arises in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory on the basis of Axiomatic Design, but they focus on reducing complexity; none offers a method for analyzing the complexity in a system. This paper therefore puts forth a method of complexity analysis intended to fill that gap. To frame the method, which is based on additional effect, the paper introduces two concepts: ideal effect and additional effect. The method combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ) and helps designers analyze complexity through additional effects. A case study shows the application of the process.

  7. Analyzing, solving offshore seawater injection problems

    SciTech Connect

    Al-Rubale, J.S.; Muhsin, A.A.; Shaker, H.A.; Washash, I.

    1988-01-01

Changes in seawater treatment, necessary cleaning of injection lines, and modifying well completion practices have reduced injection well plugging on pressure maintenance projects operated by Abu Dhabi Marine Operating Co. (Adma-Opco) in Zakum and Umm Shaif fields, offshore Abu Dhabi, in the Arabian Gulf. Plugging was caused primarily by iron sulfide and corrosion products that were displaced down hole after being formed in the water distribution system. These materials, in turn, resulted from O2 inadvertently entering the injection system, where it combined with corrosive H2S generated by sulfate-reducing bacteria. The problem was further compounded by debris peeling from the interior of well tubulars, a high solids content of brine used to complete injectors, and slime formation in injection pipe lines. Acidizing wells proved a quick method for partially restoring injectivity, but a continuing concerted effort is being made to achieve more permanent results by eliminating the O2 and H2S, which are at the root of the difficulty.

  8. Analyzing and Detecting Problems in Systems of Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Ackermann, Christopher; Stratton, William C.; Sibol, Deane E.; Godfrey, Sally

    2008-01-01

    Many software systems are evolving complex system of systems (SoS) for which inter-system communication is mission-critical. Evidence indicates that transmission failures and performance issues are not uncommon occurrences. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach addressing such problems. In this paper, we are presenting an approach for analyzing inter-system communications with the goal to uncover both transmission errors and performance problems. Our approach consists of a visualization and an evaluation component. While the visualization of the observed communication aims to facilitate understanding, the evaluation component automatically checks the conformance of an observed communication (actual) to a desired one (planned). The actual and the planned are represented as sequence diagrams. The evaluation algorithm checks the conformance of the actual to the planned diagram. We have applied our approach to the communication of aerospace systems and were successful in detecting and resolving even subtle and long existing transmission problems.

  9. IOSIE: A Method for Analyzing Student Behavioral Problems

    ERIC Educational Resources Information Center

    Scarpaci, Richard T.

    2007-01-01

    The author argues for a rational method to analyze behavior problems and proposes a method of identifying the problem, the objectives to be achieved, the solution, the implementation, and the evaluation (IOSIE) as a practical approach to assist teachers in resolving most classroom behavior management problems. The approach draws heavily on…

  10. Software Analyzes Complex Systems in Real Time

    NASA Technical Reports Server (NTRS)

    2008-01-01

Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real-time and limited-hardware environments, and to be utilized by non-expert-system applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Neptune Voyager encounter. It is also finding applications outside of the Space Agency.

  11. Quantum Computing: Solving Complex Problems

    ScienceCinema

DiVincenzo, David [IBM Watson Research Center]

    2016-07-12

    One of the motivating ideas of quantum computation was that there could be a new kind of machine that would solve hard problems in quantum mechanics. There has been significant progress towards the experimental realization of these machines (which I will review), but there are still many questions about how such a machine could solve computational problems of interest in quantum physics. New categorizations of the complexity of computational problems have now been invented to describe quantum simulation. The bad news is that some of these problems are believed to be intractable even on a quantum computer, falling into a quantum analog of the NP class. The good news is that there are many other new classifications of tractability that may apply to several situations of physical interest.

  12. Analyzing the Origins of Childhood Externalizing Behavioral Problems

    ERIC Educational Resources Information Center

    Barnes, J. C.; Boutwell, Brian B.; Beaver, Kevin M.; Gibson, Chris L.

    2013-01-01

    Drawing on a sample of twin children from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B; Snow et al., 2009), the current study analyzed 2 of the most prominent predictors of externalizing behavioral problems (EBP) in children: (a) parental use of spankings and (b) childhood self-regulation. A variety of statistical techniques were…

  13. Analyzing the problem of falls among older people

    PubMed Central

    Dionyssiotis, Yannis

    2012-01-01

    Falls are a serious problem facing the elderly. The prevention of falls that contribute to disability, mainly in elderly people, is an important issue. Ensuring the greatest possible functionality for elderly people is an important element in the prevention of disability. This paper analyzes the importance of falls, risk factors for falls, and interventions to prevent falls. Recent publications as well as research regarding the prevention and rehabilitation for falls are reviewed. PMID:23055770

  14. Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map

    ERIC Educational Resources Information Center

    Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng

    2004-01-01

    This paper proposes a methodology to calculate both the difficulty of the basic problems and the difficulty of solving a problem. The method to calculate the difficulty of problem is according to the process of constructing a problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…

  15. Analyzing complex networks evolution through Information Theory quantifiers

    NASA Astrophysics Data System (ADS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez

    2011-01-01

A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
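
    The first of these quantifiers is easy to reproduce with standard tools. Below is a minimal numpy sketch of the square root of the Jensen-Shannon divergence applied to two hypothetical degree histograms; the MPR Statistical Complexity and the construction of the networks themselves are not shown, and the example data are invented for illustration.

      import numpy as np

      def js_sqrt(p, q):
          """Square root of the Jensen-Shannon divergence (a metric)
          between two discrete probability distributions p and q."""
          p = np.asarray(p, float) / np.sum(p)
          q = np.asarray(q, float) / np.sum(q)
          m = 0.5 * (p + q)

          def kl(a, b):
              nz = a > 0  # 0 * log(0) is taken as 0
              return np.sum(a[nz] * np.log2(a[nz] / b[nz]))

          return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

      # Hypothetical degree histograms of two snapshots of an evolving network
      print(js_sqrt([0.10, 0.40, 0.30, 0.20], [0.05, 0.25, 0.40, 0.30]))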

  16. Complex Problem Solving in a Workplace Setting.

    ERIC Educational Resources Information Center

    Middleton, Howard

    2002-01-01

    Studied complex problem solving in the hospitality industry through interviews with six office staff members and managers. Findings show it is possible to construct a taxonomy of problem types and that the most common approach can be termed "trial and error." (SLD)

  17. The Process of Solving Complex Problems

    ERIC Educational Resources Information Center

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  18. Analyzing patterns in experts' approaches to solving experimental problems

    NASA Astrophysics Data System (ADS)

    Čančula, Maja Poklinek; Planinšič, Gorazd; Etkina, Eugenia

    2015-04-01

    We report detailed observations of three pairs of expert scientists and a pair of advanced undergraduate students solving an experimental optics problem. Using a new method ("transition graphs") of visualizing sequences of logical steps, we were able to compare the groups and identify patterns that could not be found using previously existing methods. While the problem solving of undergraduates significantly differed from that of experts at the beginning of the process, it gradually became more similar to the expert problem solving. We mapped problem solving steps and their sequence to the elements of an approach to teaching and learning physics called Investigative Science Learning Environment (ISLE), and we speculate that the ISLE educational framework closely represents the actual work of physicists.

  19. Analyzing Quadratic Unconstrained Binary Optimization Problems Via Multicommodity Flows

    PubMed Central

    Wang, Di; Kleinberg, Robert D.

    2009-01-01

Quadratic Unconstrained Binary Optimization (QUBO) problems concern the minimization of quadratic polynomials in n binary ({0, 1}-valued) variables. These problems are NP-complete, but prior work has identified a sequence of polynomial-time computable lower bounds on the minimum value, denoted by C2, C3, C4, …. It is known that C2 can be computed by solving a maximum-flow problem, whereas the only previously known algorithms for computing Ck (k > 2) require solving a linear program. In this paper we prove that C3 can be computed by solving a maximum multicommodity flow problem in a graph constructed from the quadratic function. In addition to providing a lower bound on the minimum value of the quadratic function on {0, 1}^n, this multicommodity flow problem also provides some information about the coordinates of the point where this minimum is achieved. By looking at the edges that are never saturated in any maximum multicommodity flow, we can identify relational persistencies: pairs of variables that must have the same or different values in any minimizing assignment. We furthermore show that all of these persistencies can be detected by solving single-commodity flow problems in the same network. PMID:20161596
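
    For readers unfamiliar with the problem class, the following sketch merely states the object being bounded: it defines a tiny QUBO instance and minimizes it by exhaustive search. It does not implement the paper's multicommodity-flow computation of C3, and the matrix is an invented example.

      import itertools
      import numpy as np

      def qubo_min(Q):
          """Brute-force minimum of x^T Q x over x in {0,1}^n.
          Exponential in n; usable only on tiny illustrative instances."""
          n = Q.shape[0]
          best = (np.inf, None)
          for bits in itertools.product((0, 1), repeat=n):
              x = np.array(bits)
              best = min(best, (x @ Q @ x, bits))
          return best

      Q = np.array([[ 1, -2,  0],
                    [ 0,  1, -2],
                    [ 0,  0,  1]])
      print(qubo_min(Q))  # exact value that the bounds C2, C3, ... sit below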

  20. Special Education Provision in Nigeria: Analyzing Contexts, Problems, and Prospects

    ERIC Educational Resources Information Center

    Obiakor, Festus E.; Offor, MaxMary Tabugbo

    2011-01-01

    Nigeria has made some efforts to educate all of its citizenry, including those with disabilities. And, it has struggled to make sure that programs are available to those who need them. However, its traditional, sociocultural, and educational problems have prevented some programmatic consistency and progress. As a result, the special education…

  1. Program for Analyzing Flows in a Complex Network

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
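
    The abstract's numerical strategy can be illustrated generically. The sketch below shows a plain Newton-Raphson iteration for a nonlinear residual system of the kind such network solvers assemble; the toy one-unknown "branch equation" and all names are our own illustration, not GFSSP code.

      import numpy as np

      def newton_raphson(residual, jacobian, x0, tol=1e-8, max_iter=50):
          """Solve F(x) = 0 by Newton-Raphson: x <- x - J(x)^-1 F(x)."""
          x = np.asarray(x0, float)
          for _ in range(max_iter):
              F = residual(x)
              if np.linalg.norm(F) < tol:
                  return x
              x = x - np.linalg.solve(jacobian(x), F)
          raise RuntimeError("Newton-Raphson did not converge")

      # Toy network: one unknown pressure p, square-law branch, demand 2.0
      residual = lambda p: np.array([np.sqrt(abs(p[0])) - 2.0])
      jacobian = lambda p: np.array([[0.5 / np.sqrt(abs(p[0]))]])
      print(newton_raphson(residual, jacobian, [1.0]))  # -> [4.]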

  2. Problem in analyzing cystine stones using FTIR spectroscopy.

    PubMed

    Fazil Marickar, Y M; Lekshmi, P R; Varma, Luxmi; Koshy, Peter

    2009-10-01

Cystine stones are produced by an inherited disorder of the transport of the amino acid cystine that results in excess cystine in the urine (cystinuria). Cystine calculi in the urinary tract present a significant problem in patients. We have recorded that cystine calculi are very uncommon in our region, and cystine crystals are rarely identified in urinary deposits. Recognizing cystine by FTIR as a component in mixed stones is a significant problem, compounded by the similarity of cystine's absorption wavelengths to those of whewellite and uric acid. The objective of this paper is to elucidate the problems of identifying cystine in stone analysis and to identify a solution that overcomes this deficiency. Out of 1,300 urinary stones analysed by ordinary wet chemical methods and infrared spectroscopy, 30 samples reported to have significant cystine peaks were selected. These samples were powdered, mixed with potassium bromide, pelletized, and taken up for FTIR analysis. The wavelength patterns were scrutinized by comparison with the peaks obtained from reference standards of cystine; spectra were also obtained from pure cystine and compared with those of whewellite and uric acid. The samples were then taken for scanning electron microscopy with elemental distribution analysis by X-ray (SEM-EDAX): they were made conductive by gold sputtering and fed into a JEOL JSM 35 C SEM machine, morphology was recorded photographically, and elemental distribution analysis (EDAX) was carried out to identify the elemental composition. All 30 samples taken up for FTIR analysis showed spectra identifiable with the reference peaks for cystine; however, when these peaks were compared with those of whewellite and uric acid, all the stone samples showed duplication of peaks for whewellite and uric acid. The pure cystine spectra showed identifiable peaks in the range of 3026, 1618

  3. Complex Problem Solving--More than Reasoning?

    ERIC Educational Resources Information Center

    Wustenberg, Sascha; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This study investigates the internal structure and construct validity of Complex Problem Solving (CPS), which is measured by a "Multiple-Item-Approach." It is tested, if (a) three facets of CPS--"rule identification" (adequateness of strategies), "rule knowledge" (generated knowledge) and "rule application" (ability to control a system)--can be…

  4. [Analyzing the risk problem in couples with serodiscordance].

    PubMed

    de Amorim, Camila Miranda; Szapiro, Ana Maria

    2008-01-01

This article reports an investigation of people who live in a situation of partner serodiscordance, that is, in partnerships in which one partner is HIV positive and the other is HIV negative. The study is aimed at understanding how these people deal with the constant risk this situation involves. Fifteen individuals living with serodiscordant partners were interviewed in the Testing and Counseling Center of the São Francisco de Assis School Hospital of the Federal University of Rio de Janeiro. The fear of transmitting HIV to the seronegative partner is constant. Besides the fear, there are difficulties in talking about the problem, planning the future, and keeping a satisfactory sexual life. Condom use does not seem to be an easily adopted practice. The interviewees point to other factors that need to be taken into consideration beyond safe sexual practices and knowledge of the forms of HIV transmission. Such factors seem to depend much more directly on the capacity of the partners to construct a new couple identity in the face of a risk situation. The risk of infection always lies in another being; paradoxically, in this case the risk of infection comes from someone so close that the continuation of the partnership itself depends on this other being.

  5. Percutaneous transhepatic management of complex biliary problems.

    PubMed Central

    Zuidema, G D; Cameron, J L; Sitzmann, J V; Kadir, S; Smith, G W; Kaufman, S L; White, R I

    1983-01-01

A series of 27 patients with complex biliary problems secondary to previous biliary operations is presented. The patients are divided into two groups: (1) patients with acute perioperative biliary problems; all had biliary leak with abscess, biliary cutaneous fistula, and/or stricture following cholecystectomy or common duct exploration and (2) patients with chronic postoperative biliary problems; all had previous repair of biliary stricture or injuries with late stricture formation. Early management of all patients included placement of a percutaneous biliary stent. Abscesses were drained operatively, and biliary leaks or fistulas were allowed to close spontaneously. Jaundice and cholangitis were allowed to resolve. Following stabilization, management of stricture, if present, was addressed. Eight acute patients had strictures, of which four were partial and three were dilated percutaneously. Four were complete and required operative repair. All 12 chronic patients had strictures, of which six were partial and successfully managed with percutaneous dilatation. Four patients also had common duct stones which were successfully crushed percutaneously. The authors conclude that percutaneous transhepatic drainage offers significant advantages in the early stabilization and treatment of patients with complex biliary problems, and that partial strictures of the biliary tree may be managed successfully by percutaneous dilatation. PMID:6847278

  6. Refined scale-dependent permutation entropy to analyze systems complexity

    NASA Astrophysics Data System (ADS)

    Wu, Shuen-De; Wu, Chiu-Wen; Humeau-Heurtier, Anne

    2016-05-01

Multiscale entropy (MSE) has become a prevailing method to quantify the complexity of systems. Unfortunately, MSE has a temporal complexity of O(N^2), which is unrealistic for long time series. Moreover, MSE relies on the sample entropy computation, which is length-dependent and leads to large variance and possibly undefined entropy values for short time series. Here, we introduce a new multiscale complexity measure, the refined scale-dependent permutation entropy (RSDPE). Through the processing of different kinds of synthetic data and real signals, we show that RSDPE behaves much like MSE while having a temporal complexity of only O(N). Furthermore, RSDPE has the advantage of being much less length-dependent than MSE. From all this, we conclude that RSDPE outperforms MSE in terms of computational cost and computational accuracy.
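
    RSDPE itself is the authors' refinement, but it rests on ordinal patterns. As a point of reference, here is a minimal sketch of plain single-scale permutation entropy, which shares the O(N) cost the abstract emphasizes; the normalization to [0, 1] is one common convention, not necessarily the paper's.

      import math
      import numpy as np

      def permutation_entropy(x, order=3, delay=1):
          """Normalized (Bandt-Pompe) permutation entropy of a 1-D series:
          Shannon entropy of the ordinal-pattern frequencies."""
          x = np.asarray(x)
          n = len(x) - (order - 1) * delay
          counts = {}
          for i in range(n):
              pattern = tuple(np.argsort(x[i:i + order * delay:delay]))
              counts[pattern] = counts.get(pattern, 0) + 1
          p = np.array(list(counts.values())) / n
          return -np.sum(p * np.log2(p)) / math.log2(math.factorial(order))

      rng = np.random.default_rng(0)
      print(permutation_entropy(rng.normal(size=10000)))             # near 1: noise
      print(permutation_entropy(np.sin(np.linspace(0, 60, 10000))))  # low: regular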

  7. Analyzing Complex and Structured Data via Unsupervised Learning Techniques

    NASA Astrophysics Data System (ADS)

    Polsterer, Kai Lars; Gieseke, Fabian; Gianniotis, Nikos; Kügler, Dennis

    2015-08-01

In recent decades, more and more dedicated all-sky surveys have created an enormous amount of data that is publicly available on the internet. The resulting datasets contain spatial, spectral, and temporal information exhibiting complex structures in their respective domains. The capability to deal with morphological features, spectral signatures, and complex time-series data has become very important but is still a challenging task. A common approach when processing this kind of structured data is to extract representative features and use those for further analysis. We present unsupervised learning approaches that help to visualize and cluster these complex data sets, e.g., by deriving rotation- and translation-invariant prototypes or by capturing the latent dynamics of time series directly with echo state networks rather than extracted features.

  8. Fractal applications to complex crustal problems

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1989-01-01

Complex scale-invariant problems obey fractal statistics. The basic definition of a fractal distribution is that the number of objects with a characteristic linear dimension greater than r satisfies the relation N ~ r^(-D), where D is the fractal dimension. Fragmentation often satisfies this relation. The distribution of earthquakes satisfies this relation. The classic relationship between the length of a rocky coast line and the step length can be derived from this relation. Power law relations for spectra can also be related to fractal dimensions. Topography and gravity are examples. Spectral techniques can be used to obtain maps of fractal dimension and roughness amplitude. These provide a quantitative measure of texture analysis. It is argued that the distribution of stress and strength in a complex crustal region, such as the Alps, is fractal. Based on this assumption, the observed frequency-magnitude relation for the seismicity in the region can be derived.
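
    The defining relation invites a one-line estimate: taking logarithms of N ~ r^(-D) gives log N = const - D log r, so D is minus the slope of a log-log fit. The sketch below does exactly that on synthetic fragment counts; the data and constant are invented for illustration.

      import numpy as np

      def fractal_dimension(r, N):
          """Estimate D in N(r) ~ r**(-D) from a log-log least-squares fit."""
          slope, _ = np.polyfit(np.log(r), np.log(N), 1)
          return -slope

      r = np.array([1.0, 2.0, 4.0, 8.0, 16.0])  # characteristic sizes
      N = 1000.0 * r ** -2.5                    # synthetic counts with D = 2.5
      print(fractal_dimension(r, N))            # ~2.5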

  9. New Approach to Analyzing Physics Problems: A Taxonomy of Introductory Physics Problems

    ERIC Educational Resources Information Center

    Teodorescu, Raluca E.; Bennhold, Cornelius; Feldman, Gerald; Medsker, Larry

    2013-01-01

    This paper describes research on a classification of physics problems in the context of introductory physics courses. This classification, called the Taxonomy of Introductory Physics Problems (TIPP), relates physics problems to the cognitive processes required to solve them. TIPP was created in order to design educational objectives, to develop…

  10. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should be always conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  11. System and method for modeling and analyzing complex scenarios

    DOEpatents

    Shevitz, Daniel Wolf

    2013-04-09

An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of said nodes and tracks the state of at least one variable through a branch. When a variable of said branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.

  12. MatOFF: a tool for analyzing behaviorally complex neurophysiological experiments.

    PubMed

    Genovesio, Aldo; Mitz, Andrew R

    2007-09-15

    The simple operant conditioning originally used in behavioral neurophysiology 30 years ago has given way to complex and sophisticated behavioral paradigms; so much so, that early general purpose programs for analyzing neurophysiological data are ill-suited for complex experiments. The trend has been to develop custom software for each class of experiment, but custom software can have serious drawbacks. We describe here a general purpose software tool for behavioral and electrophysiological studies, called MatOFF, that is especially suited for processing neurophysiological data gathered during the execution of complex behaviors. Written in the MATLAB programming language, MatOFF solves the problem of handling complex analysis requirements in a unique and powerful way. While other neurophysiological programs are either a loose collection of tools or append MATLAB as a post-processing step, MatOFF is an integrated environment that supports MATLAB scripting within the event search engine safely isolated in a programming sandbox. The results from scripting are stored separately, but in parallel with the raw data, and thus available to all subsequent MatOFF analysis and display processing. An example from a recently published experiment shows how all the features of MatOFF work together to analyze complex experiments and mine neurophysiological data in efficient ways.

  13. Network-Thinking: Graphs to Analyze Microbial Complexity and Evolution

    PubMed Central

    Corel, Eduardo; Lopez, Philippe; Méheust, Raphaël; Bapteste, Eric

    2016-01-01

    The tree model and tree-based methods have played a major, fruitful role in evolutionary studies. However, with the increasing realization of the quantitative and qualitative importance of reticulate evolutionary processes, affecting all levels of biological organization, complementary network-based models and methods are now flourishing, inviting evolutionary biology to experience a network-thinking era. We show how relatively recent comers in this field of study, that is, sequence-similarity networks, genome networks, and gene families–genomes bipartite graphs, already allow for a significantly enhanced usage of molecular datasets in comparative studies. Analyses of these networks provide tools for tackling a multitude of complex phenomena, including the evolution of gene transfer, composite genes and genomes, evolutionary transitions, and holobionts. PMID:26774999

  14. Network-Thinking: Graphs to Analyze Microbial Complexity and Evolution.

    PubMed

    Corel, Eduardo; Lopez, Philippe; Méheust, Raphaël; Bapteste, Eric

    2016-03-01

    The tree model and tree-based methods have played a major, fruitful role in evolutionary studies. However, with the increasing realization of the quantitative and qualitative importance of reticulate evolutionary processes, affecting all levels of biological organization, complementary network-based models and methods are now flourishing, inviting evolutionary biology to experience a network-thinking era. We show how relatively recent comers in this field of study, that is, sequence-similarity networks, genome networks, and gene families-genomes bipartite graphs, already allow for a significantly enhanced usage of molecular datasets in comparative studies. Analyses of these networks provide tools for tackling a multitude of complex phenomena, including the evolution of gene transfer, composite genes and genomes, evolutionary transitions, and holobionts.

  15. Analyzing complex networks through correlations in centrality measurements

    NASA Astrophysics Data System (ADS)

    Furlan Ronqui, José Ricardo; Travieso, Gonzalo

    2015-05-01

Many real world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example, we also show that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network.
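
    A centrality correlation profile of this kind can be assembled from standard library calls. The sketch below uses networkx with four common centralities and Pearson coefficients; the particular measure set and correlation statistic are our assumptions, not necessarily those of the paper.

      import networkx as nx
      import numpy as np

      def centrality_correlation_profile(G):
          """Pearson correlations between all pairs of selected centralities."""
          measures = {
              "degree": nx.degree_centrality(G),
              "betweenness": nx.betweenness_centrality(G),
              "closeness": nx.closeness_centrality(G),
              "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
          }
          nodes = list(G)
          vectors = {m: np.array([c[n] for n in nodes]) for m, c in measures.items()}
          names = list(vectors)
          return {(a, b): np.corrcoef(vectors[a], vectors[b])[0, 1]
                  for i, a in enumerate(names) for b in names[i + 1:]}

      G = nx.connected_watts_strogatz_graph(200, 6, 0.1, seed=42)
      for pair, rho in centrality_correlation_profile(G).items():
          print(pair, round(rho, 2))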

  16. Analyzing complex gaze behavior in the natural world

    NASA Astrophysics Data System (ADS)

    Pelz, Jeff B.; Kinsman, Thomas B.; Evans, Karen M.

    2011-03-01

    The history of eye-movement research extends back at least to 1794, when Erasmus Darwin (Charles' grandfather) published Zoonomia, including descriptions of eye movements due to self-motion. But research on eye movements was restricted to the laboratory for 200 years, until Michael Land built the first wearable eyetracker at the University of Sussex and published the seminal paper "Where we look when we steer" [1]. In the intervening centuries, we learned a tremendous amount about the mechanics of the oculomotor system and how it responds to isolated stimuli, but virtually nothing about how we actually use our eyes to explore, gather information, navigate, and communicate in the real world. Inspired by Land's work, we have been working to extend knowledge in these areas by developing hardware, algorithms, and software that have allowed researchers to ask questions about how we actually use vision in the real world. Central to that effort are new methods for analyzing the volumes of data that come from the experiments made possible by the new systems. We describe a number of recent experiments and SemantiCode, a new program that supports assisted coding of eye-movement data collected in unrestricted environments.

  17. Parameterized Complexity of Eulerian Deletion Problems.

    PubMed

    Cygan, Marek; Marx, Dániel; Pilipczuk, Marcin; Pilipczuk, Michał; Schlotter, Ildikó

    2014-01-01

    We study a family of problems where the goal is to make a graph Eulerian, i.e., connected and with all the vertices having even degrees, by a minimum number of deletions. We completely classify the parameterized complexity of various versions: undirected or directed graphs, vertex or edge deletions, with or without the requirement of connectivity, etc. The collection of results shows an interesting contrast: while the node-deletion variants remain intractable, i.e., W[1]-hard for all the studied cases, edge-deletion problems are either fixed-parameter tractable or polynomial-time solvable. Of particular interest is a randomized FPT algorithm for making an undirected graph Eulerian by deleting the minimum number of edges, based on a novel application of the color coding technique. For versions that remain NP-complete but fixed-parameter tractable we consider also possibilities of polynomial kernelization; unfortunately, we prove that this is not possible unless NP⊆coNP/poly. PMID:24415818

  18. Hybrid techniques for complex aerospace electromagnetics problems

    NASA Technical Reports Server (NTRS)

    Aberle, Jim

    1993-01-01

    Important aerospace electromagnetics problems include the evaluation of antenna performance on aircraft and the prediction and control of the aircraft's electromagnetic signature. Due to the ever increasing complexity and expense of aircraft design, aerospace engineers have become increasingly dependent on computer solutions. Traditionally, computational electromagnetics (CEM) has relied primarily on four disparate techniques: the method of moments (MoM), the finite-difference time-domain (FDTD) technique, the finite element method (FEM), and high frequency asymptotic techniques (HFAT) such as ray tracing. Each of these techniques has distinct advantages and disadvantages, and no single technique is capable of accurately solving all problems of interest on computers that are available now or will be available in the foreseeable future. As a result, new approaches that overcome the deficiencies of traditional techniques are beginning to attract a great deal of interest in the CEM community. Among these new approaches are hybrid methods which combine two or more of these techniques into a coherent model. During the ASEE Summer Faculty Fellowship Program a hybrid FEM/MoM computer code was developed and applied to a geometry containing features found on many modern aircraft.

  19. Analyzing Problems in Schools and School Systems: A Theoretical Approach. Topics in Educational Leadership.

    ERIC Educational Resources Information Center

    Gaynor, Alan Kibbe

    This book is directed toward students in organizational-theory and problem-analysis classes and their professors, as well as school administrators seeking to examine their problems and policies from new perspectives. It explains and illustrates methodology for describing, documenting, and analyzing organizational problems. Part I, "Methodology,"…

  20. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    NASA Technical Reports Server (NTRS)

    Jackson, C. E., Jr.

    1977-01-01

A sample problem library containing 20 problems covering most facets of NASTRAN Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  1. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases, and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  2. Team-Based Complex Problem Solving: A Collective Cognition Perspective

    ERIC Educational Resources Information Center

    Hung, Woei

    2013-01-01

    Today, much problem solving is performed by teams, rather than individuals. The complexity of these problems has exceeded the cognitive capacity of any individual and requires a team of members to solve them. The success of solving these complex problems not only relies on individual team members who possess different but complementary expertise,…

  3. Spectral methods for problems in complex geometries

    NASA Technical Reports Server (NTRS)

    Orszag, S. A.

    1979-01-01

    Techniques that permit the efficient application of spectral methods to solve problems in nearly arbitrary geometries are presented. These methods were found to be viable alternatives to finite difference and finite element processes. The spectral methods applied are extensions of the standard techniques of separation of variables to the solution of arbitrarily complicated problems.

  4. Managing Complex Problems in Rangeland Ecosystems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Management of rangelands, and natural resources in general, has become increasingly complex. There is an atmosphere of increasing expectations for conservation efforts associated with a variety of issues from water quality to endangered species. We argue that many current issues are complex by their...

  5. Complex partial status epilepticus: a recurrent problem.

    PubMed Central

    Cockerell, O C; Walker, M C; Sander, J W; Shorvon, S D

    1994-01-01

    Twenty patients with complex partial status epilepticus were identified retrospectively from a specialist neurology hospital. Seventeen patients experienced recurrent episodes of complex partial status epilepticus, often occurring at regular intervals, usually over many years, and while being treated with effective anti-epileptic drugs. No unifying cause for the recurrences, and no common epilepsy aetiologies, were identified. In spite of the frequency of recurrence and length of history, none of the patients showed any marked evidence of cognitive or neurological deterioration. Complex partial status epilepticus is more common than is generally recognised, should be differentiated from other forms of non-convulsive status, and is often difficult to treat. PMID:8021671

  6. Solving the Inverse-Square Problem with Complex Variables

    ERIC Educational Resources Information Center

    Gauthier, N.

    2005-01-01

    The equation of motion for a mass that moves under the influence of a central, inverse-square force is formulated and solved as a problem in complex variables. To find the solution, the constancy of angular momentum is first established using complex variables. Next, the complex position coordinate and complex velocity of the particle are assumed…
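
    The conservation step the abstract outlines can be reconstructed in a few lines of LaTeX; this is our sketch of the standard argument, writing the position as z = x + iy and using mu for the force constant (notation ours, not necessarily the paper's).

      % Equation of motion in complex form and angular momentum per unit mass:
      %   \ddot{z} = -\mu\, z / |z|^{3}, \qquad \ell \equiv \operatorname{Im}(\bar{z}\,\dot{z})
      \frac{d}{dt}\bigl(\bar{z}\,\dot{z}\bigr)
        = \dot{\bar{z}}\,\dot{z} + \bar{z}\,\ddot{z}
        = |\dot{z}|^{2} - \frac{\mu}{|z|}
      % The right-hand side is real, so d\ell/dt = Im( d(\bar{z}\dot{z})/dt ) = 0:
      % the angular momentum \ell is a constant of the motion.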

  7. Building an information model (with the help of PSL/PSA). [Problem Statement Language/Problem Statement Analyzer

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Farny, A. M.

    1983-01-01

    Problem Statement Language/Problem Statement Analyzer (PSL/PSA) applications, which were once a one-step process in which product system information was immediately translated into PSL statements, have in light of experience been shown to result in inconsistent representations. These shortcomings have prompted the development of an intermediate step, designated the Product System Information Model (PSIM), which provides a basis for the mutual understanding of customer terminology and the formal, conceptual representation of that product system in a PSA data base. The PSIM is initially captured as a paper diagram, followed by formal capture in the PSL/PSA data base.

  8. Explicitly solvable complex Chebyshev approximation problems related to sine polynomials

    NASA Technical Reports Server (NTRS)

    Freund, Roland

    1989-01-01

    Explicitly solvable real Chebyshev approximation problems on the unit interval are typically characterized by simple error curves. A similar principle is presented for complex approximation problems with error curves induced by sine polynomials. As an application, some new explicit formulae for complex best approximations are derived.

  9. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    NASA Astrophysics Data System (ADS)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influences the risk of the failure of a single institution triggering a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposures) in the derivatives market between 202 financial intermediaries based in the USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital), but it is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.
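
    The contagion mechanism described here can be caricatured in a few lines. The sketch below runs a toy threshold cascade on a small directed exposure network; it is not the authors' empirical model of the OTC derivatives data, and all numbers are invented.

      import networkx as nx

      def default_cascade(G, capital, exposure, seed):
          """Propagate defaults: when a bank fails, each creditor writes off
          its exposure to it and fails once losses exceed its capital."""
          failed, frontier = {seed}, [seed]
          losses = {bank: 0.0 for bank in G}
          while frontier:
              bank = frontier.pop()
              for creditor in G.predecessors(bank):  # lenders to the failed bank
                  if creditor in failed:
                      continue
                  losses[creditor] += exposure[(creditor, bank)]
                  if losses[creditor] > capital[creditor]:
                      failed.add(creditor)
                      frontier.append(creditor)
          return failed

      # Edge (a, b): bank a holds an exposure to bank b
      G = nx.DiGraph([("A", "B"), ("B", "C"), ("A", "C")])
      exposure = {edge: 1.0 for edge in G.edges}
      capital = {"A": 1.5, "B": 0.5, "C": 0.5}
      print(default_cascade(G, capital, exposure, seed="C"))  # {'C', 'B', 'A'}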

  10. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…

  11. Analyzing the Responses of 7-8 Year Olds When Solving Partitioning Problems

    ERIC Educational Resources Information Center

    Badillo, Edelmira; Font, Vicenç; Edo, Mequè

    2015-01-01

    We analyze the mathematical solutions of 7- to 8-year-old pupils while individually solving an arithmetic problem. The analysis was based on the "configuration of objects," an instrument derived from the onto-semiotic approach to mathematical knowledge. Results are illustrated through a number of cases. From the analysis of mathematical…

  12. Multigrid Methods for Aerodynamic Problems in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Caughey, David A.

    1995-01-01

    Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.

  13. Complex Mathematical Problem Solving by Individuals and Dyads.

    ERIC Educational Resources Information Center

    Vye, Nancy J.; Goldman, Susan R.; Voss, James F.; Hmelo, Cindy; Williams, Susan; Cognition and Technology Group at Vanderbilt University

    1997-01-01

    Describes two studies of mathematical problem solving using an episode from "The Adventures of Jasper Woodbury," a set of curriculum materials that afford complex problem-solving opportunities. Discussion focuses on characteristics of problems that make solutions difficult, kinds of reasoning that dyadic interactions support, and considerations of…

  14. Preparing for Complexity and Wicked Problems through Transformational Learning Approaches

    ERIC Educational Resources Information Center

    Yukawa, Joyce

    2015-01-01

    As the information environment becomes increasingly complex and challenging, Library and Information Studies (LIS) education is called upon to nurture innovative leaders capable of managing complex situations and "wicked problems." While disciplinary expertise remains essential, higher levels of mental complexity and adaptive…

  15. A New Approach to Analyzing the Cognitive Load in Physics Problems

    NASA Astrophysics Data System (ADS)

    Teodorescu, Raluca

    2010-02-01

I will present a Taxonomy of Introductory Physics Problems (TIPP), which relates physics problems to the cognitive processes and the knowledge required to solve them. TIPP was created for designing and clarifying educational objectives, for developing assessments to evaluate components of the problem-solving process, and for guiding curriculum design in introductory physics courses. To construct TIPP, I considered processes that have been identified either by cognitive science and expert-novice research or by direct observation of students' behavior while solving physics problems. Based on Marzano and Kendall's taxonomy [1], I developed a procedure to classify physics problems according to the cognitive processes that they involve and the knowledge to which they refer. The procedure is applicable to any physics problem and its validity and reliability have been confirmed. This algorithm was then used to build TIPP, which is a database that contains text-based and research-based physics problems and explains their relationship to cognitive processes and knowledge. TIPP has been used in the years 2006--2009 to reform the first semester of the introductory algebra-based physics course at The George Washington University. The reform targeted students' cognitive development and attitudes improvement. The methodology employed in the course involves exposing students to certain types of problems in a variety of contexts with increasing complexity. To assess the effectiveness of our approach, rubrics were created to evaluate students' problem-solving abilities and the Colorado Learning Attitudes about Science Survey (CLASS) was administered pre- and post-instruction to determine students' shift in dispositions towards learning physics. Our results show definitive gains in the areas targeted by our curricular reform. [1] R.J. Marzano and J.S. Kendall, The New Taxonomy of Educational Objectives, 2nd Ed. (Corwin Press, Thousand Oaks, 2007).

  16. The complex problem of sensitive skin.

    PubMed

    Marriott, Marie; Holmes, Jo; Peters, Lisa; Cooper, Karen; Rowson, Matthew; Basketter, David A

    2005-08-01

There exist within the population subsets of individuals who display heightened skin reactivity to materials the majority find tolerable. In a series of investigations, we have examined interrelationships between many of the endpoints associated with the term 'sensitive skin'. In the most recent work, 58 volunteers were treated with 10% lactic acid, 50% ethanol, 0.5% menthol and 1.0% capsaicin on the nasolabial fold, unoccluded, with sensory reactions recorded at 2.5 min, 5 min and 8 min after application. Urticant susceptibility was evaluated with 1 M benzoic acid and 125 mM trans-cinnamic acid applied to the volar forearm for 20 min. A 2 x 23-h patch test was also conducted using 0.1% and 0.3% sodium dodecyl sulfate, 0.3% and 0.6% cocamidopropyl betaine and 0.1% and 0.2% benzalkonium chloride to determine irritant susceptibility. As found in previous studies, increased susceptibility to one endpoint was not predictive of sensitivity to another. In our experience, nasolabial stinging was a poor predictor of general skin sensitivity. Nevertheless, it may be possible to identify in the normal population individuals who, coincidentally, are more generally sensitive to a range of non-immunologic adverse skin reactions. Whether such individuals are those who experience problems with skin care products remains to be addressed. PMID:16033403

  17. Completed Beltrami-Michell formulation for analyzing mixed boundary value problems in elasticity

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Kaljevic, Igor; Hopkins, Dale A.; Saigal, Sunil

    1995-01-01

    In elasticity, the method of forces, wherein stress parameters are considered as the primary unknowns, is known as the Beltrami-Michell formulation (BMF). The existing BMF can only solve stress boundary value problems; it cannot handle the more prevalent displacement or mixed boundary value problems of elasticity. Therefore, this formulation, which has restricted application, could not become a true alternative to Navier's displacement method, which can solve all three types of boundary value problems. The restrictions in the BMF have been alleviated by augmenting the classical formulation with a novel set of conditions identified as the boundary compatibility conditions. This new method, which completes the classical force formulation, has been termed the completed Beltrami-Michell formulation (CBMF). The CBMF can solve general elasticity problems with stress, displacement, and mixed boundary conditions in terms of stresses as the primary unknowns. The CBMF is derived from the stationary condition of the variational functional of the integrated force method. In the CBMF, stresses for kinematically stable structures can be obtained without any reference to the displacements either in the field or on the boundary. This paper presents the CBMF and its derivation from the variational functional of the integrated force method. Several examples are presented to demonstrate the applicability of the completed formulation for analyzing mixed boundary value problems under thermomechanical loads. Selected example problems include a cylindrical shell wherein membrane and bending responses are coupled, and a composite circular plate.

  18. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 1. [thermal analyzer manual

    NASA Technical Reports Server (NTRS)

    Lee, H. P.

    1977-01-01

    The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.

  19. DNA computing, computation complexity and problem of biological evolution rate.

    PubMed

    Melkikh, Alexey V

    2008-12-01

    An analogy between the evolution of organisms and some complex computational problems (cryptosystem cracking, determination of the shortest path in a graph) is considered. It is shown that in the absence of a priori information about possible species of organisms such a problem is complex (it lies in the class NP) and cannot be solved in a polynomial number of steps. This conclusion suggests the need for re-examination of evolution mechanisms. Ideas of a deterministic approach to evolution are discussed.

  20. Reviewing the impact of problem structure on planning: a software tool for analyzing tower tasks.

    PubMed

    Kaller, Christoph P; Rahm, Benjamin; Köstering, Lena; Unterrainer, Josef M

    2011-01-01

    Cognitive, clinical, and neuroimaging studies on planning abilities most frequently implement the Tower of London task or one of its variants. Yet, accumulating evidence from a series of experiments suggests that the commonly used approximation of problem difficulty in terms of the minimum number of moves for goal attainment is too coarse a measure for the underlying cognitive operations, and in some cases may be even misleading. Rather, problem difficulty can be more specifically characterized by a set of structural task parameters such as the number and nature of optimal and suboptimal solution paths, the required search depths, the patterns of intermediate and goal moves, goal hierarchies and the associated degree of ambiguity in the sequential ordering of goal moves. First applications in developmental and patient studies have proven fruitful in targeting fundamental alterations of planning abilities in healthy and clinical populations. In addition, recent evidence from neuroimaging shows that manipulations of problem structure relate to separate cognitive and neural processes and are accompanied by dissociable brain activation patterns. Here, we briefly review these structural problem parameters and the concepts behind them. As controlling for task parameters and selecting a balanced problem set is a complex and error-prone endeavor, we further present TowerTool, a software solution that allows easy access to in-depth analysis of the problem structure of widely used planning tasks like the Tower of London, the Tower of Hanoi, and their variants. Thereby, we hope to encourage and facilitate the implementation of structurally balanced task sets in future studies on planning and to promote transfer between the cognitive, developmental, and clinical neurosciences. PMID:20723568
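
    Two of the structural parameters above, the optimal solution length and the number of distinct optimal solution paths, can be computed by exhaustive search of a tower task's state space. The following Python sketch is not TowerTool itself but a minimal illustration for the Tower of Hanoi, using breadth-first search; the three-disk configurations are illustrative.

        from collections import deque

        def hanoi_moves(state):
            """Yield legal successor positions; state[d] is the peg (0-2) of
            disk d, with disks indexed from smallest to largest."""
            for d, p in enumerate(state):
                if any(state[e] == p for e in range(d)):   # disk d is buried
                    continue
                for q in range(3):
                    if q != p and all(state[e] != q for e in range(d)):
                        yield state[:d] + (q,) + state[d + 1:]

        def min_moves_and_paths(start, goal):
            """Breadth-first search returning the optimal solution length and
            the number of distinct minimum-length solution paths."""
            dist, npaths = {start: 0}, {start: 1}
            frontier = deque([start])
            while frontier:
                s = frontier.popleft()
                for t in hanoi_moves(s):
                    if t not in dist:
                        dist[t], npaths[t] = dist[s] + 1, npaths[s]
                        frontier.append(t)
                    elif dist[t] == dist[s] + 1:
                        npaths[t] += npaths[s]
            return dist[goal], npaths[goal]

        # Three disks moved from peg 0 to peg 2: the classic 7-move problem,
        # which has exactly one optimal solution path.
        print(min_moves_and_paths((0, 0, 0), (2, 2, 2)))   # -> (7, 1)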

  1. Group Planning and Task Efficiency with Complex Problems. Final Report.

    ERIC Educational Resources Information Center

    Lawson, E. D.

    One hundred eighty 4-man groups (90 of men and 90 of women) using 3 types of net (All-Channel, Wheel and Circle) under 3 conditions (Planning Period (PP), Rest Period (RP) and Control) were run in a single session with 5 complex problems to determine whether a single 2-minute planning period after solution of the first problem would result in…

  2. Semantic Annotation of Complex Text Structures in Problem Reports

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Throop, David R.; Fleming, Land D.

    2011-01-01

    Text analysis is important for effective information retrieval from databases where the critical information is embedded in text fields. Aerospace safety depends on effective retrieval of relevant and related problem reports for the purpose of trend analysis. The complex text syntax in problem descriptions has limited statistical text mining of problem reports. The presentation describes an intelligent tagging approach that applies syntactic and then semantic analysis to overcome this problem. The tags identify types of problems and equipment that are embedded in the text descriptions. The power of these tags is illustrated in a faceted searching and browsing interface for problem report trending that combines automatically generated tags with database code fields and temporal information.

  3. Clinical Problem Analysis (CPA): A Systematic Approach To Teaching Complex Medical Problem Solving.

    ERIC Educational Resources Information Center

    Custers, Eugene J. F. M.; Robbe, Peter F. De Vries; Stuyt, Paul M. J.

    2000-01-01

    Discusses clinical problem analysis (CPA) in medical education, an approach to solving complex clinical problems. Outlines the five step CPA model and examines the value of CPA's content-independent (methodical) approach. Argues that teaching students to use CPA will enable them to avoid common diagnostic reasoning errors and pitfalls. Compares…

  4. Particle swarm optimization for complex nonlinear optimization problems

    NASA Astrophysics Data System (ADS)

    Alexandridis, Alex; Famelis, Ioannis Th.; Tsitouras, Charalambos

    2016-06-01

    This work presents the application of a technique belonging to evolutionary computation, namely particle swarm optimization (PSO), to complex nonlinear optimization problems. To be more specific, a PSO optimizer is set up and applied to the derivation of Runge-Kutta pairs for the numerical solution of initial value problems. The effect of critical PSO operational parameters on the performance of the proposed scheme is thoroughly investigated.
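
    For orientation, the core PSO update blends inertia with attraction toward each particle's personal best and the swarm's global best. The sketch below is a generic minimal PSO, not the authors' implementation; the parameter values are hypothetical and the Rosenbrock function stands in for their Runge-Kutta objective.

        import numpy as np

        def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
                bounds=(-5.0, 5.0), seed=0):
            """Minimize a scalar objective f with a basic particle swarm."""
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            x = rng.uniform(lo, hi, (n_particles, dim))   # positions
            v = np.zeros((n_particles, dim))              # velocities
            pbest = x.copy()                              # personal bests
            pbest_val = np.apply_along_axis(f, 1, x)
            g = pbest[pbest_val.argmin()].copy()          # global best
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                vals = np.apply_along_axis(f, 1, x)
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                g = pbest[pbest_val.argmin()].copy()
            return g, pbest_val.min()

        # Stand-in objective: the 2-D Rosenbrock function, minimum at (1, 1).
        rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
        print(pso(rosen, dim=2))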

  5. Theory of periodically specified problems: Complexity and approximability

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Rosenkrantz, D.J.

    1997-12-05

    We study the complexity and the efficient approximability of graph and satisfiability problems when specified using various kinds of periodic specifications. The general results obtained include the following: (1) We characterize the complexities of several basic generalized CNF satisfiability problems SAT(S) [Sc78] when instances are specified using various kinds of 1- and 2-dimensional periodic specifications. We outline how this characterization can be used to prove a number of new hardness results for the complexity classes DSPACE(n), NSPACE(n), DEXPTIME, NEXPTIME, EXPSPACE, etc. These results can be used to prove in a unified way the hardness of a number of combinatorial problems when instances are specified succinctly using the various succinct specifications considered in the literature. As one corollary, we show that a number of basic NP-hard problems become EXPSPACE-hard when inputs are represented using 1-dimensional infinite periodic wide specifications. This answers a long-standing open question posed by Orlin. (2) We outline a simple yet general technique to devise approximation algorithms with provable worst-case performance guarantees for a number of combinatorial problems specified periodically. Our efficient approximation algorithms and schemes are based on extensions of these ideas and represent the first non-trivial characterization of a class of problems having an {epsilon}-approximation (or PTAS) for periodically specified NEXPTIME-hard problems. Two properties of our results are: (i) for the first time, efficient approximation algorithms and schemes have been developed for natural NEXPTIME-complete problems; (ii) our results are the first polynomial time approximation algorithms with good performance guarantees for hard problems specified using the various kinds of periodic specifications considered in this paper.

  6. On the planetary and Milne problems in complex radiative transfer

    NASA Astrophysics Data System (ADS)

    Viik, T.

    2016-11-01

    In this paper we consider two classical problems in radiative transfer - the planetary and the Milne problems - in an isotropic homogeneous optically semi-infinite medium where the albedo of single scattering may be defined anywhere in the complex plane. It turns out that the method of approximating the kernel in the integral equation for the Sobolev resolvent function can be used even in such a case. This approach allows one to express almost all the relevant transfer functions for these problems in terms of simply determinable auxiliary functions.

  7. Investigating the Effect of Complexity Factors in Gas Law Problems

    ERIC Educational Resources Information Center

    Schuttlefield, Jennifer D.; Kirk, John; Pienta, Norbert J.; Tang, Hui

    2012-01-01

    Undergraduate students were asked to complete gas law questions using a Web-based tool as a first step in our understanding of the role of cognitive load in chemistry word questions and in helping us assess student problem-solving. Each question contained five different complexity factors, which were randomly assigned by the tool so that a…

  8. Olae: A Bayesian Performance Assessment for Complex Problem Solving.

    ERIC Educational Resources Information Center

    VanLehn, Kurt

    Olae is a computer system for assessing student knowledge of physics, and Newtonian mechanics in particular, using performance data collected while students solve complex problems. Although originally designed as a stand-alone system, it has also been used as part of the Andes intelligent tutoring system. Like many other performance assessment…

  9. What Do Employers Pay for Employees' Complex Problem Solving Skills?

    ERIC Educational Resources Information Center

    Ederer, Peer; Nedelkoska, Ljubica; Patt, Alexander; Castellazzi, Silvia

    2015-01-01

    We estimate the market value that employers assign to the complex problem solving (CPS) skills of their employees, using individual-level Mincer-style wage regressions. For the purpose of the study, we collected new and unique data using psychometric measures of CPS and an extensive background questionnaire on employees' personal and work history.…

  10. Application of NASA management approach to solve complex problems on earth

    NASA Technical Reports Server (NTRS)

    Potate, J. S.

    1972-01-01

    The application of NASA management approach to solving complex problems on earth is discussed. The management of the Apollo program is presented as an example of effective management techniques. Four key elements of effective management are analyzed. Photographs of the Cape Kennedy launch sites and supporting equipment are included to support the discussions.

  11. Finite difference solutions of heat conduction problems in multi-layered bodies with complex geometries

    NASA Technical Reports Server (NTRS)

    Masiulaniec, K. C.; Keith, T. G., Jr.; Dewitt, K. J.

    1984-01-01

    A numerical procedure is presented for analyzing a wide variety of heat conduction problems in multilayered bodies having complex geometry. The method is based on a finite difference solution of the heat conduction equation using a body fitted coordinate system transformation. Solution techniques are described for steady and transient problems with and without internal energy generation. Results are found to compare favorably with several well known solutions.
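
    As a much-simplified illustration of the finite-difference approach (one dimension, uniform grid, two layers, no body-fitted transformation, and no explicit interface flux matching; all material values are made up), an explicit time march for transient conduction can be sketched as follows.

        import numpy as np

        L, n = 0.02, 101                    # slab thickness [m], grid points
        x = np.linspace(0.0, L, n)
        dx = x[1] - x[0]
        # Thermal diffusivity per layer [m^2/s] (illustrative values).
        alpha = np.where(x < L / 2, 1.0e-5, 4.0e-6)
        dt = 0.4 * dx**2 / alpha.max()      # satisfies dt <= dx^2/(2*alpha)

        T = np.full(n, 300.0)               # initial temperature [K]
        T[0], T[-1] = 400.0, 300.0          # fixed boundary temperatures

        for _ in range(2000):               # explicit time march
            T[1:-1] += alpha[1:-1] * dt / dx**2 * (T[2:] - 2.0*T[1:-1] + T[:-2])

        print(T[::20])                      # coarse temperature profile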

  12. Emergent Science: Solving complex science problems via collaborations

    NASA Astrophysics Data System (ADS)

    Li, X.; Ramachandran, R.; Wilson, B. D.; Lynnes, C.; Conover, H.

    2009-12-01

    The recent advances in Cyberinfrastructure have democratized the use of computational and data resources. These resources, together with new social networking and collaboration technologies, present an unprecedented opportunity to impact the science process. These advances can move the science process from “circumspect science” -- where scientists publish only when the project is complete, publish only the final results, seldom publish things that did not work, and communicate results with each other using paper technology -- to “open science” -- where scientists can share and publish every element in their research, from the data used as input, to the workflows used to analyze these data sets, to possibly failed experiments, and the final results. Open science can foster novel ways of social collaboration in science. We are already seeing the impact of social collaboration in our daily lives. A simple example is the use of reviews posted online by other consumers when evaluating whether to buy a product. This phenomenon has been well documented and is referred to by many names, such as Smart Mobs, Wisdom of Crowds, Wikinomics, Crowdsourcing, We-Think and swarm collaboration. Similar social collaborations during the science process can lead to “emergent science”. We define “emergent science” as the way complex science problems can be solved and new research directions forged out of a multiplicity of relatively simple collaborative interactions. There are, however, barriers that prevent social collaboration within the science process. Some of these barriers are technical, such as the lack of science collaboration platforms; others are social. The success of any collaborative platform has to take into account the incentives or motivation for the scientists to participate. This presentation will address obstacles facing emergent science and will suggest possible solutions required to build a critical mass.

  13. The Complex Route to Success: Complex Problem-Solving Skills in the Prediction of University Success

    ERIC Educational Resources Information Center

    Stadler, Matthias J.; Becker, Nicolas; Greiff, Samuel; Spinath, Frank M.

    2016-01-01

    Successful completion of a university degree is a complex matter. Based on considerations regarding the demands of acquiring a university degree, the aim of this paper was to investigate the utility of complex problem-solving (CPS) skills in the prediction of objective and subjective university success (SUS). The key finding of this study was that…

  14. Complexity and efficient approximability of two dimensional periodically specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.

    1996-09-01

    The authors consider two dimensional periodic specifications: a method to specify succinctly objects with highly regular repetitive structure. These specifications arise naturally when processing engineering designs, including VLSI designs. These specifications can specify objects whose sizes are exponentially larger than the sizes of the specifications themselves. Consequently, solving a periodically specified problem by explicitly expanding the instance is prohibitively expensive in terms of computational resources. This leads one to investigate the complexity and efficient approximability of solving graph theoretic and combinatorial problems when instances are specified using two dimensional periodic specifications. They prove the following results: (1) several classical NP-hard optimization problems become NEXPTIME-hard when instances are specified using two dimensional periodic specifications; (2) in contrast, several of these NEXPTIME-hard problems have polynomial time approximation algorithms with guaranteed worst-case performance.

  15. How Humans Solve Complex Problems: The Case of the Knapsack Problem

    PubMed Central

    Murawski, Carsten; Bossaerts, Peter

    2016-01-01

    Life presents us with problems of varying complexity. Yet, complexity is not accounted for in theories of human decision-making. Here we study instances of the knapsack problem, a discrete optimisation problem commonly encountered at all levels of cognition, from attention gating to intellectual discovery. The complexity of this problem is well understood from the perspective of a mechanical device like a computer. We show experimentally that human performance, too, decreased with complexity as defined in computer science. Defying traditional economic principles, participants spent effort well beyond the point where marginal gain was positive, and economic performance increased with instance difficulty. Human attempts at solving the instances exhibited commonalities with algorithms developed for computers, although biological resource constraints (limited working and episodic memories) had a noticeable impact. Consistent with the very nature of the knapsack problem, only a minority of participants found the solution, often quickly, but the ones who did appeared not to realise it. Substantial heterogeneity emerged, suggesting why prizes and patents, schemes that incentivise intellectual discovery but discourage information sharing, have been found to be less effective than mechanisms that reveal private information, such as markets. PMID:27713516
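
    Although the 0/1 knapsack problem is NP-hard in general, individual instances of the size given to participants are solved exactly by the standard pseudo-polynomial dynamic program; a minimal sketch with made-up numbers follows.

        def knapsack(values, weights, capacity):
            """0/1 knapsack by dynamic programming: O(n * capacity) time."""
            best = [0] * (capacity + 1)   # best[c] = max value at capacity c
            for v, w in zip(values, weights):
                for c in range(capacity, w - 1, -1):  # descend: item used once
                    best[c] = max(best[c], best[c - w] + v)
            return best[capacity]

        # A small illustrative instance: the optimum picks items 2 and 4.
        print(knapsack(values=[10, 13, 7, 8], weights=[5, 6, 3, 4],
                       capacity=10))       # -> 21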

  16. [Problems of protecting complex inventions in the field of microbiology].

    PubMed

    Korovkin, V I

    1978-01-01

    Some problems are discussed that are connected with the protection of inventions in the field of microbiology when the invention is complex. The rights of the author are determined when a method and a product are to be protected at the same time. Additional juridical protection of a microbial strain is not necessary. Complex protection of a microbial strain together with the method of its utilization is recommended in certain cases, since it might prevent the conflicts that arise from the parallel juridical protection of a strain and the method of its utilization.

  17. Integrated Science: Providing a More Complete Understanding of Complex Problems

    USGS Publications Warehouse

    ,

    2006-01-01

    Integration among sciences is critical in order to address some of our most pressing problems. Because of the inherent complexity of natural systems, and the increasing complexity of human demands on them, narrowly-focused approaches are no longer sufficient. USGS Workshop on Enhancing Integrated Science, November 1998. The Mid-Continent Geographic Science Center is actively participating in several integrated science studies that include research partners from the other disciplines of the U.S. Geological Survey (USGS), other Federal and State agencies, universities, and private non-government organizations. The following three examples illustrate the diversity of these studies.

  18. A generalized topological entropy for analyzing the complexity of DNA sequences.

    PubMed

    Jin, Shuilin; Tan, Renjie; Jiang, Qinghua; Xu, Li; Peng, Jiajie; Wang, Yong; Wang, Yadong

    2014-01-01

    Topological entropy is one of the most difficult entropies to apply to DNA sequences, due to finite-sample and high-dimensionality problems. In order to overcome these problems, a generalized topological entropy is introduced. The relationship between the topological entropy and the generalized topological entropy is examined, showing that the topological entropy is a special case of the generalized entropy. As an application, the generalized topological entropy was computed for introns, exons and promoter regions. The results indicate that the entropy of introns is higher than that of exons, and the entropy of exons is higher than that of the promoter regions for each chromosome, which suggests that the DNA sequences of the promoter regions are more regular than those of exons and introns.
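
    For concreteness, a commonly used finite-sequence form of topological entropy counts the distinct n-mers in a prefix of length 4^n + n - 1 and normalizes by n. The sketch below implements that standard variant, not the paper's generalization.

        import math

        def topological_entropy(seq):
            """Finite-sequence topological entropy of a DNA string."""
            seq = seq.upper()
            # Largest n whose full 4**n n-mer vocabulary could fit the prefix.
            n = 1
            while 4 ** (n + 1) + n <= len(seq):
                n += 1
            prefix = seq[: 4 ** n + n - 1]
            kmers = {prefix[i : i + n] for i in range(len(prefix) - n + 1)}
            return math.log(len(kmers), 4) / n

        print(topological_entropy("ACGTACGTTGCAATGGCCTA" * 10))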

  19. Analyzing complex patients' temporal histories: new frontiers in temporal data mining.

    PubMed

    Sacchi, Lucia; Dagliati, Arianna; Bellazzi, Riccardo

    2015-01-01

    In recent years, data coming from hospital information systems (HIS) and local healthcare organizations have started to be intensively used for research purposes. This rising amount of available data allows reconstructing the compete histories of the patients, which have a strong temporal component. This chapter introduces the major challenges faced by temporal data mining researchers in an era when huge quantities of complex clinical temporal data are becoming available. The analysis is focused on the peculiar features of this kind of data and describes the methodological and technological aspects that allow managing such complex framework. The chapter shows how heterogeneous data can be processed to derive a homogeneous representation. Starting from this representation, it illustrates different techniques for jointly analyze such kind of data. Finally, the technological strategies that allow creating a common data warehouse to gather data coming from different sources and with different formats are presented.

  20. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.

  1. Binocular adaptive optics vision analyzer with full control over the complex pupil functions.

    PubMed

    Schwarz, Christina; Prieto, Pedro M; Fernández, Enrique J; Artal, Pablo

    2011-12-15

    We present a binocular adaptive optics vision analyzer fully capable of controlling both amplitude and phase of the two complex pupil functions in each eye of the subject. A special feature of the instrument is its comparatively simple setup. A single reflective liquid crystal on silicon spatial light modulator working in pure phase modulation generates the phase profiles for both pupils simultaneously. In addition, another liquid crystal spatial light modulator working in transmission operates in pure intensity modulation to produce a large variety of pupil masks for each eye. Subjects perform visual tasks through any predefined variations of the complex pupil function for both eyes. As an example of the system efficiency, we recorded images of the stimuli through the system as they were projected at the subject's retina. This instrument proves to be extremely versatile for designing and testing novel ophthalmic elements and simulating visual outcomes, as well as for further research of binocular vision.

  2. Analyzing networks of phenotypes in complex diseases: methodology and applications in COPD

    PubMed Central

    2014-01-01

    Background The investigation of complex disease heterogeneity has been challenging. Here, we introduce a network-based approach, using partial correlations, that analyzes the relationships among multiple disease-related phenotypes. Results We applied this method to two large, well-characterized studies of chronic obstructive pulmonary disease (COPD). We also examined the associations between these COPD phenotypic networks and other factors, including case-control status, disease severity, and genetic variants. Using these phenotypic networks, we have detected novel relationships between phenotypes that would not have been observed using traditional epidemiological approaches. Conclusion Phenotypic network analysis of complex diseases could provide novel insights into disease susceptibility, disease severity, and genetic mechanisms. PMID:24964944
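
    The partial-correlation step itself is standard: invert the covariance (precision) matrix and normalize, rho_ij = -p_ij / sqrt(p_ii * p_jj), then threshold into a network. A minimal sketch on synthetic "phenotypes" (the data and threshold are illustrative, not the paper's pipeline):

        import numpy as np

        def partial_correlations(X):
            """Partial correlations from the precision (inverse covariance)
            matrix: rho_ij = -p_ij / sqrt(p_ii * p_jj)."""
            prec = np.linalg.pinv(np.cov(X, rowvar=False))
            d = np.sqrt(np.diag(prec))
            rho = -prec / np.outer(d, d)
            np.fill_diagonal(rho, 1.0)
            return rho

        # Toy data: three "phenotypes" on 200 subjects; the first two share
        # a common latent factor, the third is independent.
        rng = np.random.default_rng(1)
        z = rng.normal(size=200)
        X = np.column_stack([z + rng.normal(size=200, scale=0.5),
                             z + rng.normal(size=200, scale=0.5),
                             rng.normal(size=200)])
        edges = np.abs(partial_correlations(X)) > 0.2   # adjacency matrix
        print(edges.astype(int))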

  3. COMPLEXITY & APPROXIMABILITY OF QUANTIFIED & STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    H. B. HUNT; M. V. MARATHE; R. E. STEARNS

    2001-06-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend the earlier results in [Pa85,LMP99,CF+93,CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97]. Keywords: NP-hardness; Approximation Algorithms; PSPACE-hardness; Quantified and Stochastic Constraint Satisfaction Problems.

  4. Data Mining and Complex Problems: Case Study in Composite Materials

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  5. Complexity and approximability of quantified and stochastic constraint satisfaction problems

    SciTech Connect

    Hunt, H. B.; Stearns, R. L.; Marathe, M. V.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S be an arbitrary finite set of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SAT{sub c}(S)). Here, we study simultaneously the complexity of, and the existence of efficient approximation algorithms for, a number of variants of the problems SAT(S) and SAT{sub c}(S), and for many different D, C, and S. These problem variants include decision and optimization problems, for formulas, quantified formulas, and stochastically-quantified formulas. We denote these problems by Q-SAT(S), MAX-Q-SAT(S), S-SAT(S), MAX-S-SAT(S), MAX-NSF-Q-SAT(S) and MAX-NSF-S-SAT(S). The main contribution is the development of a unified predictive theory for characterizing the complexity of these problems. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Let k {ge} 2. Let S be a finite set of finite-arity relations on {Sigma}{sub k} with the following condition on S: all finite-arity relations on {Sigma}{sub k} can be represented as finite existentially-quantified conjunctions of relations in S applied to variables (to variables and constant symbols in C). Then we prove the following new results: (1) The problems SAT(S) and SAT{sub c}(S) are both NQL-complete and {le}{sub logn}{sup bw}-complete for NP. (2) The problems Q-SAT(S) and Q-SAT{sub c}(S) are PSPACE-complete. Letting k = 2, the problems S-SAT(S) and S-SAT{sub c}(S) are PSPACE-complete. (3) {exists} {epsilon} > 0 for which approximating the problem MAX-Q-SAT(S) within {epsilon} times optimum is PSPACE-hard. Letting k = 2, {exists} {epsilon} > 0 for which approximating the problem MAX-S-SAT(S) within {epsilon} times optimum is PSPACE-hard. (4

  6. Complex saddle points and the sign problem in complex Langevin simulation

    NASA Astrophysics Data System (ADS)

    Hayata, Tomoya; Hidaka, Yoshimasa; Tanizaki, Yuya

    2016-10-01

    We show that complex Langevin simulation converges to a wrong result within the semiclassical analysis, by relating it to the Lefschetz-thimble path integral, when the path-integral weight has different phases among dominant complex saddle points. Equilibrium solution of the complex Langevin equation forms local distributions around complex saddle points. Its ensemble average approximately becomes a direct sum of the average in each local distribution, where relative phases among them are dropped. We propose that by taking these phases into account through reweighting, we can solve the wrong convergence problem. However, this prescription may lead to a recurrence of the sign problem in the complex Langevin method for quantum many-body systems.
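
    For orientation, the complex Langevin update is a simple stochastic discretization, z -> z - (dS/dz) dt + sqrt(2 dt) eta with real Gaussian noise eta. The sketch below runs it on a Gaussian toy action S(z) = a z^2 / 2 with complex a (parameters illustrative); this well-behaved case converges to the exact <z^2> = 1/a, unlike the multi-saddle cases analyzed in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        a = 1.0 + 0.5j                       # complex "mass" parameter
        dt, n_steps, n_therm = 1e-3, 200_000, 10_000

        z = 1.0 + 0.0j
        acc, count = 0.0 + 0.0j, 0
        for step in range(n_steps):
            # Drift -dS/dz = -a*z plus real noise of variance 2*dt.
            z += -a * z * dt + np.sqrt(2.0 * dt) * rng.normal()
            if step >= n_therm:              # skip thermalization
                acc += z * z
                count += 1

        print("simulated <z^2> =", acc / count, "; exact 1/a =", 1.0 / a)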

  7. Complexity of hierarchically and 1-dimensional periodically specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Radhakrishnan, V.

    1995-08-23

    We study the complexity of various combinatorial and satisfiability problems when instances are specified using one of the following specifications: (1) the 1-dimensional finite periodic narrow specifications of Wanke and Ford et al.; (2) the 1-dimensional finite periodic narrow specifications with explicit boundary conditions of Gale; (3) the 2-way infinite 1-dimensional narrow periodic specifications of Orlin et al.; and (4) the hierarchical specifications of Lengauer et al. We obtain three general types of results. First, we prove that there is a polynomial time algorithm that, given a 1-FPN- or 1-FPN(BC)-specification of a graph (or a CNF formula), constructs a level-restricted L-specification of an isomorphic graph (or formula). This theorem, along with the hardness results proved here, provides alternative and unified proofs of many hardness results proved in the past either by Lengauer and Wagner or by Orlin. Second, we study the complexity of generalized CNF satisfiability problems of Schaefer. Assuming P {ne} PSPACE, we characterize completely the polynomial time solvability of these problems when instances are specified as in (1), (2), (3) or (4). As applications of our first two types of results, we obtain a number of new PSPACE-hardness results and polynomial time algorithms for problems specified as in (1), (2), (3) or (4). Many of our results also hold for O(log N) bandwidth bounded planar instances.

  8. COMPLEXITY & APPROXIMABILITY OF QUANTIFIED & STOCHASTIC CONSTRAINT SATISFACTION PROBLEMS

    SciTech Connect

    Hunt, H. B.; Marathe, M. V.; Stearns, R. E.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend the earlier results in [Pa85,LMP99,CF+93,CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93,CF+94,Cr95,KSW97].

  9. Analyzing complex functional brain networks: Fusing statistics and network science to understand the brain*†

    PubMed Central

    Simpson, Sean L.; Bowman, F. DuBois; Laurienti, Paul J.

    2014-01-01

    Complex functional brain network analyses have exploded over the last decade, gaining traction due to their profound clinical implications. The application of network science (an interdisciplinary offshoot of graph theory) has facilitated these analyses and enabled examining the brain as an integrated system that produces complex behaviors. While the field of statistics has been integral in advancing activation analyses and some connectivity analyses in functional neuroimaging research, it has yet to play a commensurate role in complex network analyses. Fusing novel statistical methods with network-based functional neuroimage analysis will engender powerful analytical tools that will aid in our understanding of normal brain function as well as alterations due to various brain disorders. Here we survey widely used statistical and network science tools for analyzing fMRI network data and discuss the challenges faced in filling some of the remaining methodological gaps. When applied and interpreted correctly, the fusion of network scientific and statistical methods has a chance to revolutionize the understanding of brain function. PMID:25309643

  10. TOPAZ - the transient one-dimensional pipe flow analyzer: code validation and sample problems

    SciTech Connect

    Winters, W.S.

    1985-10-01

    TOPAZ is a "user friendly" computer code for modeling the one-dimensional, transient physics of multi-species gas transfer in arbitrary arrangements of pipes, valves, vessels, and flow branches. This document presents a series of sample problems designed to aid potential users in creating TOPAZ input files. To the extent possible, sample problems were selected for which analytical solutions currently exist. TOPAZ comparisons with such solutions are intended to provide a measure of code validation.

  11. Employing the Hilbert-Huang Transform to analyze observed natural complex signals: Calm wind meandering cases

    NASA Astrophysics Data System (ADS)

    Martins, Luis Gustavo Nogueira; Stefanello, Michel Baptistella; Degrazia, Gervásio Annes; Acevedo, Otávio Costa; Puhales, Franciano Scremin; Demarco, Giuliano; Mortarini, Luca; Anfossi, Domenico; Roberti, Débora Regina; Denardin, Felipe Costa; Maldaner, Silvana

    2016-11-01

    In this study we analyze natural complex signals employing Hilbert-Huang spectral analysis. Specifically, low-wind meandering meteorological data are decomposed into turbulent and non-turbulent components. These non-turbulent movements, responsible for the absence of a preferential direction of the horizontal wind, provoke negative lobes in the meandering autocorrelation functions. The meandering characteristic time scales (meandering periods) are determined from the spectral peak provided by the Hilbert-Huang marginal spectrum. The magnitudes of the temperature and horizontal wind meandering periods obtained agree with the results found from the best fit of the heuristic meandering autocorrelation functions. Therefore, the method provides a new procedure for evaluating meandering periods that does not employ mathematical expressions to represent observed meandering autocorrelation functions.
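
    The heuristic autocorrelation form referred to here is often taken as R(tau) = exp(-p*tau) * cos(q*tau), with the meandering period following as T* = 2*pi/q. The sketch below illustrates the best-fit route on synthetic data (all values illustrative); the paper's point is that the Hilbert-Huang marginal spectrum recovers comparable periods without assuming this functional form.

        import numpy as np
        from scipy.optimize import curve_fit

        def meander_acf(tau, p, q):
            # Heuristic low-wind meandering autocorrelation function.
            return np.exp(-p * tau) * np.cos(q * tau)

        tau = np.arange(0.0, 3000.0, 10.0)             # lag [s]
        true_p, true_q = 1/900.0, 2*np.pi/1200.0       # illustrative values
        rng = np.random.default_rng(2)
        r_obs = meander_acf(tau, true_p, true_q) + rng.normal(0, 0.02, tau.size)

        (p_fit, q_fit), _ = curve_fit(meander_acf, tau, r_obs, p0=(1e-3, 5e-3))
        print("meandering period T* =", 2*np.pi/q_fit, "s")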

  12. Complex Langevin: etiology and diagnostics of its main problem

    NASA Astrophysics Data System (ADS)

    Aarts, Gert; James, Frank A.; Seiler, Erhard; Stamatescu, Ion-Olimpiu

    2011-10-01

    The complex Langevin method is a leading candidate for solving the so-called sign problem occurring in various physical situations. Its most vexing problem is that it sometimes produces 'convergence to the wrong limit'. In this paper we carefully revisit the formal justification of the method, identify points at which it may fail, and derive a necessary and sufficient criterion for correctness. This criterion is, however, not practical, since its application requires checking an infinite tower of identities. We propose instead a practical test involving only a check of the first few of those identities; this raises the question of the 'sensitivity' of the test. This sensitivity, as well as the general insights into the possible reasons for failure (the etiology), are then tested in two toy models where the correct answer is known. At least in those models the test works perfectly.

  13. Coordinating complex problem-solving among distributed intelligent agents

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1992-01-01

    A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.

  14. The Deadlock Recovery Problem in the AGV System with the Ladder Guidepath Layout and its Computational Complexity

    NASA Astrophysics Data System (ADS)

    Koizumi, Kenji; Masuyama, Shigeru

    This paper proposes the minimum-time deadlock recovery problem in the AGV system with the ladder guidepath layout (DRPL for short) and analyzes its computational complexity. To analyze the computational complexity, this paper introduces the decision problem version of DRPL, which asks whether all deadlocks in the AGV system are recoverable within a predetermined time, and proves NP-hardness for a special case of the problem. Moreover, the conditions under which the problem becomes NP-hard when the AGV system has a ladder guidepath layout are clarified, and we propose a polynomial time algorithm that solves the optimization problem version whenever the ladder-layout problem is not NP-hard.

  15. Revisit an old problem -- Complexation between DNA and PEI

    NASA Astrophysics Data System (ADS)

    Wu, Chi

    2009-03-01

    After revisiting the captioned problem using a combination of chemical synthesis and physical methods, we studied the dynamics of the complexation between branched polyethyleneimine (bPEI) and plasmid DNA (pDNA) and characterized the structure, size and surface charge of the resultant DNA/PEI complexes (polyplexes). As expected, in order to reach a high efficiency in gene transfection into cells it is necessary to use a higher N:P ratio and make the polyplexes positively charged. Our results reveal that it is the uncomplexed bPEI chains free in the solution mixture that play a vitally important role in enhancing the transfection efficiency, inspiring new thinking about how to correlate in vitro and in vivo studies so that we can improve the in vivo transfection efficiency. Increasing the N:P ratio normally results in a higher cytotoxicity, which is a catch-22 problem. Recently, we found that a proper modification of bPEI can greatly reduce its cytotoxicity without any loss of transfection efficiency. In this lecture, we will show that our properly modified bPEI is much more effective and less cytotoxic in gene transfection than commercially available lipoplexes. Our recent breakthrough leads to a completely new direction in the development of non-viral vectors for molecular medicines, including gene transfection.

  16. Analyzing Energy and Resource Problems: An Interdisciplinary Approach to Mathematical Modeling.

    ERIC Educational Resources Information Center

    Fishman, Joseph

    1993-01-01

    Suggests ways in which mathematical models can be presented and developed in the classroom to promote discussion, analysis, and understanding of issues related to energy consumption. Five problems deal with past trends and future projections of availability of a nonrenewable resource, natural gas. (Contains 13 references.) (MDH)

  17. Case Studies in Critical Ecoliteracy: A Curriculum for Analyzing the Social Foundations of Environmental Problems

    ERIC Educational Resources Information Center

    Turner, Rita; Donnelly, Ryan

    2013-01-01

    This article outlines the features and application of a set of model curriculum materials that utilize eco-democratic principles and humanities-based content to cultivate critical analysis of the cultural foundations of socio-environmental problems. We first describe the goals and components of the materials, then discuss results of their use in…

  18. Analyzing and Attempting to Overcome Prospective Teachers' Difficulties during Problem-Solving Instruction

    ERIC Educational Resources Information Center

    Karp, Alexander

    2010-01-01

    This article analyzes the experiences of prospective secondary mathematics teachers during a teaching methods course, offered prior to their student teaching, but involving actual teaching and reflexive analysis of this teaching. The study focuses on the pedagogical difficulties that arose during their teaching, in which prospective teachers…

  19. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of the ATS's complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident, since it exemplifies the authority problems that arise when conflicting advice is received from human and automated systems.

  20. Leveraging Cultural Resources through Teacher Pedagogical Reasoning: Elementary Grade Teachers Analyze Second Language Learners' Science Problem Solving

    ERIC Educational Resources Information Center

    Buxton, Cory A.; Salinas, Alejandra; Mahotiere, Margarette; Lee, Okhee; Secada, Walter G.

    2013-01-01

    Grounded in teacher professional development addressing the intersection of student diversity and content area instruction, this study examined school teachers' pedagogical reasoning complexity as they reflected on their second language learners' science problem solving abilities using both home and school contexts. Teachers responded to interview…

  1. Quantum trajectories in complex space: one-dimensional stationary scattering problems.

    PubMed

    Chou, Chia-Chun; Wyatt, Robert E

    2008-04-21

    One-dimensional time-independent scattering problems are investigated in the framework of the quantum Hamilton-Jacobi formalism. The equation for the local approximate quantum trajectories near the stagnation point of the quantum momentum function is derived, and the first derivative of the quantum momentum function is related to the local structure of quantum trajectories. Exact complex quantum trajectories are determined for two examples by numerically integrating the equations of motion. For the soft potential step, some particles penetrate into the nonclassical region, and then turn back to the reflection region. For the barrier scattering problem, quantum trajectories may spiral into the attractors or from the repellers in the barrier region. Although the classical potentials extended to complex space show different pole structures for each problem, the quantum potentials present the same second-order pole structure in the reflection region. This paper not only analyzes complex quantum trajectories and the total potentials for these examples but also demonstrates general properties and similar structures of the complex quantum trajectories and the quantum potentials for one-dimensional time-independent scattering problems. PMID:18433189

  2. Understanding the determinants of problem-solving behavior in a complex environment

    NASA Technical Reports Server (NTRS)

    Casner, Stephen A.

    1994-01-01

    It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how the complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates of the probable effects of each environmental feature.

  3. Human opinion dynamics: an inspiration to solve complex optimization problems.

    PubMed

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P; Kapur, Pawan

    2013-01-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of formations and dynamics of opinions has been one of the most important areas in social physics. The opinion dynamics and associated social structure lead to decision making, or so-called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence with the disintegrative effects of individualisation, and therefore could be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired from bird flocking behaviour and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding the collective decision making. PMID:24141795

  4. Human opinion dynamics: An inspiration to solve complex optimization problems

    PubMed Central

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P.; Kapur, Pawan

    2013-01-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of formations and dynamics of opinions has been one of the most important areas in social physics. The opinion dynamics and associated social structure lead to decision making, or so-called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence with the disintegrative effects of individualisation, and therefore could be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired from bird flocking behaviour and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding the collective decision making. PMID:24141795

  5. Human opinion dynamics: An inspiration to solve complex optimization problems

    NASA Astrophysics Data System (ADS)

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P.; Kapur, Pawan

    2013-10-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of formations and dynamics of opinions has been one of the most important areas in social physics. The opinion dynamics and associated social structure lead to decision making, or so-called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence with the disintegrative effects of individualisation, and therefore could be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired from bird flocking behaviour and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding the collective decision making.

  6. Determining electron temperature for small spherical probes from network analyzer measurements of complex impedance

    SciTech Connect

    Walker, D. N.; Fernsler, R. F.; Blackwell, D. D.; Amatucci, W. E.

    2008-12-15

    In earlier work, using a network analyzer, it was shown that collisionless resistance (CR) exists in the sheath of a spherical probe when driven by a small rf signal. The CR is inversely proportional to the plasma density gradient at the location where the applied angular frequency equals the plasma frequency {omega}{sub pe}. Recently, efforts have concentrated on a study of the low-to-intermediate frequency response of the probe to the rf signal. At sufficiently low frequencies, the CR is beyond cutoff, i.e., below the plasma frequency at the surface of the probe. Since the electron density at the probe surface decreases as a function of applied (negative) bias, the CR will extend to lower frequencies as the magnitude of negative bias increases. Therefore to eliminate both CR and ion current contributions, the frequencies presently being considered are much greater than the ion plasma frequency, {omega}{sub pi}, but less than the plasma frequency, {omega}{sub pe}(r{sub 0}), where r{sub 0} is the probe radius. It is shown that, in this frequency regime, the complex impedance measurements made with a network analyzer can be used to determine electron temperature. An overview of the theory is presented along with comparisons to data sets made using three stainless steel spherical probes of different sizes in different experimental environments and different plasma parameter regimes. The temperature measurements made by this method are compared to those made by conventional Langmuir probe sweeps; the method shown here requires no curve fitting as is the usual procedure with Langmuir probes when a Maxwell-Boltzmann electron distribution is assumed. The new method requires, however, a solution of the Poisson equation to determine the approximate sheath dimensions and integrals to determine approximate plasma and sheath inductances. The solution relies on the calculation of impedance for a spherical probe immersed in a collisionless plasma and is based on a simple

  7. Calculation of a nonlinear eigenvalue problem based on the MMP method for analyzing photonic crystals

    NASA Astrophysics Data System (ADS)

    Jalali, Tahmineh

    2014-12-01

    The multiple multipoles (MMP) method is used to solve a nonlinear eigenvalue problem for the analysis of 2D metallic and dielectric photonic crystals. The simulation space is implemented in the first Brillouin zone, in order to obtain the band structure and modal fields, and in the supercell, to calculate waveguide modes. The Bloch theorem is used to implement fictitious periodic boundary conditions for the first Brillouin zone and the supercell. This method successfully computes the transmission and reflection coefficients of a photonic crystal waveguide without significant error from truncation of the computational space. To validate our code, the band structure of a cubic lattice is simulated and the results are compared with those of the plane wave expansion method. The proposed method is shown to be applicable to photonic crystals of irregular shape, to frequency-dependent (or frequency-independent) materials such as dielectric or dispersive materials, and to experimental data for different lattice structures. Numerical calculations show that the MMP method is stable, accurate and fast, and can be used on personal computers.

  8. Strategies in Forecasting Outcomes in Ethical Decision-making: Identifying and Analyzing the Causes of the Problem

    PubMed Central

    Beeler, Cheryl K.; Antes, Alison L.; Wang, Xiaoqian; Caughron, Jared J.; Thiel, Chase E.; Mumford, Michael D.

    2010-01-01

    This study examined the role of key causal analysis strategies in forecasting and ethical decision-making. Undergraduate participants took on the role of the key actor in several ethical problems and were asked to identify and analyze the causes, forecast potential outcomes, and make a decision about each problem. Time pressure and analytic mindset were manipulated while participants worked through these problems. The results indicated that forecast quality was associated with decision ethicality, and the identification of the critical causes of the problem was associated with both higher quality forecasts and higher ethicality of decisions. Neither time pressure nor analytic mindset impacted forecasts or ethicality of decisions. Theoretical and practical implications of these findings are discussed. PMID:20352056

  9. A formulation to analyze system-of-systems problems: A case study of airport metroplex operations

    NASA Astrophysics Data System (ADS)

    Ayyalasomayajula, Sricharan Kishore

    A system-of-systems (SoS) can be described as a collection of multiple, heterogeneous, distributed, independent components interacting to achieve a range of objectives. A generic formulation was developed to model component interactions in an SoS to understand their influence on overall SoS performance. The formulation employs a lexicon to aggregate components into hierarchical interaction networks and understand how their topological properties affect the performance of the aggregations. Overall SoS performance is evaluated by monitoring the changes in stakeholder profitability due to changes in component interactions. The formulation was applied to a case study in air transportation focusing on operations at airport metroplexes. Metroplexes are geographical regions with two or more airports in close proximity to one another. The case study explored how metroplex airports interact with one another, what dependencies drive these interactions, and how these dependencies affect metroplex throughput and capacity. Metrics were developed to quantify runway dependencies at a metroplex and were correlated with its throughput and capacity. Operations at the New York/New Jersey metroplex (NYNJ) airports were simulated to explore the feasibility of operating very large aircraft (VLA), such as the Airbus A380, as a delay-mitigation strategy at these airports. The proposed formulation was employed to analyze the impact of this strategy on different stakeholders in the national air transportation system (ATS), such as airlines and airports. The analysis results and their implications were used to compare the pros and cons of operating VLAs at NYNJ from the perspectives of airline profitability, and flight delays at NYNJ and across the ATS.

  10. Complex Problem Exercises in Developing Engineering Students' Conceptual and Procedural Knowledge of Electromagnetics

    ERIC Educational Resources Information Center

    Leppavirta, J.; Kettunen, H.; Sihvola, A.

    2011-01-01

    Complex multistep problem exercises are one way to enhance engineering students' learning of electromagnetics (EM). This study investigates whether exposure to complex problem exercises during an introductory EM course improves students' conceptual and procedural knowledge. The performance in complex problem exercises is compared to prior success…

  11. Effects of friction and heat conduction on sound propagation in ducts. [analyzing complex aerodynamic noise problems

    NASA Technical Reports Server (NTRS)

    Huerre, P.; Karamcheti, K.

    1976-01-01

    The theory of sound propagation is examined in a viscous, heat-conducting fluid, initially at rest and in a uniform state, and contained in a rigid, impermeable duct with isothermal walls. Topics covered include: (1) theoretical formulation of the small amplitude fluctuating motions of a viscous, heat-conducting and compressible fluid; (2) sound propagation in a two dimensional duct; and (3) perturbation study of the inplane modes.

  12. Deep graphs-A general framework to represent and analyze heterogeneous complex systems across scales.

    PubMed

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of
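
    A minimal sketch of the central partitioning idea, using toy data: nodes carry property dicts, a chosen property induces a partition into supernodes, and edge weights are aggregated into superedges. The property names and the sum aggregator are illustrative choices, not the paper's interface.

        from collections import defaultdict

        nodes = {1: {"type": "A"}, 2: {"type": "A"}, 3: {"type": "B"}, 4: {"type": "B"}}
        edges = {(1, 3): 2.0, (2, 3): 1.0, (1, 2): 0.5, (3, 4): 4.0}

        def partition_graph(nodes, edges, key):
            """Group nodes by a property into supernodes; sum edge weights into superedges."""
            supernodes = defaultdict(list)
            for n, props in nodes.items():
                supernodes[props[key]].append(n)
            membership = {n: props[key] for n, props in nodes.items()}
            superedges = defaultdict(float)
            for (u, v), w in edges.items():
                # Intra-group edges become self-loops of the corresponding supernode.
                superedges[(membership[u], membership[v])] += w
            return dict(supernodes), dict(superedges)

        print(partition_graph(nodes, edges, "type"))
        # ({'A': [1, 2], 'B': [3, 4]}, {('A', 'B'): 3.0, ('A', 'A'): 0.5, ('B', 'B'): 4.0})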

  13. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    NASA Astrophysics Data System (ADS)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of

  14. Making mobility-related disability better: a complex response to a complex problem.

    PubMed

    Rockwood, Kenneth

    2012-01-01

    Mobility disability in older adults can arise from single system problems, such as discrete musculoskeletal injury. In frail older adults, however, mobility disability is part of a complex web of problems. The approach to their rehabilitation must take that complexity into account, as is reported by Fairhall et al. First, their overall health state must be assessed, which is achieved by a comprehensive geriatric assessment. The assessment can show how a particular patient came to be disabled, so that an individualized care plan can be worked out. Whether this approach works in general can be evaluated by looking at group differences in mean mobility test scores. Knowing whether it has worked in the individual patient requires an individualized measure. This is because not every patient starts from the same point, and not every patient achieves success by aiming for the same goal. For one patient, walking unassisted for three metres would be a triumph; for another it would be a tragedy. Unless we understand the complexity of the needs of frail older adults, we will neither be able to treat them effectively nor evaluate our efforts sensibly.Please see related article http://www.biomedcentral.com/1741-7015/10/120.

  15. Inverse Problems in Complex Models and Applications to Earth Sciences

    NASA Astrophysics Data System (ADS)

    Bosch, M. E.

    2015-12-01

    The inference of the subsurface earth structure and properties requires the integration of different types of data, information and knowledge, by combined processes of analysis and synthesis. To support the process of integrating information, the regular concept of data inversion is evolving to expand its application to models with multiple inner components (properties, scales, structural parameters) that explain multiple data (geophysical survey data, well-logs, core data). Probabilistic inference methods provide the natural framework for the formulation of these problems, considering a posterior probability density function (PDF) that combines the information from a prior PDF and the new sets of observations. To formulate the posterior PDF in the context of multiple datasets, the data likelihood functions are factorized, assuming independence of uncertainties for data originating across different surveys. A realistic description of the earth medium requires modeling several properties and structural parameters, which relate to each other according to dependency and independency notions. Thus, conditional probabilities across model components also factorize. A common setting proceeds by structuring the model parameter space in hierarchical layers. A primary layer (e.g. lithology) conditions a secondary layer (e.g. physical medium properties), which conditions a third layer (e.g. geophysical data). In general, less structured relations within model components and data emerge from the analysis of other inverse problems. They can be described with flexibility via directed acyclic graphs, which map dependency relations between the model components. Examples of inverse problems in complex models can be shown at various scales. At the local scale, for example, the distribution of gas saturation is inferred from pre-stack seismic data and a calibrated rock-physics model. At the regional scale, joint inversion of gravity and magnetic data is applied
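
    The two factorizations described above can be written compactly. The following LaTeX rendering paraphrases the abstract with generic symbols (m for the model, d_k for the k-th survey's data, and \ell, \theta, d for the lithology/properties/data layers); it is not a formula quoted from the paper.

        \sigma(m) \;\propto\; \rho(m)\,\prod_{k} L_k(d_k \mid m)
        \qquad\text{(independent survey likelihoods)}

        p(\ell, \theta, d) \;=\; p(\ell)\, p(\theta \mid \ell)\, p(d \mid \theta)
        \qquad\text{(hierarchical layering)}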

  16. Aiming at strategies for a complex problem of medical nonadherence.

    PubMed

    Castellano, Jose M; Copeland-Halperin, Robert; Fuster, Valentin

    2013-09-01

    The deteriorating health of the population and the increasing prevalence of chronic diseases are global problems whose causes are multifactorial and complex. The Western lifestyle does not promote healthy living, and the consequences are most devastating when social inequalities, together with the economic and population explosion of recent decades, are considered. The spread of poor nutritional habits, obesity, sedentarism, and hypertension is increasingly contributing to the development of a cardiovascular disease epidemic. Recent data on the rates of compliance with lifestyle modification and adherence to prescribed medication are alarming. On average, over 50% of patients abandon their prescribed treatment, and goals for improving habits (quitting smoking, losing weight, or engaging in physical activity) are met at an equal or lower rate. Beyond its impact on individual health, nonadherence carries a huge economic cost, as it is associated with failure to achieve therapeutic goals and higher rates of hospitalization and death. Improving communication between doctors and patients, the active involvement of other health professionals, and the development of combination drug formulations (polypill) are potential strategies for improving adherence and reducing costs.

  17. Postoperative nausea and vomiting: A simple yet complex problem

    PubMed Central

    Shaikh, Safiya Imtiaz; Nagarekha, D.; Hegade, Ganapati; Marutheesh, M.

    2016-01-01

    Postoperative nausea and vomiting (PONV) is one of the complex and significant problems in anesthesia practice, with the growing trend toward ambulatory and day care surgeries. This review focuses on pathophysiology, pharmacological prophylaxis, and rescue therapy for PONV. We searched the Medline and PubMed databases for articles published in English from 1991 to 2014 while writing this review, using "postoperative nausea and vomiting, PONV, nausea-vomiting, PONV prophylaxis, and rescue" as keywords. PONV is influenced by multiple factors related to the patient, the surgery, and pre-, intra-, and post-operative anesthesia factors. The risk of PONV can be assessed using a scoring system such as the Apfel simplified scoring system, which is based on four independent risk predictors. PONV prophylaxis is administered to patients with medium and high risk based on this scoring system. Newer drugs such as the neurokinin-1 receptor antagonist aprepitant are used along with serotonin (5-hydroxytryptamine subtype 3) receptor antagonists, corticosteroids, anticholinergics, antihistaminics, and butyrophenones for PONV prophylaxis. Combinations of drugs from different classes with different mechanisms of action are administered for optimized efficacy in adults with moderate risk for PONV. A multimodal approach combining pharmacological and nonpharmacological prophylaxis along with interventions that reduce baseline risk is employed in patients with high PONV risk. PMID:27746521

  18. Eye-Tracking Study of Complexity in Gas Law Problems

    ERIC Educational Resources Information Center

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  19. The Influence of Prior Experience and Process Utilization in Solving Complex Problems.

    ERIC Educational Resources Information Center

    Sterner, Paula; Wedman, John

    By using ill-structured problems and examining problem-solving processes, this study was conducted to explore the nature of solving complex, multistep problems, focusing on how prior knowledge, problem-solving process utilization, and analogical problem solving are related to success. Twenty-four college students qualified to participate by…

  20. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
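
    The data-analysis step described above (the average tendency to fixate the target picture as a function of time, and the decision moment derived from it) can be prototyped in a few lines. The Python sketch below uses synthetic gaze samples and a simple threshold rule; the threshold value and bin width are assumptions standing in for the paper's actual criterion.

        import numpy as np

        def target_fixation_curve(fix_target, t, bins):
            """Proportion of trials fixating the target per time bin.

            fix_target: (trials, samples) boolean array, True when gaze is on target
            t:          (samples,) sample time stamps in seconds
            bins:       bin edges in seconds
            """
            idx = np.digitize(t, bins) - 1
            return np.array([fix_target[:, idx == b].mean() for b in range(len(bins) - 1)])

        def decision_moment(curve, bins, threshold=0.75):
            """First bin whose fixation proportion exceeds the (assumed) threshold."""
            above = np.where(curve >= threshold)[0]
            return bins[above[0]] if above.size else None

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 4.0, 200)                          # 4 s sentence, 50 Hz sampling
        fix = rng.random((30, 200)) < np.clip(t / 4, 0.1, 0.9)  # toy drift toward the target
        bins = np.linspace(0.0, 4.0, 9)
        curve = target_fixation_curve(fix, t, bins)
        print(curve, decision_moment(curve, bins))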

  1. Using New Models to Analyze Complex Regularities of the World: Commentary on Musso et al. (2013)

    ERIC Educational Resources Information Center

    Nokelainen, Petri; Silander, Tomi

    2014-01-01

    This commentary on the recent article by Musso et al. (2013) discusses issues related to model fitting, comparison of the classification accuracy of generative and discriminative models, and the two (or more) cultures of data modeling. We start by questioning the extremely high classification accuracy with empirical data from a complex domain. There is…

  2. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  3. Complex Problem Solving in Educational Contexts--Something beyond "g": Concept, Assessment, Measurement Invariance, and Construct Validity

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Molnar, Gyongyver; Fischer, Andreas; Funke, Joachim; Csapo, Beno

    2013-01-01

    Innovative assessments of cross-curricular competencies such as complex problem solving (CPS) have currently received considerable attention in large-scale educational studies. This study investigated the nature of CPS by applying a state-of-the-art approach to assess CPS in high school. We analyzed whether two processes derived from cognitive…

  4. An Activation Force-based Affinity Measure for Analyzing Complex Networks

    PubMed Central

    Guo, Jun; Guo, Hanliang; Wang, Zhanyi

    2011-01-01

    Affinity measure is a key factor that determines the quality of the analysis of a complex network. Here, we introduce a type of statistics, activation forces, to weight the links of a complex network and thereby develop a desired affinity measure. We show that the approach is superior in facilitating the analysis through experiments on a large-scale word network and a protein-protein interaction (PPI) network consisting of ∼5,000 human proteins. The experiment on the word network verifies that the measured word affinities are highly consistent with human knowledge. Further, the experiment on the PPI network verifies the measure and presents a general method for the identification of functionally similar proteins based on PPIs. Most strikingly, we find an affinity network that compactly connects the cancer-associated proteins to each other, which may reveal novel information for cancer study; this includes likely protein interactions and key proteins in cancer-related signal transduction pathways. PMID:22355630

  5. Shear wave splitting beneath the Bighorn Mountains, Wyoming: Analyzing the need for models of complex anisotropy

    NASA Astrophysics Data System (ADS)

    Solomon, M. A.; Schutt, D.

    2010-12-01

    In this study, we examine complexity in upper mantle anisotropy observed at the high-density Bighorns Mountains broadband array in north-central Wyoming. Preliminary results (Anderson et al., pers. com.) using standard methods show largely variable fast-axis orientations and consistent but unusually small delay times (avg. 0.7 s). At the nearby Billings Array, emplaced just to the north of the Bighorns from 1999 to 2001, two layers of anisotropy were found, with the bottom layer striking parallel to but dipping opposite to what passive plate shear of the asthenosphere would predict, and an upper layer consistent with LPO accretion associated with drift of the North American plate during the Mesozoic [Yuan et al., 2008]. The ongoing Bighorns and past Billings measurements suggest that complex anisotropy exists throughout the region. To characterize this anisotropy, we are forward modeling possible multiple-layer structure and comparing observed SKS to predicted SKS, using the Neighborhood Algorithm [Sambridge, 1999] to guide the search through model space and the cross-convolution method to measure goodness of fit [Menke and Levin, 2003]. This combination of methods provides for statistical examination of the fit of various complex models, and proves more effective than fitting back-azimuthal variations of splitting times [Yuan et al., 2008].

  6. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we…

  7. A note on the Dirichlet problem for model complex partial differential equations

    NASA Astrophysics Data System (ADS)

    Ashyralyev, Allaberen; Karaca, Bahriye

    2016-08-01

    Complex model partial differential equations of arbitrary order are considered. The uniqueness of the Dirichlet problem is studied. It is proved that the Dirichlet problem for higher-order complex partial differential equations in one complex variable has infinitely many solutions.

  8. A complexity analysis of space-bounded learning algorithms for the constraint satisfaction problem

    SciTech Connect

    Bayardo, R.J. Jr.; Miranker, D.P.

    1996-12-31

    Learning during backtrack search is a space-intensive process that records information (such as additional constraints) in order to avoid redundant work. In this paper, we analyze the effects of polynomial-space-bounded learning on runtime complexity of backtrack search. One space-bounded learning scheme records only those constraints with limited size, and another records arbitrarily large constraints but deletes those that become irrelevant to the portion of the search space being explored. We find that relevance-bounded learning allows better runtime bounds than size-bounded learning on structurally restricted constraint satisfaction problems. Even when restricted to linear space, our relevance-bounded learning algorithm has runtime complexity near that of unrestricted (exponential space-consuming) learning schemes.
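
    The two space-bounding policies analyzed above can be made concrete with a small sketch. The following Python nogood store is illustrative only (the dict-based representation, the relevance test, and the bound are simplified stand-ins for the paper's schemes): size-bounded learning keeps only constraints of limited size, while relevance-bounded learning keeps larger ones but deletes those that drift too far from the current partial assignment.

        class NogoodStore:
            def __init__(self, policy="size", bound=2):
                self.policy, self.bound, self.nogoods = policy, bound, []

            def learn(self, nogood):
                # nogood: {variable: forbidden value} combination derived from a dead end
                if self.policy == "size" and len(nogood) > self.bound:
                    return                      # size-bounded: discard long constraints
                self.nogoods.append(nogood)

            def prune_irrelevant(self, assignment):
                # Relevance-bounded: drop nogoods differing from the current partial
                # assignment in more than `bound` variable-value pairs.
                if self.policy != "relevance":
                    return
                def diff(ng):
                    return sum(1 for v, val in ng.items() if assignment.get(v) != val)
                self.nogoods = [ng for ng in self.nogoods if diff(ng) <= self.bound]

        store = NogoodStore(policy="relevance", bound=1)
        store.learn({"x": 1, "y": 2})
        store.prune_irrelevant({"x": 1})        # differs only on y, so it is retained
        print(store.nogoods)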

  9. Technologically Mediated Complex Problem-Solving on a Statistics Task

    ERIC Educational Resources Information Center

    Scanlon, Eileen; Blake, Canan; Joiner, Richard; O'Shea, Tim

    2005-01-01

    Simulations on computers can allow many experiments to be conducted quickly to help students develop an understanding of statistical topics. We used a simulation of a challenging problem in statistics as the focus of an exploration of situations where members of a problem-solving group are physically separated then reconnected via combinations of…

  10. Analyzing complex wake-terrain interactions and their implications for wind-farm performance.

    NASA Astrophysics Data System (ADS)

    Tabib, Mandar; Rasheed, Adil; Fuchs, Franz

    2016-09-01

    Rotating wind turbine blades generate complex wakes involving vortices (helical tip vortex, root vortex, etc.). These wakes are regions of high velocity deficit and high turbulence intensity, and they tend to degrade the performance of downstream turbines. Hence, a conservative inter-turbine distance of up to 10 turbine diameters (10D) is sometimes used in wind-farm layout (particularly on flat terrain). This ensures that wake effects will not reduce the overall wind-farm performance, but it leads to a larger land footprint for establishing a wind farm. In the case of complex terrain, within a short distance (say 10D), the nearby terrain can rise in altitude and be high enough to influence the wake dynamics. This wake-terrain interaction can happen either (a) indirectly, through an interaction of the wake (both the near tip vortex and the far-wake large-scale vortex) with terrain-induced turbulence (especially the smaller eddies generated by small ridges within the terrain), or (b) directly, by obstructing the wake region partially or fully in its flow path. Hence, an enhanced understanding of wake development due to wake-terrain interaction will help in wind-farm design. To this end, the current study involves: (1) understanding the numerics for successful simulation of vortices, (2) understanding the fundamental vortex-terrain interaction mechanism through studies devoted to the interaction of a single vortex with different terrains, and (3) relating the influence of vortex-terrain interactions to the performance of a wind farm by studying a multi-turbine wind-farm layout under different terrains. The results on the interaction of terrain and vortex show a much faster decay of the vortex for complex terrain compared to flatter terrain. The potential reasons identified to explain this observation are (a) the formation of secondary vortices in the flow and their interaction with the primary vortex and (b) enhanced vorticity diffusion due to increased terrain-induced turbulence. The implications of

  11. The Complexity Status of Problems Related to Sparsest Cuts

    NASA Astrophysics Data System (ADS)

    Bonsma, Paul; Broersma, Hajo; Patel, Viresh; Pyatkin, Artem

    Given an undirected graph G = (V, E) with a capacity function w : E → Z⁺ on the edges, the sparsest cut problem is to find a vertex subset S ⊂ V minimizing ∑_{e ∈ E(S, V∖S)} w(e) / (|S| |V∖S|). This problem is NP-hard; the proof can be found in [16]. In the case of unit capacities (i.e., if w(e) = 1 for every e ∈ E) the problem is to minimize |E(S, V∖S)| / (|S| |V∖S|) over all subsets S ⊂ V. While this variant of the sparsest cut problem is often assumed to be NP-hard, this note contains the first proof of this fact. We also prove that the problem is polynomially solvable for graphs of bounded treewidth.
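
    For intuition, the unit-capacity objective can be evaluated directly by exhaustive search; the NP-hardness result above means nothing substantially better than such exponential enumeration is expected in general. A minimal Python sketch (the toy graph is chosen arbitrarily):

        from itertools import combinations

        def sparsest_cut(n, edges):
            """Exhaustive sparsest cut with unit capacities on vertices 0..n-1
            (exponential time; only sensible for tiny graphs)."""
            best, best_S = float("inf"), None
            for k in range(1, n):                  # proper nonempty subsets S
                for S in combinations(range(n), k):
                    S = set(S)
                    cut = sum(1 for u, v in edges if (u in S) != (v in S))
                    sparsity = cut / (len(S) * (n - len(S)))
                    if sparsity < best:
                        best, best_S = sparsity, S
            return best, best_S

        edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # toy graph
        print(sparsest_cut(4, edges))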

  12. Nuclear three-body problem in the complex energy plane: Complex-scaling Slater method

    NASA Astrophysics Data System (ADS)

    Kruppa, A. T.; Papadimitriou, G.; Nazarewicz, W.; Michel, N.

    2014-01-01

    Background: The physics of open quantum systems is an interdisciplinary area of research. The nuclear "openness" manifests itself through the presence of the many-body continuum representing various decay, scattering, and reaction channels. As the radioactive nuclear beam experimentation extends the known nuclear landscape toward the particle drip lines, the coupling to the continuum space becomes exceedingly more important. Of particular interest are weakly bound and unbound nuclear states appearing around particle thresholds. Theories of such nuclei must take into account their open quantum nature. Purpose: To describe open quantum systems, we introduce a complex-scaling (CS) approach in the Slater basis. We benchmark it with the complex-energy Gamow shell model (GSM) by studying energies and wave functions of the bound and unbound states of the two-neutron halo nucleus 6He viewed as an α+n+n cluster system. Methods: Both CS and GSM approaches are applied to a translationally invariant Hamiltonian with the two-body interaction approximated by the finite-range central Minnesota force. In the CS approach, we use the Slater basis, which exhibits the correct asymptotic behavior at large distances. To extract particle densities from the back-rotated CS solutions, we apply the Tikhonov regularization procedure, which minimizes the ultraviolet numerical noise. Results: We show that the CS-Slater method is both accurate and efficient. Its equivalence to the GSM approach has been demonstrated numerically for both energies and wave functions of 6He. One important technical aspect of our calculation was to fully retrieve the correct asymptotic behavior of a resonance state from the complex-scaled (square-integrable) wave function. While standard applications of the inverse complex transformation to the complex-rotated solution provide unstable results, the stabilization method fully reproduces the GSM benchmark. We also propose a method to determine the smoothing

  13. Detrended Partial-Cross-Correlation Analysis: A New Method for Analyzing Correlations in Complex System

    PubMed Central

    Yuan, Naiming; Fu, Zuntao; Zhang, Huan; Piao, Lin; Xoplaki, Elena; Luterbacher, Juerg

    2015-01-01

    In this paper, a new method, detrended partial-cross-correlation analysis (DPCCA), is proposed. Based on detrended cross-correlation analysis (DCCA), the method is improved by including the partial-correlation technique, so that it can quantify the relations between two non-stationary signals (with the influences of other signals removed) on different time scales. We illustrate the advantages of this method by performing two numerical tests. Test I shows the advantages of DPCCA in handling non-stationary signals, while Test II reveals the “intrinsic” relations between two considered time series with the potential influences of other, unconsidered signals removed. To further show the utility of DPCCA in natural complex systems, we provide new evidence on the winter-time Pacific Decadal Oscillation (PDO) and the winter-time Nino3 Sea Surface Temperature Anomaly (Nino3-SSTA) affecting the Summer Rainfall over the middle-lower reaches of the Yangtze River (SRYR). By applying DPCCA, significant correlations between SRYR and Nino3-SSTA on time scales of 6-8 years are found over the period 1951-2012, while significant correlations between SRYR and PDO emerge on time scales of 35 years. With these physically explainable results, we are confident that DPCCA is a useful method for addressing complex systems. PMID:25634341
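
    A compact sketch of the construction: compute the DCCA cross-correlation coefficient at scale s for each signal pair, then remove the third signal's influence with the standard partial-correlation formula. The Python implementation below (non-overlapping boxes, linear detrending, three signals only) is a simplified illustration, not the authors' code.

        import numpy as np

        def dcca_coeff(x, y, s):
            """Detrended cross-correlation coefficient of x and y at time scale s."""
            X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
            t = np.arange(s)
            covs, varx, vary = [], [], []
            for b in range(len(X) // s):                            # non-overlapping boxes
                xs, ys = X[b*s:(b+1)*s], Y[b*s:(b+1)*s]
                rx = xs - np.polyval(np.polyfit(t, xs, 1), t)       # remove local linear trend
                ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
                covs.append((rx * ry).mean())
                varx.append((rx ** 2).mean())
                vary.append((ry ** 2).mean())
            return np.mean(covs) / np.sqrt(np.mean(varx) * np.mean(vary))

        def dpcca_coeff(x, y, z, s):
            """DCCA of x and y at scale s with the influence of z partialled out."""
            rxy, rxz, ryz = dcca_coeff(x, y, s), dcca_coeff(x, z, s), dcca_coeff(y, z, s)
            return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

        rng = np.random.default_rng(1)
        z = rng.standard_normal(1000)
        x = z + rng.standard_normal(1000)           # x and y are related only through z
        y = z + rng.standard_normal(1000)
        print(dcca_coeff(x, y, 50), dpcca_coeff(x, y, z, 50))   # partial value near zero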

  14. Teaching Problem Solving; the Effect of Algorithmic and Heuristic Problem Solving Training in Relation to Task Complexity and Relevant Aptitudes.

    ERIC Educational Resources Information Center

    de Leeuw, L.

    Sixty-four fifth- and sixth-grade pupils were taught number series extrapolation by either an algorithmic, fully prescribed problem-solving method or a heuristic, less prescribed method. The trained problems were within categories of two degrees of complexity. There were 16 subjects in each cell of the 2 by 2 design used. Aptitude Treatment…

  15. The Fallacy of Univariate Solutions to Complex Systems Problems

    PubMed Central

    Lessov-Schlaggar, Christina N.; Rubin, Joshua B.; Schlaggar, Bradley L.

    2016-01-01

    Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems—univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument. PMID:27375425

  16. The Fallacy of Univariate Solutions to Complex Systems Problems.

    PubMed

    Lessov-Schlaggar, Christina N; Rubin, Joshua B; Schlaggar, Bradley L

    2016-01-01

    Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems-univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument. PMID:27375425

  17. Complex Causal Process Diagrams for Analyzing the Health Impacts of Policy Interventions

    PubMed Central

    Joffe, Michael; Mindell, Jennifer

    2006-01-01

    Causal diagrams are rigorous tools for controlling confounding. They also can be used to describe complex causal systems, which is done routinely in communicable disease epidemiology. The use of change diagrams has advantages over static diagrams, because change diagrams are more tractable, relate better to interventions, and have clearer interpretations. Causal diagrams are a useful basis for modeling. They make assumptions explicit, provide a framework for analysis, generate testable predictions, explore the effects of interventions, and identify data gaps. Causal diagrams can be used to integrate different types of information and to facilitate communication both among public health experts and between public health experts and experts in other fields. Causal diagrams allow the use of instrumental variables, which can help control confounding and reverse causation. PMID:16449586

  18. Asbestos quantification in track ballast, a complex analytical problem

    NASA Astrophysics Data System (ADS)

    Cavallo, Alessandro

    2016-04-01

    Track ballast forms the trackbed upon which railroad ties are laid. It is used to bear the load from the railroad ties, to facilitate water drainage, and also to keep down vegetation. It is typically made of angular crushed stone, with a grain size between 30 and 60 mm, with good mechanical properties (high compressive strength, freeze-thaw resistance, resistance to fragmentation). The most common rock types are represented by basalts, porphyries, orthogneisses, some carbonatic rocks and "green stones" (serpentinites, prasinites, amphibolites, metagabbros). Especially "green stones" may contain traces, and sometimes appreciable amounts, of asbestiform minerals (chrysotile and/or fibrous amphiboles, generally tremolite-actinolite). In Italy, the chrysotile asbestos mine in Balangero (Turin) produced over 5 Mt of railroad ballast (crushed serpentinites), which was used for the railways in northern and central Italy from 1930 up to 1990. In addition to Balangero, several other serpentinite and prasinite quarries (e.g. in Emilia Romagna) provided railway ballast up to the year 2000. The legal threshold for asbestos content in track ballast is established at 1000 ppm: if the value is below this threshold, the material can be reused; otherwise it must be disposed of as hazardous waste, at very high cost. The quantitative determination of asbestos in rocks is a very complex analytical issue: although techniques like TEM-SAED and micro-Raman are very effective in the identification of asbestos minerals, a quantitative determination on bulk materials is almost impossible or extremely expensive and time-consuming. Another problem is represented by the discrimination of asbestiform minerals (e.g. chrysotile, asbestiform amphiboles) from the common acicular, pseudo-fibrous varieties (lamellar serpentine minerals, prismatic/acicular amphiboles). In this work, more than 200 samples from the main Italian rail yards were characterized by a combined use of XRD and a special SEM

  19. Solar optical codes evaluation for modeling and analyzing complex solar receiver geometries

    NASA Astrophysics Data System (ADS)

    Yellowhair, Julius; Ortega, Jesus D.; Christian, Joshua M.; Ho, Clifford K.

    2014-09-01

    Solar optical modeling tools are valuable for modeling and predicting the performance of solar technology systems. Four optical modeling tools were evaluated using the National Solar Thermal Test Facility heliostat field combined with a flat plate receiver geometry as a benchmark. The four optical modeling tools evaluated were DELSOL, HELIOS, SolTrace, and Tonatiuh. All are available for free from their respective developers. DELSOL and HELIOS both use a convolution of the sunshape and optical errors for rapid calculation of the incident irradiance profiles on the receiver surfaces. SolTrace and Tonatiuh use ray-tracing methods to intersect the reflected solar rays with the receiver surfaces and construct irradiance profiles. We found the ray-tracing tools, although slower in computation speed, to be more flexible for modeling complex receiver geometries, whereas DELSOL and HELIOS were limited to standard receiver geometries such as flat plate, cylinder, and cavity receivers. We also list the strengths and deficiencies of the tools to show tool preference depending on the modeling and design needs. We provide an example of using SolTrace for modeling nonconventional receiver geometries. The goal is to transfer the irradiance profiles on the receiver surfaces calculated in an optical code to a computational fluid dynamics code such as ANSYS Fluent. This approach eliminates the need for using discrete ordinates or discrete radiation transfer models, which are computationally intensive, within the CFD code. The irradiance profiles on the receiver surfaces then allow for thermal and fluid analysis on the receiver.
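
    The hand-off described above amounts to sampling a ray-traced irradiance map at the CFD mesh's face centroids and applying it as a heat-flux boundary condition. A minimal Python sketch using SciPy (the Gaussian flux spot and the centroid coordinates are toy stand-ins for real ray-tracer output and mesh data):

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Irradiance map on the receiver surface, as a ray tracer might produce it (toy data):
        x = np.linspace(0.0, 1.0, 50)      # receiver width coordinate, m
        y = np.linspace(0.0, 2.0, 100)     # receiver height coordinate, m
        X, Y = np.meshgrid(x, y, indexing="ij")
        flux = 6e5 * np.exp(-((X - 0.5)**2 + (Y - 1.0)**2) / 0.1)   # W/m^2, Gaussian spot

        # Sample the map at CFD face centroids to impose a heat-flux boundary condition.
        interp = RegularGridInterpolator((x, y), flux)
        centroids = np.array([[0.48, 0.95], [0.10, 1.80]])          # illustrative face centers
        print(interp(centroids))                                    # W/m^2 at each face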

  20. [Problems of formal organizational structure of industrial health care complexes].

    PubMed

    Włodarczyk, C

    1978-01-01

    The author formulates the thesis that the description of the organizational structure of an industrial health care complex calls for isolation of the following aspects: structure of territorial links; system of organizational units and divisions; organization of basic functions; structure of management; structure of supervision of middle- and lower-level personnel; composition of the health care complex council; and system of accessibility ranges. Each of the above aspects has been considered on the basis of operative rules of law, using organizational analysis methods.

  1. Behavioral tests of hippocampal function: simple paradigms complex problems.

    PubMed

    Gerlai, R

    2001-11-01

    Behavioral tests have become important tools for the analysis of functional effects of induced mutations in transgenic mice. However, depending on the type of mutation and several experimental parameters, false positive or negative findings may be obtained. Given the fact that molecular neurobiologists now make increasing use of behavioral paradigms in their research, it is imperative to revisit such problems. In this review, three tests sensitive to hippocampal function, the T-maze spontaneous alternation task (T-CAT), context-dependent fear conditioning (CDFC), and the Morris water maze (MWM), serve as illustrative examples of the potential problems. Spontaneous alternation tests are sometimes flawed because the handling procedure makes the test dependent on fear rather than exploratory behavior, leading to altered alternation rates independent of hippocampal function. CDFC can provide misleading results because the context test, assumed to be a configural task dependent on the hippocampus, may have a significant elemental, i.e. cued, component. The MWM may pose problems if its visible platform task is disproportionately easier for the subjects to solve than the hidden platform task, if the order of administration of the visible and hidden platform tasks is not counterbalanced, or if inappropriate parameters are measured. Without attempting to be exhaustive, this review discusses such experimental problems and gives examples of how to avoid them.

  2. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems.

    PubMed

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-03-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  3. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems.

    PubMed

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-03-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article.

  4. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems

    PubMed Central

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-01-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  5. Navigating complex decision spaces: Problems and paradigms in sequential choice

    PubMed Central

    Walsh, Matthew M.; Anderson, John R.

    2015-01-01

    To behave adaptively, we must learn from the consequences of our actions. Doing so is difficult when the consequences of an action follow a delay. This introduces the problem of temporal credit assignment. When feedback follows a sequence of decisions, how should the individual assign credit to the intermediate actions that comprise the sequence? Research in reinforcement learning provides two general solutions to this problem: model-free reinforcement learning and model-based reinforcement learning. In this review, we examine connections between stimulus-response and cognitive learning theories, habitual and goal-directed control, and model-free and model-based reinforcement learning. We then consider a range of problems related to temporal credit assignment. These include second-order conditioning and secondary reinforcers, latent learning and detour behavior, partially observable Markov decision processes, actions with distributed outcomes, and hierarchical learning. We ask whether humans and animals, when faced with these problems, behave in a manner consistent with reinforcement learning techniques. Throughout, we seek to identify neural substrates of model-free and model-based reinforcement learning. The former class of techniques is understood in terms of the neurotransmitter dopamine and its effects in the basal ganglia. The latter is understood in terms of a distributed network of regions including the prefrontal cortex, medial temporal lobes, cerebellum, and basal ganglia. Not only do reinforcement learning techniques have a natural interpretation in terms of human and animal behavior, but they also provide a useful framework for understanding neural reward valuation and action selection. PMID:23834192
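
    The two solution families reviewed above can be contrasted in a few lines of Python. This is a generic textbook sketch (the toy two-state task, learning rate, and discount factor are illustrative, not from the review): model-free Q-learning assigns credit by bootstrapping from the next state's value, while a model-based agent learns transition and reward estimates and plans by value iteration.

        # Model-free: one-step Q-learning propagates credit via bootstrapped targets.
        def q_learning_update(Q, s, a, r, s_next, actions, alpha=0.1, gamma=0.95):
            target = r + gamma * max(Q[(s_next, b)] for b in actions)
            Q[(s, a)] += alpha * (target - Q[(s, a)])

        # Model-based: plan with value iteration over learned transition/reward models.
        def value_iteration(states, actions, T, R, gamma=0.95, iters=100):
            V = {s: 0.0 for s in states}
            for _ in range(iters):
                V = {s: max(sum(p * (R[(s, a)] + gamma * V[s2])
                                for s2, p in T[(s, a)].items())
                            for a in actions)
                     for s in states}
            return V

        states, actions = ["s0", "s1"], ["go"]
        T = {("s0", "go"): {"s1": 1.0}, ("s1", "go"): {"s0": 1.0}}   # learned transitions
        R = {("s0", "go"): 0.0, ("s1", "go"): 1.0}                   # learned rewards
        print(value_iteration(states, actions, T, R))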

  6. On the Complexity of the Asymmetric VPN Problem

    NASA Astrophysics Data System (ADS)

    Rothvoß, Thomas; Sanità, Laura

    We give the first constant factor approximation algorithm for the asymmetric Virtual Private Network (VPN) problem with arbitrary concave costs. We even show the stronger result that there is always a tree solution of cost at most 2·OPT and that a tree solution of (expected) cost at most 49.84·OPT can be determined in polynomial time.

  7. Increasing Student Effort in Complex Problem Solving through Cooperative Learning and Self-Recording Techniques

    ERIC Educational Resources Information Center

    Brahmer, Kelly; Harmatys, Jennifer

    2009-01-01

    In recent years, teachers have noticed a drop in student effort on complex problems in math and science. The purpose of this study was to determine if incorporating cooperative learning and self-recording strategies had an impact upon student effort on complex problems. A total of 38 9th through 11th grade math and science students at two…

  8. Problem-oriented stereo vision quality evaluation complex

    NASA Astrophysics Data System (ADS)

    Sidorchuk, D.; Gusamutdinova, N.; Konovalenko, I.; Ershov, E.

    2015-12-01

    We describe an original low-cost hardware setting for efficient testing of stereo vision algorithms. The method combines a special hardware setup with a mathematical model; it is easy to construct and precise in the applications of interest. For a known scene we derive its analytical representation, called the virtual scene. Using a four-point correspondence between the scene and the virtual one, we compute the extrinsic camera parameters and project the virtual scene onto the image plane, which yields the ground truth for the depth map. Another result presented in this paper is a new depth-map quality metric. Its main purpose is to tune stereo algorithms for a particular problem, e.g. obstacle avoidance.
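
    The extrinsic-calibration and projection step described above maps onto standard computer vision routines. A minimal Python sketch assuming OpenCV (the point coordinates and intrinsic matrix are illustrative values, not the authors' setup):

        import numpy as np
        import cv2

        # Known 3D points of the scene model (the "virtual scene"), in metres:
        object_pts = np.array([[0, 0, 0], [0.5, 0, 0], [0.5, 0.4, 0], [0, 0.4, 0]], float)
        # Their observed pixel positions in the camera image (illustrative values):
        image_pts = np.array([[102, 220], [410, 215], [405, 470], [98, 465]], float)
        K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], float)  # intrinsics
        dist = np.zeros(5)

        # Four correspondences suffice to recover the extrinsic parameters.
        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)

        # Project any virtual-scene point; its camera-frame z gives the ground-truth depth.
        pt = np.array([[0.25, 0.2, 0.0]])
        pix, _ = cv2.projectPoints(pt, rvec, tvec, K, dist)
        R, _ = cv2.Rodrigues(rvec)
        depth = (R @ pt.T + tvec)[2, 0]
        print(pix.ravel(), depth)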

  9. On the problem of constructing a modern, economic radiotelescope complex

    NASA Technical Reports Server (NTRS)

    Bogomolov, A. F.; Sokolov, A. G.; Poperechenko, B. A.; Polyak, V. S.

    1977-01-01

    Criteria for comparing and planning the technical and economic characteristics of large parabolic reflector antenna systems and other types used in radioastronomy and deep space communications are discussed. The experience gained in making and optimizing a series of highly efficient parabolic antennas in the USSR is reviewed. Several ways are indicated for further improving the complex characteristics of antennas similar to the original TNA-1500 64m radio telescope. The suggestions can be applied in planning the characteristics of radiotelescopes which are now being built, in particular, the TNA-8000 with a diameter of 128 m.

  10. Assessing Complex Problem-Solving Skills and Knowledge Assembly Using Web-Based Hypermedia Design.

    ERIC Educational Resources Information Center

    Dabbagh, Nada

    This research project studied the effects of hierarchical versus heterarchical hypermedia structures of Web-based case representations on complex problem-solving skills and knowledge assembly in problem-centered learning environments in order to develop a system or model that informs the design of Web-based cases for ill-structured problems across…

  11. Nonlinear problems of complex natural systems: Sun and climate dynamics.

    PubMed

    Bershadskii, A

    2013-01-13

    The universal role of the nonlinear one-third subharmonic resonance mechanism in generation of strong fluctuations in complex natural dynamical systems related to global climate is discussed using wavelet regression detrended data. The role of the oceanic Rossby waves in the year-scale global temperature fluctuations and the nonlinear resonance contribution to the El Niño phenomenon have been discussed in detail. The large fluctuations in the reconstructed temperature on millennial time scales (Antarctic ice core data for the past 400,000 years) are also shown to be dominated by the one-third subharmonic resonance, presumably related to the Earth's precession effect on the energy that the intertropical regions receive from the Sun. The effects of galactic turbulence on the temperature fluctuations are also discussed. PMID:23185052

  12. Dusty (complex) plasmas: recent developments, advances, and unsolved problems

    NASA Astrophysics Data System (ADS)

    Popel, Sergey

    The area of dusty (complex) plasma research is a vibrant subfield of plasma physics that belongs to frontier research in the physical sciences. This area is intrinsically interdisciplinary and encompasses astrophysics, planetary science, atmospheric science, magnetic fusion energy science, and various applied technologies. Research in dusty plasmas started after two major discoveries in very different areas: (1) the discovery by the Voyager 2 spacecraft in 1980 of the radial spokes in Saturn's B ring, and (2) the discovery in the early 1980s of the growth of contaminating dust particles in plasma processing. Dusty plasmas are ubiquitous in the universe; examples are proto-planetary and solar nebulae, molecular clouds, supernova explosions, the interplanetary medium, circumsolar rings, and asteroids. Within the solar system, we have planetary rings (e.g., Saturn and Jupiter), the Martian atmosphere, cometary tails and comae, dust clouds on the Moon, etc. Close to the Earth, there are noctilucent clouds and polar mesospheric summer echoes, which are clouds of tiny (charged) ice particles that are formed in the summer polar mesosphere at altitudes of about 82-95 km. Dust and dusty plasmas are also found in the vicinity of artificial satellites and space stations. Dust also turns out to be common in laboratory plasmas, such as in the processing of semiconductors and in tokamaks. In processing plasmas, dust particles are actually grown in the discharge from the reactive gases used to form the plasmas. An example of the relevance of industrial dusty plasmas is the growth of silicon microcrystals for improved solar cells in the future. In fact, nanostructured polymorphous silicon films provide solar cells with high and time-stable efficiency. These nano-materials can also be used for the fabrication of ultra-large-scale integration circuits, display devices, single-electron devices, light-emitting diodes, laser diodes, and others. In microelectronic industries, dust has to be

  13. Measurements of student understanding on complex scientific reasoning problems

    NASA Astrophysics Data System (ADS)

    Izumi, Alisa Sau-Lin

    While there has been much discussion of the cognitive processes underlying effective scientific teaching, less is known about the nature of student responses on assessments targeting processes of scientific reasoning specific to biology content. This study used multiple-choice (m-c) and short-answer essay responses to evaluate progress in high-order reasoning skills. In a pilot investigation of student responses on a non-content-based test of scientific thinking, it was found that some students showed a pre-post gain on the m-c test version while showing no gain on a short-answer essay version of the same questions. This result led to a subsequent research project focused on differences between alternate versions of tests of scientific reasoning. Using m-c and written responses from biology tests targeting the skills of (1) reasoning with a model and (2) designing controlled experiments, test score frequencies, factor analysis, and regression models were analyzed to explore test format differences. Understanding these format differences is important for developing practical ways to identify student gains in scientific reasoning. The overall results suggested test format differences. Factor analysis revealed three interpretable factors---m-c format, genetics content, and model-based reasoning. Frequency distributions on the m-c and open explanation portions of the hybrid items revealed that many students answered the m-c portion of an item correctly but gave inadequate explanations. In other instances, students answered the m-c portion incorrectly yet gave a sufficient explanation, or answered the m-c portion correctly while providing a poor explanation. When the test scores were used to predict non-associated student measures---VSAT, MSAT, high school grade point average, or final course grade---they accounted for close to zero percent of the variance. Overall, these results point to the importance of using multiple methods of testing and of

  14. Periodically specified problems: An exponential complexity gap between exact and approximate solutions

    SciTech Connect

    Hunt, H.B. III; Rosenkrantz, D.J.; Stearns, R.E.; Marathe, M.V.; Radhakrishnan, V.

    1994-11-28

    We study both the complexity and approximability of various graph and combinatorial problems specified using two-dimensional narrow periodic specifications (see [CM93, HW92, KMW67, KO91, Or84b, Wa93]). The following two general kinds of results are presented. (1) We prove that a number of natural graph and combinatorial problems are NEXPTIME- or EXPSPACE-complete when instances are so specified. (2) In contrast, we prove that the optimization versions of several of these NEXPTIME- or EXPSPACE-complete problems have polynomial time approximation algorithms with constant performance guarantees. Moreover, some of these problems even have polynomial time approximation schemes. We also sketch how our NEXPTIME-hardness results can be used to prove analogous NEXPTIME-hardness results for problems specified using other kinds of succinct specification languages. Our results provide the first natural problems for which there is a proven exponential (and possibly doubly exponential) gap between the complexities of finding exact and approximate solutions.

  15. A Real-Life Case Study of Audit Interactions--Resolving Messy, Complex Problems

    ERIC Educational Resources Information Center

    Beattie, Vivien; Fearnley, Stella; Hines, Tony

    2012-01-01

    Real-life accounting and auditing problems are often complex and messy, requiring the synthesis of technical knowledge in addition to the application of generic skills. To help students acquire the necessary skills to deal with these problems effectively, educators have called for the use of case-based methods. Cases based on real situations (such…

  16. Percentages: The Effect of Problem Structure, Number Complexity and Calculation Format

    ERIC Educational Resources Information Center

    Baratta, Wendy; Price, Beth; Stacey, Kaye; Steinle, Vicki; Gvozdenko, Eugene

    2010-01-01

    This study reports how the difficulty of simple worded percentage problems is affected by the problem structure and the complexity of the numbers involved. We also investigate which methods students know. Results from 677 Year 8 and 9 students are reported. Overall the results indicate that more attention needs to be given to this important topic.…

  17. An Exploratory Framework for Handling the Complexity of Mathematical Problem Posing in Small Groups

    ERIC Educational Resources Information Center

    Kontorovich, Igor; Koichu, Boris; Leikin, Roza; Berman, Avi

    2012-01-01

    The paper introduces an exploratory framework for handling the complexity of students' mathematical problem posing in small groups. The framework integrates four facets known from past research: task organization, students' knowledge base, problem-posing heuristics and schemes, and group dynamics and interactions. In addition, it contains a new…

  18. The Ethnology of Traditional and Complex Societies. Test Edition. AAAS Study Guides on Contemporary Problems.

    ERIC Educational Resources Information Center

    Simic, Andrei

    This is one of several study guides on contemporary problems produced by the American Association for the Advancement of Science with support of the National Science Foundation. This guide focuses on the ethnology of traditional and complex societies. Part I, Simple and Complex Societies, includes three sections: (1) Introduction: Anthropologists…

  19. Individual Differences in Students' Complex Problem Solving Skills: How They Evolve and What They Imply

    ERIC Educational Resources Information Center

    Wüstenberg, Sascha; Greiff, Samuel; Vainikainen, Mari-Pauliina; Murphy, Kevin

    2016-01-01

    Changes in the demands posed by increasingly complex workplaces in the 21st century have raised the importance of nonroutine skills such as complex problem solving (CPS). However, little is known about the antecedents and outcomes of CPS, especially with regard to malleable external factors such as classroom climate. To investigate the relations…

  20. Sleep, Cognition, and Behavioral Problems in School-Age Children: A Century of Research Meta-Analyzed

    ERIC Educational Resources Information Center

    Astill, Rebecca G.; Van der Heijden, Kristiaan B.; Van IJzendoorn, Marinus H.; Van Someren, Eus J. W.

    2012-01-01

    Clear associations of sleep, cognitive performance, and behavioral problems have been demonstrated in meta-analyses of studies in adults. This meta-analysis is the first to systematically summarize all relevant studies reporting on sleep, cognition, and behavioral problems in healthy school-age children (5-12 years old) and incorporates 86 studies…

  1. On the complexity of classical and quantum algorithms for numerical problems in quantum mechanics

    NASA Astrophysics Data System (ADS)

    Bessen, Arvid J.

    Our understanding of complex quantum mechanical processes is limited by our inability to solve the equations that govern them except for simple cases. Numerical simulation of quantum systems appears to be our best option to understand, design and improve quantum systems. It turns out, however, that computational problems in quantum mechanics are notoriously difficult to treat numerically. The computational time that is required often scales exponentially with the size of the problem. One of the most radical approaches for treating quantum problems was proposed by Feynman in 1982 [46]: he suggested that quantum mechanics itself showed a promising way to simulate quantum physics. This idea, the so-called quantum computer, showed its potential convincingly in one important regime with the development of Shor's integer factorization algorithm, which improves exponentially on the best known classical algorithm. In this thesis we explore six different computational problems from quantum mechanics, study their computational complexity and try to find ways to remedy them. In the first problem we investigate the reasons behind the improved performance of Shor's and similar algorithms. We show that the key quantum part in Shor's algorithm, the quantum phase estimation algorithm, achieves its good performance through the use of power queries, and we give lower bounds, matching the known upper bounds, for all phase estimation algorithms that use power queries. Our research indicates that problems that allow the use of power queries will achieve similar exponential improvements over classical algorithms. We then apply our lower bound technique for power queries to the Sturm-Liouville eigenvalue problem and show matching lower bounds to the upper bounds of Papageorgiou and Wozniakowski [85]. It seems to be very difficult, though, to find nontrivial instances of the Sturm-Liouville problem for which power queries can be simulated efficiently. A quantum computer differs from a
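
    For reference, a power query in phase estimation is a controlled application of a power of the unitary whose eigenphase is sought; the sketch below is standard background on the mechanism, not this thesis's specific lower-bound construction.

    ```latex
    % W has eigenpair W|psi> = e^{2 pi i phi}|psi>; a power query applies W^{2^j}
    % conditioned on a control qubit c, so by phase kickback
    \[
      \mathrm{c\text{-}}W^{2^{j}} : \;\lvert c\rangle\,\lvert\psi\rangle \;\longmapsto\;
      e^{2\pi i\, c\, 2^{j}\varphi}\,\lvert c\rangle\,\lvert\psi\rangle ,
      \qquad c \in \{0,1\}.
    \]
    % Queries j = 0, ..., m-1 followed by an inverse quantum Fourier transform on
    % the control register give an m-bit estimate of phi; each query applies W
    % 2^j times yet is counted at unit query cost, which is where the exponential
    % advantage enters.
    ```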

  2. Using SEM to Analyze Complex Survey Data: A Comparison between Design-Based Single-Level and Model-Based Multilevel Approaches

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-man

    2012-01-01

    Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…

  3. Development and operation of an integrated sampling probe and gas analyzer for turbulent mixing studies in complex supersonic flows

    NASA Astrophysics Data System (ADS)

    Wiswall, John D.

    -temporal characteristic scales of the flow on the resulting time-area-averaged concentration measurements. Two series of experiments were performed to verify the probe's design; the first used Schlieren photography and verified that the probe sampled from the supersonic flowfield isokinetically. The second series involved traversing the probe across a free mixing layer of air and helium, to obtain both mean concentration and high frequency measurements. High-frequency data were statistically analyzed, and inspection of the Probability Density Function (PDF) of the hot-film response was instrumental in interpreting how well the resulting average mixing measurements represent these types of complex flows. The probe is minimally intrusive, has accuracy comparable to its predecessors, has an improved frequency response for mean concentration measurements, and samples from a very small area in the flowfield.

  4. Conceptual and procedural knowledge community college students use when solving a complex science problem

    NASA Astrophysics Data System (ADS)

    Steen-Eibensteiner, Janice Lee

    2006-07-01

    A strong science knowledge base and problem solving skills have always been highly valued for employment in the science industry. Skills currently needed for employment include being able to problem solve (Overtoom, 2000). Academia also recognizes the need for effectively teaching students to apply problem solving skills in clinical settings. This thesis investigates how students solve complex science problems in an academic setting in order to inform the development of problem solving skills for the workplace. Students' use of problem solving skills in the form of learned concepts and procedural knowledge was studied as students completed a problem that might come up in real life. Students were taking a community college sophomore biology course, Human Anatomy & Physiology II. The problem topic was negative feedback inhibition of the thyroid and parathyroid glands. The research questions answered were (1) How well do community college students use a complex of conceptual knowledge when solving a complex science problem? (2) What conceptual knowledge are community college students using correctly, incorrectly, or not using when solving a complex science problem? (3) What problem solving procedural knowledge are community college students using successfully, unsuccessfully, or not using when solving a complex science problem? From the whole class, the high academic level participants performed at a mean of 72% correct on chapter test questions, a low-average to fair grade of C-. The middle and low academic participants both failed (F) the test questions (37% and 30%, respectively); 29% (9/31) of the students showed only fair performance while 71% (22/31) failed. From the subset sample of 2 students each from the high, middle, and low academic levels selected from the whole class, 35% (8/23) of the concepts were used effectively, 22% (5/23) marginally, and 43% (10/23) poorly. Only 1 concept was used incorrectly by 3/6 of the students and identified as

  5. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    PubMed

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system.

  6. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    PubMed

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system. PMID:26790689

  7. Temporality Matters: Advancing a Method for Analyzing Problem-Solving Processes in a Computer-Supported Collaborative Environment

    ERIC Educational Resources Information Center

    Kapur, Manu

    2011-01-01

    This paper argues for a need to develop methods for examining temporal patterns in computer-supported collaborative learning (CSCL) groups. It advances one such quantitative method--Lag-sequential Analysis (LsA)--and instantiates it in a study of problem-solving interactions of collaborative groups in an online, synchronous environment. LsA…

  8. Analyzing Multiple Informant Data on Child and Adolescent Behavior Problems: Predictive Validity and Comparison of Aggregation Procedures

    ERIC Educational Resources Information Center

    van Dulmen, Manfred H. M.; Egeland, Byron

    2011-01-01

    We compared the predictive validity of five aggregation methods for multiple informant data on child and adolescent behavior problems. In addition, we compared the predictive validity of these aggregation methods with single informant scores. Data were derived from the Minnesota Longitudinal Study of Parents and Children (N = 175). Maternal and…

  9. Thinking Problems of the Present Collision Warning Work by Analyzing the Intersection Between Cosmos 2251 and Iridium 33

    NASA Astrophysics Data System (ADS)

    Wang, R. L.; Liu, W.; Yan, R. D.; Gong, J. C.

    2013-08-01

    After the Cosmos 2251 and Iridium 33 collision breakup event, institutions at home and abroad carried out collision warning analyses for the event. This paper compares the results from the different research units, discusses problems in current collision warning work, and offers suggestions for further study.

  10. Analyzing the Effects of a Mathematics Problem-Solving Program, Exemplars, on Mathematics Problem-Solving Scores with Deaf and Hard-of-Hearing Students

    ERIC Educational Resources Information Center

    Chilvers, Amanda Leigh

    2013-01-01

    Researchers have noted that mathematics achievement for deaf and hard-of-hearing (d/hh) students has been a concern for many years, including the ability to problem solve. This quasi-experimental study investigates the use of the Exemplars mathematics program with students in grades 2-8 in a school for the deaf that utilizes American Sign Language…

  11. Medicines counterfeiting is a complex problem: a review of key challenges across the supply chain.

    PubMed

    Tremblay, Michael

    2013-02-01

    The paper begins by asking why there is a market for counterfeit medicines, which in effect creates the problem of counterfeiting itself. Contributing factors include supply chain complexity and the lack of whole-systems thinking. These two underpin the author's view that counterfeiting is a complex (i.e. wicked) problem, and that corporate, public policy and regulatory actions need to be mindful of how their actions may be causal. The paper offers a problem-based review of key components of this complexity, viz., the knowledge end-users/consumers have of medicines; whether restrictive information policies may hamper information provision to patients; the internet's direct access to consumers; internet-enabled distribution of unsafe and counterfeit medicines; whether the internet is a parallel and competitive supply chain to legitimate routes; organised crime as an emerging medicines manufacturer and supplier and whether substandard medicines is really the bigger problem. Solutions respect the perceived complexity of the supply chain challenges. The paper identifies the need to avoid technologically-driven solutions, calling for 'technological agnosticism'. Both regulation and public policy need to reflect the dynamic nature of the problem and avoid creating perverse incentives; it may be, for instance, that medicines pricing and reimbursement policies, which affect consumer/patient access may act as market signals to counterfeiters, since this creates a cash market in cheaper drugs. PMID:23656447

  12. Medicines counterfeiting is a complex problem: a review of key challenges across the supply chain.

    PubMed

    Tremblay, Michael

    2013-02-01

    The paper begins by asking why there is a market for counterfeit medicines, which in effect creates the problem of counterfeiting itself. Contributing factors include supply chain complexity and the lack of whole-systems thinking. These two underpin the author's view that counterfeiting is a complex (i.e. wicked) problem, and that corporate, public policy and regulatory actions need to be mindful of how their actions may be causal. The paper offers a problem-based review of key components of this complexity, viz., the knowledge end-users/consumers have of medicines; whether restrictive information policies may hamper information provision to patients; the internet's direct access to consumers; internet-enabled distribution of unsafe and counterfeit medicines; whether the internet is a parallel and competitive supply chain to legitimate routes; organised crime as an emerging medicines manufacturer and supplier and whether substandard medicines is really the bigger problem. Solutions respect the perceived complexity of the supply chain challenges. The paper identifies the need to avoid technologically-driven solutions, calling for 'technological agnosticism'. Both regulation and public policy need to reflect the dynamic nature of the problem and avoid creating perverse incentives; it may be, for instance, that medicines pricing and reimbursement policies, which affect consumer/patient access may act as market signals to counterfeiters, since this creates a cash market in cheaper drugs.

  13. A knowledge-based tool for multilevel decomposition of a complex design problem

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1989-01-01

    Although much work has been done in applying artificial intelligence (AI) tools and techniques to problems in different engineering disciplines, only recently has the application of these tools begun to spread to the decomposition of complex design problems. A new tool based on AI techniques has been developed to implement a decomposition scheme suitable for multilevel optimization and display of data in an N x N matrix format.

  14. Performance of isotope ratio infrared spectroscopy (IRIS) for analyzing waters containing organic contaminants: Problems and solutions (Invited)

    NASA Astrophysics Data System (ADS)

    West, A. G.; Goldsmith, G. R.; Dawson, T. E.

    2010-12-01

    The development of isotope ratio infrared spectroscopy (IRIS) for simultaneous δ2H and δ18O analysis of liquid water samples shows much potential for affordable, simple and potentially portable isotopic analyses. IRIS has been shown to be comparable in precision and accuracy to isotope ratio mass spectrometry (IRMS) when analyzing pure water samples. However, recent studies have shown that organic contaminants in analyzed water samples may interfere with the spectroscopy leading to errors of considerable magnitude in the reported stable isotope data. Many environmental, biological and forensic studies require analyses of water containing organic contaminants in some form, yet our current methods of removing organic contaminants prior to analysis appear inadequate for IRIS. Treated plant water extracts analyzed by IRIS showed deviations as large as 35‰ (δ2H) and 11.8‰ (δ18O) from the IRMS value, indicating that trace amounts of contaminants were sufficient to disrupt IRIS analyses. However, not all organic contaminants negatively influence IRIS. For such samples, IRIS presents a labour saving method relative to IRMS. Prior to widespread use in the environmental, biological and forensic sciences, a means of obtaining reliable data from IRIS needs to be demonstrated. One approach is to use instrument-based software to flag potentially problematic spectra and output a corrected isotope value based on analysis of the spectra. We evaluate this approach on two IRIS systems and discuss the way forward for ensuring accurate stable isotope data using IRIS.

  15. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving. PMID:22815065

  16. Tackling complex problems, building evidence for practice, and educating doctoral nursing students to manage the tension.

    PubMed

    Sharts-Hopko, Nancy C

    2013-01-01

    The mandate for evidence-based practice (EBP) arose in response to, among other catalysts, several Institute of Medicine reports beginning in the late 1990s. At the same time, the National Institutes of Health and others have recognized that the most complex, important, and challenging problems, termed "wicked problems," are inherently transdisciplinary and require thinking beyond the limits of existing theories. When nursing students are prepared for EBP, they operate within a fairly stable set of assumptions and they exercise a past orientation. Wicked problem-solving occurs within a context that is characterized as dynamic and ambiguous and requires a future orientation to imagine potential solutions to questions of "what if?" Both skills, EBP, and wicked problem-solving, are essential within the discipline of nursing. Students at all levels need to understand when each scientific approach is required. PhD students must be prepared to participate in wicked problem-solving.

  17. On the Critical Behaviour, Crossover Point and Complexity of the Exact Cover Problem

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Smelyanskiy, Vadim N.; Shumow, Daniel; Koga, Dennis (Technical Monitor)

    2003-01-01

    Research into quantum algorithms for NP-complete problems has rekindled interest in the detailed study of a broad class of combinatorial problems. A recent paper applied the quantum adiabatic evolution algorithm to the Exact Cover problem for 3-sets (EC3), and provided empirical evidence that the algorithm was polynomial. In this paper we provide a detailed study of the characteristics of the exact cover problem. We present the annealing approximation applied to EC3, which gives an over-estimate of the phase transition point, and we identify the phase transition point empirically. We also study the complexity of two classical algorithms on this problem: Davis-Putnam and Simulated Annealing. For these algorithms, EC3 is significantly easier than 3-SAT.
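
    For readers unfamiliar with EC3, the formulation commonly used in the quantum adiabatic literature (assumed here; the paper's exact statement may differ) is:

    ```latex
    % Exact Cover for 3-sets (EC3): Boolean variables x_1, ..., x_n and clauses
    % C = (i, j, k); a clause is satisfied iff exactly one of its three bits is 1.
    % An instance is satisfiable iff the cost function below vanishes for some
    % assignment; the phase transition is studied as a function of the
    % clause-to-variable ratio, analogous to alpha in random 3-SAT.
    \[
      H(x) \;=\; \sum_{(i,j,k)\in\mathcal{C}} \left(x_i + x_j + x_k - 1\right)^{2},
      \qquad x_i \in \{0,1\}.
    \]
    ```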

  18. Calculating Probabilistic Distance to Solution in a Complex Problem Solving Domain

    ERIC Educational Resources Information Center

    Sudol, Leigh Ann; Rivers, Kelly; Harris, Thomas K.

    2012-01-01

    In complex problem solving domains, correct solutions are often comprised of a combination of individual components. Students usually go through several attempts, each attempt reflecting an individual solution state that can be observed during practice. Classic metrics to measure student performance over time rely on counting the number of…

  19. Regularity of the Dirichlet problem for the complex Monge-Ampère equation

    PubMed Central

    Moriyon, Roberto

    1979-01-01

    Regularity up to the boundary of the solutions of a boundary value problem for a complex Monge-Ampère equation on perturbations of an annulus in Cn is proven. The result can be applied to the classification of such domains. PMID:16592626

  20. Assessment of Complex Problem Solving: What We Know and What We Don't Know

    ERIC Educational Resources Information Center

    Herde, Christoph Nils; Wüstenberg, Sascha; Greiff, Samuel

    2016-01-01

    Complex Problem Solving (CPS) is seen as a cross-curricular 21st century skill that has attracted interest in large-scale-assessments. In the Programme for International Student Assessment (PISA) 2012, CPS was assessed all over the world to gain information on students' skills to acquire and apply knowledge while dealing with nontransparent…

  1. Learning about Complex Multi-Stakeholder Issues: Assessing the Visual Problem Appraisal

    ERIC Educational Resources Information Center

    Witteveen, Loes; Put, Marcel; Leeuwis, Cees

    2010-01-01

    This paper presents an evaluation of the visual problem appraisal (VPA) learning environment in higher education. The VPA has been designed for the training of competences that are required in complex stakeholder settings in relation to sustainability issues. The design of VPA incorporates a diversity of instruction strategies to accommodate the…

  2. Eighth-Grade Students Defining Complex Problems: The Effectiveness of Scaffolding in a Multimedia Program

    ERIC Educational Resources Information Center

    Zydney, Janet Mannheimer

    2005-01-01

    This pilot study investigated the effectiveness of a multimedia learning environment called "Pollution Solution" on eighth-grade students' ability to define a complex problem. Sixty students from four earth science classes taught by the same teacher in a New York City public school were included in the sample for this study. The classes were…

  3. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  4. Ecosystem services and cooperative fisheries research to address a complex fishery problem

    EPA Science Inventory

    The St. Louis River represents a complex fishery management problem. Current fishery management goals have to be developed taking into account bi-state commercial, subsistence and recreational fisheries which are valued for different characteristics by a wide range of anglers, as...

  5. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    ERIC Educational Resources Information Center

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  6. Computer-Based Assessment of Complex Problem Solving: Concept, Implementation, and Application

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Holt, Daniel V.; Goldhammer, Frank; Funke, Joachim

    2013-01-01

    Complex Problem Solving (CPS) skills are essential to successfully deal with environments that change dynamically and involve a large number of interconnected and partially unknown causal influences. The increasing importance of such skills in the 21st century requires appropriate assessment and intervention methods, which in turn rely on adequate…

  7. Differential Relations between Facets of Complex Problem Solving and Students' Immigration Background

    ERIC Educational Resources Information Center

    Sonnleitner, Philipp; Brunner, Martin; Keller, Ulrich; Martin, Romain

    2014-01-01

    Whereas the assessment of complex problem solving (CPS) has received increasing attention in the context of international large-scale assessments, its fairness in regard to students' cultural background has gone largely unexplored. On the basis of a student sample of 9th-graders (N = 299), including a representative number of immigrant students (N…

  8. The Development of Complex Problem Solving in Adolescence: A Latent Growth Curve Analysis

    ERIC Educational Resources Information Center

    Frischkorn, Gidon T.; Greiff, Samuel; Wüstenberg, Sascha

    2014-01-01

    Complex problem solving (CPS) as a cross-curricular competence has recently attracted more attention in educational psychology as indicated by its implementation in international educational large-scale assessments such as the Programme for International Student Assessment. However, research on the development of CPS is scarce, and the few…

  9. Statistical physics analysis of the computational complexity of solving random satisfiability problems using backtrack algorithms

    NASA Astrophysics Data System (ADS)

    Cocco, S.; Monasson, R.

    2001-08-01

    The computational complexity of solving random 3-Satisfiability (3-SAT) problems is investigated using statistical physics concepts and techniques related to phase transitions, growth processes and (real-space) renormalization flows. 3-SAT is a representative example of hard computational tasks; it consists in deciding whether a set of αN randomly drawn logical constraints involving N Boolean variables can be satisfied altogether or not. Widely used solving procedures, such as the Davis-Putnam-Logemann-Loveland (DPLL) algorithm, perform a systematic search for a solution, through a sequence of trials and errors represented by a search tree. The size of the search tree accounts for the computational complexity, i.e., the amount of computational effort, required to achieve resolution. In the present study, we identify, using theory and numerical experiments, easy (size of the search tree scaling polynomially with N) and hard (exponential scaling) regimes as a function of the ratio α of constraints per variable. The typical complexity is explicitly calculated in the different regimes, in very good agreement with numerical simulations. Our theoretical approach is based on the analysis of the growth of the branches in the search tree under the operation of DPLL. On each branch, the initial 3-SAT problem is dynamically turned into a more generic 2+p-SAT problem, where p and 1 - p are the fractions of constraints involving three and two variables respectively. The growth of each branch is monitored by the dynamical evolution of α and p and is represented by a trajectory in the static phase diagram of the random 2+p-SAT problem. Depending on whether or not the trajectories cross the boundary between satisfiable and unsatisfiable phases, single branches or full trees are generated by DPLL, resulting in easy or hard resolutions. Our picture for the origin of complexity can be applied to other computational problems solved by branch and bound algorithms.
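
    The trial-and-error search described above is easy to make concrete. Below is a minimal DPLL sketch (unit propagation plus branching) with a random 3-SAT generator at a chosen ratio α; it is illustrative only, omits the branching heuristics of production solvers, and the variable count, ratio, and seed are arbitrary choices.

    ```python
    import random

    def simplify(clauses, lit):
        """Set literal `lit` to True: drop satisfied clauses, shrink the rest."""
        out = []
        for c in clauses:
            if lit in c:
                continue                  # clause satisfied, drop it
            reduced = [l for l in c if l != -lit]
            if not reduced:
                return None               # empty clause: contradiction here
            out.append(reduced)
        return out

    def dpll(clauses, assignment=()):
        """Return a satisfying tuple of literals, or None (backtrack)."""
        if clauses is None:
            return None                   # conflict propagated from simplify()
        if not clauses:
            return assignment             # every clause satisfied
        for c in clauses:                 # unit propagation
            if len(c) == 1:
                return dpll(simplify(clauses, c[0]), assignment + (c[0],))
        lit = clauses[0][0]               # branch: one node of the search tree
        return (dpll(simplify(clauses, lit), assignment + (lit,))
                or dpll(simplify(clauses, -lit), assignment + (-lit,)))

    # Random 3-SAT at ratio alpha = clauses/variables; the hard region sits
    # near alpha ~ 4.27 for large N.
    N, alpha = 30, 4.2
    rng = random.Random(1)
    clauses = [[v * rng.choice([-1, 1]) for v in rng.sample(range(1, N + 1), 3)]
               for _ in range(int(alpha * N))]
    print("satisfiable" if dpll(clauses) is not None else "unsatisfiable")
    ```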

  10. Generalized CNF satisfiability, local reductions and complexity of succinctly specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Radhakrishnan, V.

    1995-02-01

    We study the complexity and efficient approximability of various decision, counting and optimization problems when instances are specified using (1) the 1-dimensional finite periodic narrow specifications of Wanke, (2) the 2-way infinite 1-dimensional narrow periodic (sometimes called dynamic) specifications of Karp and Orlin et al., and (3) the hierarchical specification language of Lengauer et al. We outline how generalized CNF satisfiability problems and local reductions can be used to obtain both hardness and easiness results for a number of decision, counting, optimization and approximate optimization problems when instances are specified as in (1), (2) or (3). As corollaries we obtain a number of new PSPACE-hardness and #PSPACE-hardness results and a number of new polynomial time approximation algorithms for natural PSPACE-hard optimization problems. In particular, assuming P ≠ PSPACE, we completely characterize the complexities of the generalized CNF satisfiability problems SAT(S) of Schaefer [Sc78] when instances are specified as in (1), (2) or (3).

  11. Mixing Bandt-Pompe and Lempel-Ziv approaches: another way to analyze the complexity of continuous-state sequences

    NASA Astrophysics Data System (ADS)

    Zozor, S.; Mateos, D.; Lamberti, P. W.

    2014-05-01

    In this paper, we propose to mix the approach underlying Bandt-Pompe permutation entropy with Lempel-Ziv complexity, to design what we call Lempel-Ziv permutation complexity. The principle consists of two steps: (i) transformation of a continuous-state series, either intrinsically multivariate or obtained by embedding, into a sequence of permutation vectors, whose components are the positions of the components of the initial vector when re-arranged; (ii) computation of the Lempel-Ziv complexity of this series of `symbols', drawn from a discrete finite-size alphabet. On the one hand, the permutation entropy of Bandt-Pompe aims at the study of the entropy of such a sequence; i.e., the entropy of patterns in a sequence (e.g., local increases or decreases). On the other hand, the Lempel-Ziv complexity of a discrete-state sequence aims at the study of the temporal organization of the symbols (i.e., the rate of compressibility of the sequence). Thus, the Lempel-Ziv permutation complexity aims to take advantage of both of these methods. The potential of such a combined approach - a permutation procedure followed by a complexity analysis - is evaluated on both simulated and real data. In both cases, we compare the individual approaches and the combined approach.
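
    A minimal sketch of the two steps, assuming a scalar series with time-delay embedding of dimension d, and using an LZ78-style distinct-phrase count for brevity in place of the LZ76 production-counting definition:

    ```python
    import numpy as np

    def ordinal_symbols(x, d=3):
        """Bandt-Pompe step: map each length-d window to its permutation pattern."""
        x = np.asarray(x)
        return [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]

    def lz_phrase_count(symbols):
        """Lempel-Ziv step (LZ78-style): count distinct phrases in the sequence."""
        phrases, current = set(), ()
        for s in symbols:
            current += (s,)
            if current not in phrases:   # new phrase: record it and restart
                phrases.add(current)
                current = ()
        return len(phrases) + (1 if current else 0)

    # A regular signal compresses into few phrases; noise needs many more.
    rng = np.random.default_rng(0)
    t = np.arange(2000)
    for name, x in [("periodic", np.sin(2 * np.pi * t / 40)),
                    ("noise", rng.standard_normal(t.size))]:
        print(name, lz_phrase_count(ordinal_symbols(x, d=3)))
    ```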

  12. An HPLC chromatographic framework to analyze the β-cyclodextrin/solute complexation mechanism using a carbon nanotube stationary phase.

    PubMed

    Aljhni, Rania; Andre, Claire; Lethier, Lydie; Guillaume, Yves Claude

    2015-11-01

    A carbon nanotube (CNT) stationary phase was used for the first time to study the β-cyclodextrin (β-CD) solute complexation mechanism using high performance liquid chromatography (HPLC). For this, β-CD was added at various concentrations to the mobile phase, and the effect of column temperature on both the retention of a series of aniline and benzoic acid derivatives on the CNT stationary phase and their complexation mechanism with β-CD was studied. A decrease in the solute retention factor was observed for all the studied molecules without change in the retention order. The apparent formation constant KF of the inclusion complex β-CD/solute was determined at various temperatures. Our results showed that the interaction of β-CD with both the mobile phase and the stationary phase interfered with the complex formation. The enthalpy and entropy of the complex formation (ΔHF and ΔSF) between the solute molecule and CD were determined using a thermodynamic approach. Negative enthalpies and entropies indicated that the inclusion process of the studied molecule in the CD cavity was enthalpically driven and that the hydrogen bonds between carboxylic or aniline groups and the functional groups on the β-CD rim play an important role in the complex formation. PMID:26452814
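
    The thermodynamic step reported here is, in the standard treatment, a van't Hoff analysis: ln KF is linear in 1/T with slope -ΔHF/R and intercept ΔSF/R. The sketch below uses hypothetical KF values (not data from the paper) to show the extraction:

    ```python
    import numpy as np

    R = 8.314  # gas constant, J mol^-1 K^-1

    # Hypothetical apparent formation constants K_F at several column
    # temperatures; illustrative values only, not data from the paper.
    T = np.array([288.15, 298.15, 308.15, 318.15])   # K
    K_F = np.array([1850.0, 1320.0, 980.0, 750.0])   # M^-1

    # van't Hoff: ln K_F = -dH_F/(R T) + dS_F/R, i.e. linear in 1/T.
    slope, intercept = np.polyfit(1.0 / T, np.log(K_F), 1)
    dH_F = -R * slope            # J/mol; negative => exothermic, enthalpy-driven
    dS_F = R * intercept         # J/(mol K)
    print(f"dH_F = {dH_F / 1000:+.1f} kJ/mol, dS_F = {dS_F:+.1f} J/(mol K)")
    ```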

  13. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem.

    PubMed

    Williams, Patricia Ah; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  14. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem

    PubMed Central

    Williams, Patricia AH; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  15. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.

    PubMed

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-03-01

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires fewer inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load.
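
    As a back-of-the-envelope illustration of why subgoals yield parsimony (this is not the authors' sampling method): with branching factor b and solution depth d, exhaustively enumerating action sequences costs on the order of b^d, while a midpoint subgoal replaces one such search with two of order b^(d/2). The moves, start, subgoal and goal below are hypothetical:

    ```python
    from itertools import product

    MOVES = {"U": (-1, 0), "D": (1, 0), "L": (0, -1), "R": (0, 1)}

    def plan(start, goal, max_depth):
        """Enumerate action sequences by depth; return (plan, sequences tried)."""
        tried = 0
        for depth in range(max_depth + 1):
            for seq in product(MOVES, repeat=depth):
                tried += 1
                pos = start
                for a in seq:              # replay the candidate sequence
                    pos = (pos[0] + MOVES[a][0], pos[1] + MOVES[a][1])
                if pos == goal:
                    return seq, tried
        return None, tried

    start, subgoal, goal = (0, 0), (2, 2), (4, 4)
    _, direct = plan(start, goal, 8)       # one b^d search
    _, leg1 = plan(start, subgoal, 4)      # two b^(d/2) searches
    _, leg2 = plan(subgoal, goal, 4)
    print("direct:", direct, "sequences tried")
    print("with subgoal:", leg1 + leg2, "sequences tried")
    ```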

  16. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving

    PubMed Central

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-01-01

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires fewer inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load. PMID:25652466

  17. Solving the three-body Coulomb breakup problem using exterior complex scaling

    SciTech Connect

    McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.

    2004-05-17

    Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum have made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a ''reduction to computation'' of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.
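
    For reference, the exterior complex scaling transformation has the standard form below: each electron's radial coordinate is left real inside a sphere of radius R0 and rotated by an angle η beyond it, so outgoing waves become exponentially decaying and the breakup boundary conditions reduce to simple decay on a finite volume.

    ```latex
    \[
      r \;\longmapsto\;
      \begin{cases}
        r, & r \le R_{0},\\[2pt]
        R_{0} + (r - R_{0})\,e^{i\eta}, & r > R_{0},
      \end{cases}
      \qquad\Longrightarrow\qquad
      e^{ikr} \;\to\; e^{ikR_{0}}\,
      e^{ik(r-R_{0})\cos\eta}\, e^{-k(r-R_{0})\sin\eta}
      \quad (r > R_{0}).
    \]
    ```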

  18. Dirac's formalism combined with complex Fourier operational matrices to solve initial and boundary value problems

    NASA Astrophysics Data System (ADS)

    Labecca, William; Guimarães, Osvaldo; Piqueira, José Roberto C.

    2014-08-01

    Approximations of functions in terms of orthogonal polynomials have been used to develop and implement numerical approaches to solve initial and boundary value problems spectrally. The main idea behind these approaches is to express differential and integral operators by using matrices, and this, in turn, makes the numerical implementation easier to express in computational algebraic languages. In this paper, the application of the methodology is extended by using Dirac's formalism combined with complex Fourier series.
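
    A minimal sketch of the operational-matrix idea on the complex Fourier basis e^{ikx}, where d/dx acts diagonally as multiplication by ik (the truncation order and test function are our own illustrative choices, not the paper's):

    ```python
    import numpy as np

    K = 8                                     # truncation: modes k = -K .. K
    k = np.arange(-K, K + 1)
    D = np.diag(1j * k)                       # operational matrix of d/dx

    # Test on u(x) = sin(3x) = (e^{i3x} - e^{-i3x}) / (2i).
    c = np.zeros(2 * K + 1, dtype=complex)
    c[K + 3] = -0.5j                          # coefficient of e^{+i3x}
    c[K - 3] = +0.5j                          # coefficient of e^{-i3x}
    dc = D @ c                                # Fourier coefficients of u'(x)

    x = np.linspace(0.0, 2.0 * np.pi, 7, endpoint=False)
    u_prime = sum(dc[K + kk] * np.exp(1j * kk * x) for kk in k)
    print(np.allclose(u_prime.real, 3.0 * np.cos(3.0 * x)))  # True: 3 cos(3x)
    ```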

  19. Complexation studies with lanthanides and humic acid analyzed by ultrafiltration and capillary electrophoresis-inductively coupled plasma mass spectrometry.

    PubMed

    Kautenburger, Ralf; Beck, Horst Philipp

    2007-08-01

    For the long-term storage of radioactive waste, detailed information about the geochemical behavior of radioactive and toxic metal ions under environmental conditions is necessary. Humic acid (HA) can play an important role in the immobilisation or mobilisation of metal ions due to complexation and colloid formation. Therefore, we investigate the complexation behavior of HA and its influence on the migration or retardation of selected lanthanides (europium and gadolinium as homologues of the actinides americium and curium). Two independent speciation techniques, ultrafiltration and capillary electrophoresis coupled with inductively coupled plasma mass spectrometry (CE-ICP-MS), have been compared for the study of Eu and Gd interaction with (purified Aldrich) HA. The degree of complexation of Eu and Gd in 25 mg l(-1) Aldrich HA solutions was determined over a broad range of metal loading (Eu and Gd total concentration between 10(-6) and 10(-4) mol l(-1)), at an ionic strength of 10 mM (NaClO4) and different pH values. From the CE-ICP-MS electropherograms, additional information on the charge of the Eu species was obtained by the use of 1-bromopropane as a neutral marker. To detect HA in the ICP-MS and to distinguish between HA-complexed and non-complexed metal ions in the CE-ICP-MS, we halogenated the HA with iodine as an ICP-MS marker. PMID:17459403

  20. The Role of Prior Knowledge and Problem Contexts in Students' Explanations of Complex System

    NASA Astrophysics Data System (ADS)

    Barth-Cohen, Lauren April

    The purpose of this dissertation is to study students' competencies in generating scientific explanations within the domain of complex systems, an interdisciplinary area in which students tend to have difficulties. While considering students' developing explanations of how complex systems work, I investigate the role of prior knowledge and how students' explanations systematically vary across seven problem contexts (e.g. the movement of sand dunes, the formation of traffic jams, and diffusion in water). Using the Knowledge in Pieces epistemological perspective, I build a mini-theory of how students construct explanations about the behavior of complex systems. The mini-theory shows how advanced, "decentralized" explanations evolve from a variety of prior knowledge resources, which depend on specific features of the problem. A general emphasis on students' competencies is exhibited through three strands of analysis: (1) a focus on moment-to-moment shifts in individuals' explanations in the direction of a normative understanding; (2) a comparison of explanations across the seven problem contexts in order to highlight variation in kinds of prior knowledge that are used; and (3) a concentration on the diversity within explanations that can all be considered examples of emergent thinking. First, I document cases of students' shifting explanations as they become less prototypically centralized (a more naive causality) and then become more prototypically decentralized over short time periods. The analysis illustrates the lines of continuity between these two ways of understanding and how change can occur during the process of students generating a progression of increasingly sophisticated transitional explanations. Second, I find a variety of students' understandings across the problem contexts, expressing both variation in their prior knowledge and how the nature of a specific domain influences reasoning. Certain problem contexts are easier or harder for students

  1. Analyzing Student Motivation at the Confluence of Achievement Goals and Their Underlying Reasons: An Investigation of Goal Complexes

    ERIC Educational Resources Information Center

    Hodis, Flaviu A.; Tait, Carolyn; Hodis, Georgeta M.; Hodis, Monica A.; Scornavacca, Eusebio

    2016-01-01

    This research investigated the interrelations among achievement goals and the underlying reasons for pursuing them. To do so, it utilized the framework of goal complexes, which are regulatory constructs defined at the intersection of aims and reasons. Data from two independent large samples of New Zealand university students showed that across…

  2. You Need to Know: There Is a Causal Relationship between Structural Knowledge and Control Performance in Complex Problem Solving Tasks

    ERIC Educational Resources Information Center

    Goode, Natassia; Beckmann, Jens F.

    2010-01-01

    This study investigates the relationships between structural knowledge, control performance and fluid intelligence in a complex problem solving (CPS) task. 75 participants received either complete, partial or no information regarding the underlying structure of a complex problem solving task, and controlled the task to reach specific goals.…

  3. How to solve complex problems in foundry plants - future of casting simulation -

    NASA Astrophysics Data System (ADS)

    Ohnaka, I.

    2015-06-01

    Although the computer simulation of casting has progressed dramatically over the last decades, there are still many challenges and problems. This paper discusses how to solve complex engineering problems in foundry plants and what we should do in the future, in particular for casting simulation. First, problem-solving procedures, including the application of computer simulation, are demonstrated and various difficulties are pointed out, using porosity defects in sand castings of spheroidal graphite cast iron as the main example. Next, looking back at conventional scientific and engineering research on casting phenomena, challenges and problems are discussed from a problem-solving viewpoint, followed by discussion of the issues we should address, such as how to integrate the huge amount of knowledge dispersed across disciplines, the differentiation of science-oriented and engineering-oriented models, professional ethics, how to handle fluctuating materials and initial and boundary conditions, error accumulation, and simulation codes used as black boxes. Finally, some suggestions are made on how to address these issues, such as promoting research based on science-oriented models and publishing reliable data on casting phenomena in complicated-shaped castings, including reconsideration of the research evaluation system.

  4. Low complexity interference alignment algorithms for desired signal power maximization problem of MIMO channels

    NASA Astrophysics Data System (ADS)

    Sun, Cong; Yang, Yunchuan; Yuan, Yaxiang

    2012-12-01

    In this article, we investigate the interference alignment (IA) solution for a K-user MIMO interference channel. Proper users' precoders and decoders are designed through a desired signal power maximization model with IA conditions as constraints, which forms a complex matrix optimization problem. We propose two low complexity algorithms, both of which apply the Courant penalty function technique to combine the leakage interference and the desired signal power together as the new objective function. The first proposed algorithm is the modified alternating minimization algorithm (MAMA), where each subproblem has a closed-form solution via an eigenvalue decomposition. To further reduce algorithm complexity, we propose a hybrid algorithm which consists of two parts. In the first part, the algorithm iterates with Householder transformations to preserve the orthogonality of precoders and decoders. In each iteration, the matrix optimization problem is considered in a sequence of 2D subspaces, which leads to one-dimensional optimization subproblems. From any initial point, this algorithm obtains precoders and decoders with low leakage interference in a short time. In the second part, to exploit the advantage of MAMA, the algorithm continues to iterate from the output of the first part to align the interference perfectly. Analysis shows that, per iteration, both proposed algorithms generally have lower computational complexity than the existing maximum signal power (MSP) algorithm, and the hybrid algorithm enjoys lower complexity than MAMA. Simulations reveal that both proposed algorithms achieve performance similar to the MSP algorithm in less execution time, and outperform the existing alternating minimization algorithm in terms of sum rate. Regarding convergence rate, simulation results show that MAMA is fastest to reach a given sum rate value, while the hybrid algorithm converges fastest in eliminating interference.
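
    For context, the IA constraints referred to above take the standard form below for a K-user channel with channel matrices H_kj, precoders V_j, decoders U_k and d_k desired streams per user; the penalized objective is shown schematically, with λ denoting the Courant penalty weight (our notation, not necessarily the paper's).

    ```latex
    % Interference alignment feasibility conditions:
    \[
      U_k^{H} H_{kj} V_j = 0 \quad (\forall\, j \neq k), \qquad
      \operatorname{rank}\!\left(U_k^{H} H_{kk} V_k\right) = d_k .
    \]
    % Courant-penalized objective combining desired power and leakage:
    \[
      \max_{\{U_k,\,V_k\}} \;
      \sum_{k=1}^{K} \bigl\lVert U_k^{H} H_{kk} V_k \bigr\rVert_F^{2}
      \;-\; \lambda \sum_{k=1}^{K} \sum_{j \neq k}
      \bigl\lVert U_k^{H} H_{kj} V_j \bigr\rVert_F^{2}.
    \]
    ```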

  5. Numerical calculation of thermo-mechanical problems at large strains based on complex step derivative approximation of tangent stiffness matrices

    NASA Astrophysics Data System (ADS)

    Balzani, Daniel; Gandhi, Ashutosh; Tanaka, Masato; Schröder, Jörg

    2015-05-01

    In this paper a robust approximation scheme for the numerical calculation of tangent stiffness matrices is presented in the context of nonlinear thermo-mechanical finite element problems and its performance is analyzed. The scheme extends the approach proposed in Kim et al. (Comput Methods Appl Mech Eng 200:403-413, 2011) and Tanaka et al. (Comput Methods Appl Mech Eng 269:454-470, 2014) and is based on applying the complex-step-derivative approximation to the linearizations of the weak forms of the balance of linear momentum and the balance of energy. By incorporating consistent perturbations along the imaginary axis to the displacement as well as thermal degrees of freedom, we demonstrate that numerical tangent stiffness matrices can be obtained with accuracy up to computer precision, leading to quadratically converging schemes. The main advantage of this approach is that, contrary to the classical forward difference scheme, no round-off errors due to floating-point arithmetic exist within the calculation of the tangent stiffness. This enables arbitrarily small perturbation values and therefore leads to robust schemes even for very small perturbations. An efficient algorithmic treatment is presented which enables a straightforward implementation of the method in any standard finite-element program. By means of thermo-elastic and thermo-elastoplastic boundary value problems at finite strains, the performance of the proposed approach is analyzed.
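
    The kernel of the scheme is the complex-step derivative formula, which avoids the subtractive cancellation of finite differences; a minimal scalar sketch follows (in a finite element code, each tangent column is assembled analogously from a residual evaluated at an imaginarily perturbed degree of freedom):

    ```python
    import numpy as np

    def csd(f, x, h=1e-20):
        """Complex-step derivative: f'(x) ~ Im(f(x + i*h)) / h.

        No subtraction occurs, hence no cancellation error, so h can be
        taken arbitrarily small (unlike forward differences).
        """
        return np.imag(f(x + 1j * h)) / h

    f = lambda x: np.exp(x) * np.sin(x)                 # test function
    exact = np.exp(1.0) * (np.sin(1.0) + np.cos(1.0))   # analytic f'(1)
    print(csd(f, 1.0) - exact)                          # ~ machine precision
    forward = (f(1.0 + 1e-8) - f(1.0)) / 1e-8
    print(forward - exact)                              # ~ 1e-8: round-off limited
    ```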

  6. Complexation of europium and uranium by humic acids analyzed by capillary electrophoresis-inductively coupled plasma mass spectrometry.

    PubMed

    Möser, Christina; Kautenburger, Ralf; Philipp Beck, Horst

    2012-05-01

    Investigations of the mobility of radioactive and nonradioactive substances in the environment are important tasks for the development of a future disposal site in deep geological formations. Dissolved organic matter (DOM) can play an important role in the mobilization of metal ions due to complexation. In this study, we investigate the complexation behavior of humic acid (HA) as a model substance for DOM and its influence on the migration of europium, as a homologue for the actinide americium, and uranium, as the principal component of nuclear fuel. As the speciation technique, capillary electrophoresis (CE) was hyphenated with inductively coupled plasma mass spectrometry (ICP-MS). For the study, 0.5 mg·L⁻¹ of the metals, 25 mg·L⁻¹ of (purified Aldrich) HA, and an aqueous sodium perchlorate solution with an ionic strength of 10 mM at pH 5 were used. CE-ICP-MS clearly shows the different speciation of the triply charged europium cation and the doubly charged uranyl cation with HA. PMID:22648819

  7. Use of a field model to analyze probable fire environments encountered within the complex geometries of nuclear power plants

    SciTech Connect

    Boccio, J.L.; Usher, J.L.; Singhal, A.K.; Tam, L.T.

    1985-08-01

    A fire in a nuclear power plant (NPP) can damage equipment needed to safely operate the plant and thereby either directly cause an accident or else reduce the plant's margin of safety. The development of a field-model fire code to analyze the probable fire environments encountered within NPPs is discussed. A set of fire tests carried out under the aegis of the US Nuclear Regulatory Commission (NRC) is described. The results of these tests are then utilized to validate the field model.

  8. Propositional reasoning: the differential contribution of "rules" to the difficulty of complex reasoning problems.

    PubMed

    Rijmen, F; De Boeck, P

    2001-01-01

    In Experiment 1, complex propositional reasoning problems were constructed as a combination of several types of logical inferences: modus ponens, modus tollens, disjunctive modus ponens, disjunctive syllogism, and conjunction. Rule theories of propositional reasoning can account for how one combines these inferences, but the difficulty of the problems can be accounted for only if a differential psychological cost is allowed for different basic rules. Experiment 2 ruled out some alternative explanations for these differences that did not refer to the intrinsic difficulty of the basic rules. It was also found that part of the results could be accounted for by the notion of representational cost, as it is used in the mental model theory of propositional reasoning. However, the number of models as a measure of representational cost seems to be too coarsely defined to capture all of the observed effects. PMID:11277459

  9. Convergent validity of the aberrant behavior checklist and behavior problems inventory with people with complex needs.

    PubMed

    Hill, Jennie; Powlitch, Stephanie; Furniss, Frederick

    2008-01-01

    The current study aimed to replicate and extend Rojahn et al. [Rojahn, J., Aman, M. G., Matson, J. L., & Mayville, E. (2003). The aberrant behavior checklist and the behavior problems inventory: Convergent and divergent validity. Research in Developmental Disabilities, 24, 391-404] by examining the convergent validity of the behavior problems inventory (BPI) and the aberrant behavior checklist (ABC) for individuals presenting with multiple complex behavior problems. Data were collected from 69 children and adults with severe intellectual disabilities and challenging behavior living in residential establishments. MANCOVA analyses showed that individuals with elevated BPI stereotyped behavior subscale scores had higher scores on ABC lethargy and stereotypy subscales, while those with elevated BPI aggressive/destructive behavior subscale scores obtained higher scores on ABC irritability, stereotypy and hyperactivity subscales. Multiple regression analyses showed a corresponding pattern of results in the prediction of ABC subscale scores by BPI subscale scores. Exploratory factor analysis of the BPI data suggested a six-factor solution with an aggressive/destructive behavior factor, four factors relating to stereotypy, and one related to stereotypy and self-injury. These results, discussed with reference to Rojahn et al. [Rojahn, J., Aman, M. G., Matson, J. L., & Mayville, E. (2003). The aberrant behavior checklist and the behavior problems inventory: Convergent and divergent validity. Research in Developmental Disabilities, 24, 391-404], support the existence of relationships between specific subscales of the two instruments in addition to an overall association between total scores related to general severity of behavioral disturbance.

  10. A boundary collocation meshfree method for the treatment of Poisson problems with complex morphologies

    NASA Astrophysics Data System (ADS)

    Soghrati, Soheil; Mai, Weijie; Liang, Bowen; Buchheit, Rudolph G.

    2015-01-01

    A new meshfree method based on a discrete transformation of Green's basis functions is introduced to simulate Poisson problems with complex morphologies. The proposed Green's Discrete Transformation Method (GDTM) uses source points that are located along a virtual boundary outside the problem domain to construct the basis functions needed to approximate the field. The optimal number of Green's function source points and their relative distances with respect to the problem boundaries are evaluated to obtain the best approximation of the partition of unity condition. A discrete transformation technique together with the boundary point collocation method is employed to evaluate the unknown coefficients of the solution series by satisfying the problem boundary conditions. A comprehensive convergence study is presented to investigate the accuracy and convergence rate of the GDTM. We also demonstrate the application of this meshfree method to simulating conductive heat transfer in a heterogeneous material system and the concentration of dissolved aluminum ions in the electrolyte solution formed near a passive corrosion pit.
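    To make the source-point idea concrete, here is a minimal sketch in the closely related method-of-fundamental-solutions style: Green's-function sources placed on a virtual circle outside the unit disk, with Dirichlet data enforced by boundary collocation and least squares. The GDTM's discrete transformation and optimal source placement are not reproduced here; the setup is our own illustration.

```python
import numpy as np

n_src, n_col, R = 40, 80, 1.5   # sources on a virtual circle of radius R > 1
ts = 2 * np.pi * np.arange(n_src) / n_src
tc = 2 * np.pi * np.arange(n_col) / n_col
src = R * np.column_stack([np.cos(ts), np.sin(ts)])
col = np.column_stack([np.cos(tc), np.sin(tc)])    # boundary collocation points

# 2D Laplace free-space Green's function G(x, s) = -ln|x - s| / (2*pi)
r = np.linalg.norm(col[:, None, :] - src[None, :, :], axis=2)
A = -np.log(r) / (2 * np.pi)

g = col[:, 0] * col[:, 1]            # Dirichlet data u = x*y (harmonic)
coef, *_ = np.linalg.lstsq(A, g, rcond=None)

x = np.array([0.3, 0.4])             # interior evaluation point
u = (-np.log(np.linalg.norm(x - src, axis=1)) / (2 * np.pi)) @ coef
print(u, 0.3 * 0.4)                  # approximation vs. exact value of x*y
```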

  11. Attentional bias induced by solving simple and complex addition and subtraction problems.

    PubMed

    Masson, Nicolas; Pesenti, Mauro

    2014-01-01

    The processing of numbers has been shown to induce shifts of spatial attention in simple probe detection tasks, with small numbers orienting attention to the left and large numbers to the right side of space. Recently, the investigation of this spatial-numerical association has been extended to mental arithmetic with the hypothesis that solving addition or subtraction problems may induce attentional displacements (to the right and to the left, respectively) along a mental number line onto which the magnitude of the numbers would range from left to right, from small to large numbers. Here we investigated such attentional shifts using a target detection task primed by arithmetic problems in healthy participants. The constituents of the addition and subtraction problems (first operand; operator; second operand) were flashed sequentially in the centre of a screen, then followed by a target on the left or the right side of the screen, which the participants had to detect. This paradigm was employed with arithmetic facts (Experiment 1) and with more complex arithmetic problems (Experiment 2) in order to assess the effects of the operation, the magnitude of the operands, the magnitude of the results, and the presence or absence of a requirement for the participants to carry or borrow numbers. The results showed that arithmetic operations induce some spatial shifts of attention, possibly through a semantic link between the operation and space. PMID:24833320

  13. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    NASA Astrophysics Data System (ADS)

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-12-01

    One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities by means of a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.

  14. The anatomical problem posed by brain complexity and size: a potential solution

    PubMed Central

    DeFelipe, Javier

    2015-01-01

    Over the years the field of neuroanatomy has evolved considerably, but unraveling the extraordinary structural and functional complexity of the brain seems to be an unattainable goal, partly because it is only possible to obtain an imprecise connection matrix of the brain. The reasons why reaching such a goal appears almost impossible to date are discussed here, together with suggestions of how we could overcome this anatomical problem by establishing new methodologies to study the brain and by promoting interdisciplinary collaboration. Generating a realistic computational model seems to be the solution, rather than attempting to fully reconstruct the whole brain or a particular brain region. PMID:26347617

  15. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    SciTech Connect

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-12-10

    One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities by means of a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.

  16. Analyzing the tradeoff between electrical complexity and accuracy in patient-specific computational models of deep brain stimulation

    NASA Astrophysics Data System (ADS)

    Howell, Bryan; McIntyre, Cameron C.

    2016-06-01

    Objective. Deep brain stimulation (DBS) is an adjunctive therapy that is effective in treating movement disorders and shows promise for treating psychiatric disorders. Computational models of DBS have begun to be utilized as tools to optimize the therapy. Despite advancements in the anatomical accuracy of these models, there is still uncertainty as to what level of electrical complexity is adequate for modeling the electric field in the brain and the subsequent neural response to the stimulation. Approach. We used magnetic resonance images to create an image-based computational model of subthalamic DBS. The complexity of the volume conductor model was increased by incrementally including heterogeneity, anisotropy, and dielectric dispersion in the electrical properties of the brain. We quantified changes in the load of the electrode, the electric potential distribution, and stimulation thresholds of descending corticofugal (DCF) axon models. Main results. Incorporation of heterogeneity altered the electric potentials and subsequent stimulation thresholds, but to a lesser degree than incorporation of anisotropy. Additionally, the results were sensitive to the choice of method for defining anisotropy, with stimulation thresholds of DCF axons changing by as much as 190%. Typical approaches for defining anisotropy underestimate the expected load of the stimulation electrode, which led to underestimation of the extent of stimulation. More accurate predictions of the electrode load were achieved with alternative approaches for defining anisotropy. The effects of dielectric dispersion were small compared to the effects of heterogeneity and anisotropy. Significance. The results of this study help delineate the level of detail that is required to accurately model electric fields generated by DBS electrodes.

  17. Knowledge to action for solving complex problems: insights from a review of nine international cases

    PubMed Central

    Riley, B. L.; Robinson, K. L.; Gamble, J.; Finegood, D. T.; Sheppard, D.; Penney, T. L.; Best, A.

    2015-01-01

    Introduction: Solving complex problems such as preventing chronic diseases introduces unique challenges for the creation and application of knowledge, or knowledge to action (KTA). KTA approaches that apply principles of systems thinking are thought to hold promise, but practical strategies for their application are not well understood. In this paper we report the results of a scan of systems approaches to KTA with a goal to identify how to optimize their implementation and impact. Methods: A 5-person advisory group purposefully selected 9 initiatives to achieve diversity on issues addressed and organizational forms. Information on each case was gathered from documents and through telephone interviews with primary contacts within each organization. Following verification of case descriptions, an inductive analysis was conducted within and across cases. Results: The cases revealed 5 guidelines for moving from conceiving KTA systems to implementing them: 1) establish and nurture relationships, 2) co-produce and curate knowledge, 3) create feedback loops, 4) frame as systems interventions rather than projects, and 5) consider variations across time and place. Conclusion: Results from the environmental scan are a modest start to translating systems concepts for KTA into practice. Use of the strategies revealed in the scan may improve KTA for solving complex public health problems. The strategies themselves will benefit from the development of a science that aims to understand adaptation and ongoing learning from policy and practice interventions, strengthens enduring relationships, and fills system gaps in addition to evidence gaps. Systems approaches to KTA will also benefit from robust evaluations. PMID:25970804

  18. Analyzing Katana referral hospital as a complex adaptive system: agents, interactions and adaptation to a changing environment.

    PubMed

    Karemere, Hermès; Ribesse, Nathalie; Marchal, Bruno; Macq, Jean

    2015-01-01

    This study deals with the adaptation of Katana referral hospital in the Eastern Democratic Republic of Congo to a changing environment that has been affected for more than a decade by intermittent armed conflicts. Its objective is to generate theoretical proposals for approaching the analysis of hospital governance differently, with the aim of assessing hospital performance and how to improve it. The methodology is a case study using mixed (qualitative and quantitative) methods for data collection. It uses (1) hospital data to measure the output of the hospital, (2) a literature review to identify, among other things, events and interventions recorded in the history of the hospital during the study period, and (3) information from individual interviews to validate the interpretation of the results of the previous two data sources and to understand the responsiveness of the referral hospital's management team during times of change. The study advances four theoretical propositions: (1) interaction between key agents is a positive force driving adaptation if the actors share the same vision; (2) the strength of the interaction between agents is largely based on the nature of institutional arrangements, which in turn are shaped by the actors themselves; (3) the owner and the management team play a decisive role in implementing effective institutional arrangements and establishing positive interactions between agents; (4) analyzing the recipient population's perception of the health services provided allows the services offered to be better tailored and adapted to the population's needs and expectations. The research shows that providing financial and technical support and managing a hospital are not enough for it to operate and adapt to a changing environment; the hospital must also be animated, considering that it is a complex adaptive system, and this animation is nothing other than the induction of positive interaction between agents.

  19. Quantitatively analyzing metathesis catalyst activity and structural features in silica-supported tungsten imido-alkylidene complexes.

    PubMed

    Mougel, Victor; Santiago, Celine B; Zhizhko, Pavel A; Bess, Elizabeth N; Varga, Jeno; Frater, Georg; Sigman, Matthew S; Copéret, Christophe

    2015-05-27

    A broad series of fully characterized, well-defined silica-supported W metathesis catalysts with the general formula [(≡SiO)W(═NAr)(═CHCMe2R)(X)] (Ar = 2,6-iPr2C6H3 (AriPr), 2,6-Cl2C6H3 (ArCl), 2-CF3C6H4 (ArCF3), and C6F5 (ArF5); X = OC(CF3)3 (OtBuF9), OCMe(CF3)2 (OtBuF6), OtBu, OSi(OtBu)3, 2,5-dimethylpyrrolyl (Me2Pyr); and R = Me or Ph) was prepared by grafting bis-X substituted complexes [W(NAr)(═CHCMe2R)(X)2] on silica partially dehydroxylated at 700 °C (SiO2-(700)), and their activity was evaluated with the goal of obtaining detailed structure-activity relationships. The quantitative influence of the ligand set on the activity (turnover frequency, TOF) in the self-metathesis of cis-4-nonene was investigated using multivariate linear regression analysis tools. The TOF of these catalysts (activity) can be well predicted from simple steric and electronic parameters of the parent protonated ligands; it is described by the mutual contribution of the NBO charge of the nitrogen or the IR intensity of the symmetric N-H stretch of the ArNH2, corresponding to the imido ligand, together with the Sterimol B5 and pKa of HX, representing the X ligand. This quantitative and predictive structure-activity relationship analysis of well-defined heterogeneous catalysts shows that high activity is associated with the combination of X and NAr ligands of opposite electronic character, and it paves the way toward the rational development of metathesis catalysts.
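    As an illustration of the regression workflow only (with invented stand-in numbers, not the paper's measured descriptors), the TOF model amounts to an ordinary multivariate least-squares fit of an activity measure against the four parameters named above:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Stand-in descriptor matrix: columns play the role of [NBO charge of N,
# N-H stretch IR intensity, Sterimol B5 of X, pKa of HX] for 12 catalysts.
X = rng.normal(size=(12, 4))
w_true = np.array([1.5, -0.8, 0.6, -1.1])            # invented weights
log_tof = X @ w_true + rng.normal(scale=0.1, size=12)

model = LinearRegression().fit(X, log_tof)
print(model.coef_)                 # recovered descriptor weights
print(model.score(X, log_tof))     # R^2 of the fit
```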

  20. Communication: Overcoming the root search problem in complex quantum trajectory calculations

    SciTech Connect

    Zamstein, Noa; Tannor, David J.

    2014-01-28

    Three new developments are presented regarding the semiclassical coherent state propagator. First, we present a conceptually different derivation of Huber and Heller's method for identifying complex root trajectories and their equations of motion [D. Huber and E. J. Heller, J. Chem. Phys. 87, 5302 (1987)]. Our method proceeds directly from the time-dependent Schrödinger equation and therefore allows various generalizations of the formalism. Second, we obtain an analytic expression for the semiclassical coherent state propagator. We show that the prefactor can be expressed in a form that requires solving significantly fewer equations of motion than in alternative expressions. Third, the semiclassical coherent state propagator is used to formulate a final value representation of the time-dependent wavefunction that avoids the root search, eliminates problems with caustics and automatically includes interference. We present numerical results for the 1D Morse oscillator showing that the method may become an attractive alternative to existing semiclassical approaches.

  1. Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?

    PubMed

    McDonald, Ruth

    2014-10-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving 'leadership'. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts.

  2. Communication: overcoming the root search problem in complex quantum trajectory calculations.

    PubMed

    Zamstein, Noa; Tannor, David J

    2014-01-28

    Three new developments are presented regarding the semiclassical coherent state propagator. First, we present a conceptually different derivation of Huber and Heller's method for identifying complex root trajectories and their equations of motion [D. Huber and E. J. Heller, J. Chem. Phys. 87, 5302 (1987)]. Our method proceeds directly from the time-dependent Schrödinger equation and therefore allows various generalizations of the formalism. Second, we obtain an analytic expression for the semiclassical coherent state propagator. We show that the prefactor can be expressed in a form that requires solving significantly fewer equations of motion than in alternative expressions. Third, the semiclassical coherent state propagator is used to formulate a final value representation of the time-dependent wavefunction that avoids the root search, eliminates problems with caustics and automatically includes interference. We present numerical results for the 1D Morse oscillator showing that the method may become an attractive alternative to existing semiclassical approaches.

  3. Validation Study of a Method for Assessing Complex Ill-Structured Problem Solving by Using Causal Representations

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ifenthaler, Dirk; Ge, Xun

    2013-01-01

    The important but little understood problem that motivated this study was the lack of research on valid assessment methods to determine progress in higher-order learning in situations involving complex and ill-structured problems. Without a valid assessment method, little progress can occur in instructional design research with regard to designing…

  4. Seeing around a Ball: Complex, Technology-Based Problems in Calculus with Applications in Science and Engineering-Redux

    ERIC Educational Resources Information Center

    Winkel, Brian

    2008-01-01

    A complex technology-based problem in visualization and computation for students in calculus is presented. Strategies are shown for its solution and the opportunities for students to put together sequences of concepts and skills to build for success are highlighted. The problem itself involves placing an object under water in order to actually see…

  5. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  6. An Investigation of the Interrelationships between Motivation, Engagement, and Complex Problem Solving in Game-Based Learning

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Law, Victor; Ifenthaler, Dirk; Ge, Xun; Miller, Raymond

    2014-01-01

    Digital game-based learning, especially massively multiplayer online games, has been touted for its potential to promote student motivation and complex problem-solving competency development. However, current evidence is limited to anecdotal studies. The purpose of this empirical investigation is to examine the complex interplay between…

  7. An immersed boundary computational model for acoustic scattering problems with complex geometries.

    PubMed

    Sun, Xiaofeng; Jiang, Yongsong; Liang, An; Jing, Xiaodong

    2012-11-01

    An immersed boundary computational model is presented in order to deal with acoustic scattering problems involving complex geometries, in which the wall boundary condition is treated as a direct body force determined by satisfying the non-penetration boundary condition. Two distinct grids are used to discretize the fluid domain and the immersed boundary, respectively. The immersed boundaries are represented by Lagrangian points, and the direct body force determined at these points is applied to the neighboring Eulerian points. The coupling between the Lagrangian and Eulerian points is effected by a discrete delta function. The linearized Euler equations are spatially discretized with a fourth-order dispersion-relation-preserving scheme and temporally integrated with a low-dissipation and low-dispersion Runge-Kutta scheme. A perfectly matched layer technique is applied to absorb outgoing waves and waves entering the immersed bodies. Several benchmark problems for computational aeroacoustics solvers are performed to validate the present method.
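    The coupling step is easy to sketch. A minimal 1D illustration of spreading Lagrangian forces onto an Eulerian grid through a regularized delta function follows; the particular 4-point kernel is a common choice (Peskin's), assumed here since the abstract does not specify one.

```python
import numpy as np

def delta4(r, h):
    """Peskin's 4-point regularized delta function (one common choice;
    the paper does not state which discrete delta it uses)."""
    r = np.abs(r) / h
    inner = (3 - 2*r + np.sqrt(np.maximum(1 + 4*r - 4*r**2, 0.0))) / 8
    outer = (5 - 2*r - np.sqrt(np.maximum(-7 + 12*r - 4*r**2, 0.0))) / 8
    return np.where(r < 1, inner, np.where(r < 2, outer, 0.0)) / h

def spread_force(F, X_lag, x_grid, h):
    """Spread Lagrangian point forces F at positions X_lag onto a 1D
    Eulerian grid (2D/3D versions use tensor products of delta4; boundary
    quadrature weights are omitted in this sketch)."""
    f = np.zeros_like(x_grid)
    for Fk, Xk in zip(F, X_lag):
        f += Fk * delta4(x_grid - Xk, h)
    return f

h = 0.1
x = np.arange(0.0, 2.0, h)
print(spread_force([1.0], [1.0], x, h).sum() * h)   # ~1.0: force is conserved
```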

  8. Enhancements of evolutionary algorithm for the complex requirements of a nurse scheduling problem

    NASA Astrophysics Data System (ADS)

    Tein, Lim Huai; Ramli, Razamin

    2014-12-01

    Over the years, nurse scheduling has been a noticeable problem, aggravated by the global nurse turnover crisis: the more dissatisfied nurses are with their working environment, the more likely they are to leave. Current undesirable work schedules are partly responsible for that working condition. Basically, there is a lack of complementarity between the head nurse's responsibilities and the nurses' needs. In particular, given the weight of nurse preferences, a sophisticated challenge of nurse scheduling is the failure to encourage tolerant behavior between both parties during shift assignment in real working scenarios. Inevitably, flexibility in shift assignment is hard to achieve when diverse nurse requests must be satisfied while upholding mandatory ward coverage. Hence, an Evolutionary Algorithm (EA) is proposed to cater for this complexity in a nurse scheduling problem (NSP). The restrictions of the EA are discussed and enhancements to the EA operators are suggested so that the EA has the characteristics of a flexible search. This paper considers three types of constraints, namely hard, semi-hard and soft constraints, which can be handled by the EA with enhanced parent selection and specialized mutation operators. These operators, and the EA as a whole, contribute to the efficiency of constraint handling and fitness computation as well as flexibility in the search, corresponding to the employment of exploration and exploitation principles.
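    As a toy illustration of what a "specialized" mutation operator can look like (the roster encoding and coverage rule below are invented for the sketch; the paper's operators are richer), mutation can be directed at the days that violate ward coverage instead of perturbing genes uniformly at random:

```python
import random

# Roster encoding (invented): schedule[nurse][day] in {'D','E','N','O'}
# for day, evening, night, and off.

def uncovered_days(schedule, required=2):
    # Hard constraint (invented for the sketch): at least `required`
    # nurses on the day shift 'D' every day.
    n_days = len(schedule[0])
    return [d for d in range(n_days)
            if sum(row[d] == 'D' for row in schedule) < required]

def specialized_mutation(schedule):
    # Repair-style mutation: only violating days are touched, adding one
    # day-shift assignment per uncovered day.
    for d in uncovered_days(schedule):
        candidates = [i for i, row in enumerate(schedule) if row[d] != 'D']
        if candidates:
            schedule[random.choice(candidates)][d] = 'D'
    return schedule

roster = [list('DOEN'), list('OOND'), list('ENOD')]
print(uncovered_days(roster))        # days violating coverage before mutation
print(specialized_mutation(roster))  # roster after one repair pass
```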

  9. Fibromyalgia and disability adjudication: No simple solutions to a complex problem

    PubMed Central

    Harth, Manfred; Nielson, Warren R

    2014-01-01

    BACKGROUND: Adjudication of disability claims related to fibromyalgia (FM) syndrome can be a challenging and complex process. A commentary published in the current issue of Pain Research & Management makes suggestions for improvement. The authors of the commentary contend that: previously and currently used criteria for the diagnosis of FM are irrelevant to clinical practice; the opinions of family physicians should supersede those of experts; there is little evidence that trauma can cause FM; no formal instruments are necessary to assess disability; and many FM patients on or applying for disability are exaggerating or malingering, and tests of symptom validity should be used to identify malingerers. OBJECTIVES: To assess the assertions made by Fitzcharles et al. METHODS: A narrative review of the available research literature was performed. RESULTS: Available diagnostic criteria should be used in a medicolegal context; family physicians are frequently uncertain about FM and/or biased; there is considerable evidence that trauma can be a cause of FM; it is essential to use validated instruments to assess functional impairment; and the available tests of physical effort and symptom validity are of uncertain value in identifying malingering in FM. CONCLUSIONS: The available evidence does not support many of the suggestions presented in the commentary. Caution is advised in adopting simple solutions for disability adjudication in FM because they are generally incompatible with the inherently complex nature of the problem. PMID:25479149

  10. The Problem with Word Problems: Solving Word Problems in Math Requires a Complex Web of Skills. But There's No Reason Why it Can't Be Fun

    ERIC Educational Resources Information Center

    Forsten, Char

    2004-01-01

    Children need to combine reading, thinking, and computational skills to solve math word problems. The author provides some strategies that principals can share with their teachers to help students become proficient and advanced problem-solvers. They include creating a conducive classroom environment, providing daily mental math activities, making…

  11. Decision Analysis for Environmental Problems

    EPA Science Inventory

    Environmental management problems are often complex and uncertain. A formal process with proper guidance is needed to understand the issues, identify sources of disagreement, and analyze the major uncertainties in environmental problems. This course will present a process that fo...

  12. Exploring Corn-Ethanol As A Complex Problem To Teach Sustainability Concepts Across The Science-Business-Liberal Arts Curriculum

    NASA Astrophysics Data System (ADS)

    Oches, E. A.; Szymanski, D. W.; Snyder, B.; Gulati, G. J.; Davis, P. T.

    2012-12-01

    The highly interdisciplinary nature of sustainability presents pedagogic challenges when sustainability concepts are incorporated into traditional disciplinary courses. At Bentley University, where over 90 percent of students major in business disciplines, we have created a multidisciplinary course module centered on corn ethanol that explores a complex social, environmental, and economic problem and develops basic data analysis and analytical thinking skills in several courses spanning the natural, physical, and social sciences within the business curriculum. Through an NSF-CCLI grant, Bentley faculty from several disciplines participated in a summer workshop to define learning objectives, create course modules, and develop an assessment plan to enhance interdisciplinary sustainability teaching. The core instructional outcome was a data-rich exercise for all participating courses in which students plot and analyze multiple parameters of corn planted and harvested for various purposes including food (human), feed (animal), ethanol production, and commodities exchanged for the years 1960 to present. Students then evaluate patterns and trends in the data and hypothesize relationships among the plotted data and environmental, social, and economic drivers, responses, and unintended consequences. After the central data analysis activity, students explore corn ethanol production as it relates to core disciplinary concepts in their individual classes. For example, students in Environmental Chemistry produce ethanol using corn and sugar as feedstocks and compare the efficiency of each process, while learning about enzymes, fermentation, distillation, and other chemical principles. Principles of Geology students examine the effects of agricultural runoff on surface water quality associated with extracting greater agricultural yield from mid-continent croplands. The American Government course examines the role of political institutions, the political process, and various

  13. Subspace Iteration Method for Complex Eigenvalue Problems with Nonsymmetric Matrices in Aeroelastic System

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Lung, Shu

    2009-01-01

    Modern airplane design is a multidisciplinary task which combines several disciplines, such as structures, aerodynamics, flight controls, and sometimes heat transfer. Historically, analytical and experimental investigations concerning the interaction of the elastic airframe with aerodynamic and inertia loads have been conducted during the design phase to determine the existence of aeroelastic instabilities, so-called flutter. With the advent and increased usage of flight control systems, there is also a likelihood of instabilities caused by the interaction of the flight control system and the aeroelastic response of the airplane, known as aeroservoelastic instabilities. An in-house code, MPASES (Ref. 1), modified from PASES (Ref. 2), is a general-purpose digital computer program for the analysis of the closed-loop stability problem. This program used subroutines given in the International Mathematical and Statistical Library (IMSL) (Ref. 3) to compute all of the real and/or complex conjugate pairs of eigenvalues of the Hessenberg matrix. For high-fidelity configurations, these aeroelastic system matrices are large, and computing all eigenvalues is time-consuming. A subspace iteration method (Ref. 4) for complex eigenvalue problems with nonsymmetric matrices has been formulated and incorporated into the modified program for aeroservoelastic stability (the MPASES code). The subspace iteration method solves for only the lowest p eigenvalues and corresponding eigenvectors for aeroelastic and aeroservoelastic analysis. In general, p ranges from 10 for wing flutter analysis to 50 for the flutter analysis of an entire aircraft. The application of this newly incorporated code is an experiment known as the Aerostructures Test Wing (ATW), which was designed by the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center, Edwards, California, to research aeroelastic instabilities. Specifically, this experiment was used to study an instability
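    The method class is easy to sketch: subspace (orthogonal) iteration on A^{-1} converges to the invariant subspace of the lowest-magnitude eigenvalues, and a Rayleigh-Ritz projection extracts them. The following NumPy/SciPy sketch illustrates the idea, not the MPASES implementation:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, qr, eig

def smallest_eigs(A, p, iters=200, seed=0):
    """Approximate the p smallest-magnitude eigenvalues of a (possibly
    nonsymmetric) real matrix A via subspace iteration on A^{-1}."""
    n = A.shape[0]
    Q = qr(np.random.default_rng(seed).normal(size=(n, p)),
           mode='economic')[0]
    lu = lu_factor(A)                     # factor once, reuse every sweep
    for _ in range(iters):
        Z = lu_solve(lu, Q)               # apply A^{-1} to the basis
        Q = qr(Z, mode='economic')[0]     # re-orthonormalize the subspace
    T = Q.T @ A @ Q                       # Rayleigh-Ritz projection
    return np.sort_complex(eig(T)[0])     # complex eigenvalues appear in pairs

A = (np.diag(np.arange(1.0, 101.0))
     + 0.01 * np.random.default_rng(1).normal(size=(100, 100)))
print(smallest_eigs(A, 5))                # ~ the 5 eigenvalues nearest zero
```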

  14. Computational issues in complex water-energy optimization problems: Time scales, parameterizations, objectives and algorithms

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step, and solve each local sub-problem through very fast linear network programming algorithms, and (c) the substantial
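    A toy version of point (b), with invented numbers: each simulation step reduces to a small linear program (the framework uses specialized linear network programming solvers; SciPy's general-purpose linprog stands in here).

```python
from scipy.optimize import linprog

# One time step: split a reservoir release between hydropower (x1) and
# irrigation (x2), maximizing benefit subject to storage and demand.
c = [-2.0, -1.0]                 # negate benefits: maximize 2*x1 + 1*x2
A_ub = [[1.0, 1.0]]              # x1 + x2 <= water available this step
b_ub = [100.0]
bounds = [(0.0, 80.0),           # turbine capacity limit
          (20.0, None)]          # minimum irrigation demand
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, -res.fun)           # optimal allocation and total benefit
```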

  15. Can fuzzy logic bring complex problems into focus? Modeling imprecise factors in environmental policy

    SciTech Connect

    McKone, Thomas E.; Deshpande, Ashok W.

    2004-06-14

    In modeling complex environmental problems, we often fail to make precise statements about inputs and outcomes. In such cases the fuzzy logic method, native to the human mind, provides a useful way to get at these problems. Fuzzy logic represents a significant change in both the approach to and outcome of environmental evaluations. Risk assessment is currently based on the implicit premise that probability theory provides the necessary and sufficient tools for dealing with uncertainty and variability. The key advantage of fuzzy methods is the way they reflect the human mind in its remarkable ability to store and process information which is consistently imprecise, uncertain, and resistant to classification. Our case study illustrates the ability of fuzzy logic to integrate statistical measurements with imprecise health goals. But we submit that fuzzy logic and probability theory are complementary and not competitive. In the world of soft computing, fuzzy logic has been widely used and has often been the "smart" behind smart machines. But it will require more effort and more case studies to establish its niche in risk assessment or other types of impact assessment. Although we often hear complaints about "bright lines," could we adapt to a system that relaxes these lines to fuzzy gradations? Would decision makers and the public accept expressions of water or air quality goals in linguistic terms with computed degrees of certainty? Resistance is likely. In many regions, such as the US and the European Union, it is likely that both decision makers and members of the public are more comfortable with our current system, in which government agencies avoid confronting uncertainties by setting guidelines that are crisp and often fail to communicate uncertainty. But some day perhaps a more comprehensive approach that includes exposure surveys, toxicological data, and epidemiological studies coupled with fuzzy modeling will go a long way in resolving some of the conflict, divisiveness
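    A one-function illustration of the contrast with "bright lines" (thresholds invented for the example): a fuzzy water-quality goal grades acceptability continuously instead of jumping from acceptable to unacceptable at a single value.

```python
def acceptability(conc, lo=5.0, hi=10.0):
    """Fuzzy membership of a contaminant concentration in the linguistic
    class 'acceptable': 1 below lo, 0 above hi, linear in between.
    A crisp bright line would instead jump from 1 to 0 at one value."""
    if conc <= lo:
        return 1.0
    if conc >= hi:
        return 0.0
    return (hi - conc) / (hi - lo)

print([acceptability(c) for c in (4.0, 6.0, 9.0, 12.0)])  # 1.0, 0.8, 0.2, 0.0
```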

  16. Accurate gradient approximation for complex interface problems in 3D by an improved coupling interface method

    SciTech Connect

    Shu, Yu-Chen; Chern, I-Liang; Chang, Chien C.

    2014-10-15

    Most elliptic interface solvers become complicated for complex interface problems at those “exceptional points” where there are not enough neighboring interior points for high-order interpolation. Such complications increase especially in three dimensions. Usually, the solvers are thus reduced to low-order accuracy. In this paper, we classify these exceptional points and propose two recipes to maintain the order of accuracy there, aiming at improving the previous coupling interface method [26]. Yet the idea is also applicable to other interface solvers. The main idea is to have at least first-order approximations of the second-order derivatives at those exceptional points. Recipe 1 is to use the finite difference approximation for the second-order derivatives at a nearby interior grid point, whenever this is possible. Recipe 2 is to flip domain signatures and introduce a ghost state so that a second-order method can be applied. This ghost state is a smooth extension of the solution at the exceptional point from the other side of the interface. The original state is recovered by post-processing using nearby states and jump conditions. The choice of recipe is determined by a classification scheme of the exceptional points. The method renders the solution and its gradient uniformly second-order accurate in the entire computed domain. Numerical examples are provided to illustrate the second-order accuracy of the presently proposed method in approximating the gradients of the original states for some complex interfaces which we had tested previously in two and three dimensions, and a real molecule (1D63), which is double-helix shaped and composed of hundreds of atoms.

  17. Speed and Complexity Characterize Attention Problems in Children with Localization-Related Epilepsy

    PubMed Central

    Berl, Madison; Terwilliger, Virginia; Scheller, Alexandra; Sepeta, Leigh; Walkowiak, Jenifer; Gaillard, William D.

    2015-01-01

    Objective: Children with epilepsy (EPI) have a higher rate of ADHD (28–70%) than typically developing (TD) children (5–10%); however, attention is multidimensional. Thus, we aimed to characterize the profile of attention difficulties in children with epilepsy. Methods: Seventy-five children with localization-related epilepsy ages 6–16 and 75 age-matched controls were evaluated using multimodal, multidimensional measures of attention, including direct performance and parent ratings of attention, as well as intelligence testing. We assessed group differences across attention measures, determined whether parent rating predicted performance on attention measures, and examined whether epilepsy characteristics were associated with attention skills. Results: The EPI group performed worse than the TD group on timed and complex aspects of attention (p<.05), while performance on simple visual and simple auditory attention tasks was comparable. Children with EPI were 12 times as likely as TD children to have clinically elevated symptoms of inattention as rated by parents, but ratings were a weak predictor of attention performance. Earlier age of onset was associated with slower motor speed (p<.01), but no other epilepsy-related clinical characteristics were associated with attention skills. Significance: This study clarifies the nature of the attention problems in pediatric epilepsy, which may be underrecognized. Children with EPI had difficulty with complex attention and rapid response, not simple attention. As such, they may not exhibit difficulty until later in primary school when demands increase. Parent report with standard ADHD screening tools may underdetect these higher-order attention difficulties. Thus, monitoring through direct neuropsychological performance is recommended. PMID:25940056

  18. Managing the Complexity of Design Problems through Studio-Based Learning

    ERIC Educational Resources Information Center

    Cennamo, Katherine; Brandt, Carol; Scott, Brigitte; Douglas, Sarah; McGrath, Margarita; Reimer, Yolanda; Vernon, Mitzi

    2011-01-01

    The ill-structured nature of design problems makes them particularly challenging for problem-based learning. Studio-based learning (SBL), however, has much in common with problem-based learning and indeed has a long history of use in teaching students to solve design problems. The purpose of this ethnographic study of an industrial design class,…

  19. Leadership and leadership development in healthcare settings – a simplistic solution to complex problems?

    PubMed Central

    McDonald, Ruth

    2014-01-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving ‘leadership’. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts. PMID:25337595

  20. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day before. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them as the results of global warming

  1. Boundary value problems for a class of planar complex vector fields

    NASA Astrophysics Data System (ADS)

    Campana, C.; Meziani, A.

    2016-11-01

    This paper deals with a Riemann-Hilbert problem and a Riemann problem for a class of planar elliptic vector fields with degeneracies. Existence of Hölder continuous solutions is established when the associated index is nonnegative.

  2. A framework to approach problems of forensic anthropology using complex networks

    NASA Astrophysics Data System (ADS)

    Caridi, Inés; Dorso, Claudio O.; Gallo, Pablo; Somigliana, Carlos

    2011-05-01

    We have developed a method to analyze and interpret emerging structures in a set of data which lacks some information. It was conceived to be applied to the problem of obtaining information about people who disappeared in the Argentine province of Tucumán from 1974 to 1981. Although the military dictatorship in Argentina formally began in 1976 and lasted until 1983, the disappearance and assassination of people began some months earlier. During this period several circuits of Illegal Detention Centres (IDC) were set up in different locations all over the country. In these secret centres, disappeared people were illegally held without any sort of constitutional guarantees, and later assassinated. Even today, the final destination of most of the disappeared people's remains is still unknown. The fundamental hypothesis in this work is that a group of people with the same political affiliation whose disappearances were closely related in time and space shared the same place of captivity (the same IDC or circuit of IDCs). This hypothesis makes sense when applied to the systematic method of repression and disappearances which was actually launched in Tucumán, Argentina (2007) [11]. In this work, the missing individuals are identified as nodes of a network and connections are established among them based on the individuals' attributes while they were alive, using rules to link them. In order to determine which rules are the most effective in defining the network, we use other kinds of knowledge available in this problem: previous results from the anthropological point of view (based on other sources of information, both oral and written, historical and anthropological data, etc.), and information about the place (one or more IDCs) where some people were kept during their captivity. For the best rules, a prediction about these people's possible destination is assigned (one or more IDCs where they could have been kept), and the success of the
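    A hypothetical linking rule in the spirit of the method can be sketched with networkx; the field names, thresholds, and records below are invented for illustration.

```python
import networkx as nx

# Invented rule: connect two missing persons if they shared a political
# affiliation and disappeared within 30 days and 50 km of each other.
records = [
    {"id": 1, "group": "A", "day": 10, "x": 0.0, "y": 0.0},
    {"id": 2, "group": "A", "day": 25, "x": 10.0, "y": 5.0},
    {"id": 3, "group": "B", "day": 12, "x": 200.0, "y": 90.0},
]

G = nx.Graph()
G.add_nodes_from(r["id"] for r in records)
for i, a in enumerate(records):
    for b in records[i + 1:]:
        close_in_time = abs(a["day"] - b["day"]) <= 30
        close_in_space = ((a["x"] - b["x"])**2 + (a["y"] - b["y"])**2)**0.5 <= 50
        if a["group"] == b["group"] and close_in_time and close_in_space:
            G.add_edge(a["id"], b["id"])

print(list(nx.connected_components(G)))  # candidate shared-captivity groups
```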

  3. Untangling the Complex Needs of People Experiencing Gambling Problems and Homelessness

    ERIC Educational Resources Information Center

    Holdsworth, Louise; Tiyce, Margaret

    2013-01-01

    People with gambling problems are now recognised among those at increased risk of homelessness, and the link between housing and gambling problems has been identified as an area requiring further research. This paper discusses the findings of a qualitative study that explored the relationship between gambling problems and homelessness. Interviews…

  4. Generalist solutions to complex problems: generating practice-based evidence - the example of managing multi-morbidity

    PubMed Central

    2013-01-01

    Background A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Discussion Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a ‘complex intervention’ (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Summary Answers to the complex problem of multi-morbidity won’t come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity. PMID:23919296

  5. Learning by Preparing to Teach: Fostering Self-Regulatory Processes and Achievement during Complex Mathematics Problem Solving

    ERIC Educational Resources Information Center

    Muis, Krista R.; Psaradellis, Cynthia; Chevrier, Marianne; Di Leo, Ivana; Lajoie, Susanne P.

    2016-01-01

    We developed an intervention based on the learning by teaching paradigm to foster self-regulatory processes and better learning outcomes during complex mathematics problem solving in a technology-rich learning environment. Seventy-eight elementary students were randomly assigned to 1 of 2 conditions: learning by preparing to teach, or learning for…

  6. Social and Ethical Dimension of the Natural Sciences, Complex Problems of the Age, Interdisciplinarity, and the Contribution of Education

    ERIC Educational Resources Information Center

    Develaki, Maria

    2008-01-01

    In view of the complex problems of this age, the question of the socio-ethical dimension of science acquires particular importance. We approach this matter from a philosophical and sociological standpoint, looking at such focal concerns as the motivation, purposes and methods of scientific activity, the ambivalence of scientific research and the…

  7. The Effect of Multiple Scaffolding Tools on Students' Understanding, Consideration of Different Perspectives, and Misconceptions of a Complex Problem

    ERIC Educational Resources Information Center

    Zydney, Janet Mannheimer

    2010-01-01

    This study investigated the effectiveness of multiple scaffolding tools in helping students understand a complex problem. In order to support students with this task, a multimedia learning environment was developed based on the cognitive flexibility theory (CFT) and scaffolding through computer-based tools. Seventy-nine 10th-grade students in an…

  8. Does Visualization Enhance Complex Problem Solving? The Effect of Causal Mapping on Performance in the Computer-Based Microworld Tailorshop

    ERIC Educational Resources Information Center

    Öllinger, Michael; Hammon, Stephanie; von Grundherr, Michael; Funke, Joachim

    2015-01-01

    Causal mapping is often recognized as a technique to support strategic decisions and actions in complex problem situations. Such drawing of causal structures is supposed to particularly foster the understanding of the interaction of the various system elements and to further encourage holistic thinking. It builds on the idea that humans make use…

  9. Linking Complex Problem Solving and General Mental Ability to Career Advancement: Does a Transversal Skill Reveal Incremental Predictive Validity?

    ERIC Educational Resources Information Center

    Mainert, Jakob; Kretzschmar, André; Neubert, Jonas C.; Greiff, Samuel

    2015-01-01

    Transversal skills, such as complex problem solving (CPS) are viewed as central twenty-first-century skills. Recent empirical findings have already supported the importance of CPS for early academic advancement. We wanted to determine whether CPS could also contribute to the understanding of career advancement later in life. Towards this end, we…

  10. Validity of the MicroDYN Approach: Complex Problem Solving Predicts School Grades beyond Working Memory Capacity

    ERIC Educational Resources Information Center

    Schweizer, Fabian; Wustenberg, Sascha; Greiff, Samuel

    2013-01-01

    This study examines the validity of the complex problem solving (CPS) test MicroDYN by investigating a) the relation between its dimensions--rule identification (exploration strategy), rule knowledge (acquired knowledge), rule application (control performance)--and working memory capacity (WMC), and b) whether CPS predicts school grades in…

  11. Evoked potential correlates of intelligence: some problems with Hendrickson's string measure of evoked potential complexity and error theory of intelligence.

    PubMed

    Vetterli, C F; Furedy, J J

    1985-07-01

    The string measure of evoked potential (EP) complexity is based on a new error theory of intelligence, which differs from older, speed-based formulations that focus on EP latency rather than complexity. In this note we first raise a methodological problem of arbitrariness with respect to one version of the string measure. We then provide a comparative empirical assessment of EP-IQ correlations with respect to a revised string measure (which does not suffer from the methodological problem), a latency measure, and another measure of EP complexity: average voltage. This assessment indicates that the string measure, in particular, yields quite disorderly results and that, in general, the results favor the speed-based formulation over the error-based one.

  12. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information. Part 1—Methodology

    NASA Astrophysics Data System (ADS)

    Mejer Hansen, Thomas; Skou Cordua, Knud; Caroline Looms, Majken; Mosegaard, Klaus

    2013-03-01

    From a probabilistic point of view, the solution to an inverse problem can be seen as a combination of independent states of information quantified by probability density functions. Typically, these states of information are provided by a set of observed data and some a priori information on the solution. The combined state of information (i.e. the solution to the inverse problem) is a probability density function typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods for solving such probabilistically formulated inverse problems by sampling the a posteriori probability density function. In order to describe the a priori probability density function, we consider both simple Gaussian models and more complex (and realistic) a priori models based on higher order statistics. These a priori models can be used with both linear and non-linear inverse problems. For linear inverse Gaussian problems we make use of least-squares and kriging-based methods to describe the a posteriori probability density function directly. For general non-linear (i.e. non-Gaussian) inverse problems, we make use of the extended Metropolis algorithm to sample the a posteriori probability density function. Together with the extended Metropolis algorithm, we use sequential Gibbs sampling, which allows computationally efficient sampling of complex a priori models. The toolbox can be applied to any inverse problem as long as a way of solving the forward problem is provided. Here we demonstrate the methods and algorithms available in SIPPI. An application of SIPPI to a tomographic cross-borehole inverse problem is presented in the second part of this paper.
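    To make the sampling idea concrete, the sketch below runs a plain random-walk Metropolis sampler on a toy two-parameter linear inverse problem in Python. It illustrates the general posterior-sampling approach the abstract describes; it is not SIPPI itself (which is Matlab/Octave), and the extended Metropolis and sequential Gibbs machinery are not reproduced. The forward operator, noise level, and prior are all invented for illustration.

    ```python
    import numpy as np

    # Toy posterior sampling for a linear inverse problem d = G m + noise,
    # in the spirit of the abstract's Metropolis-based approach
    # (hypothetical problem setup; not the SIPPI toolbox).
    rng = np.random.default_rng(0)

    G = np.array([[1.0, 0.5], [0.3, 2.0]])        # assumed forward operator
    m_true = np.array([1.0, -0.5])
    d_obs = G @ m_true + rng.normal(0, 0.1, 2)    # synthetic observed data

    def log_posterior(m, sigma=0.1):
        r = d_obs - G @ m
        log_like = -0.5 * np.sum(r**2) / sigma**2   # Gaussian data misfit
        log_prior = -0.5 * np.sum(m**2)             # standard-normal prior
        return log_prior + log_like

    m = np.zeros(2)
    samples = []
    for _ in range(20000):
        proposal = m + rng.normal(0, 0.1, size=2)   # random-walk proposal
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(m):
            m = proposal                            # accept; otherwise keep m
        samples.append(m)

    samples = np.array(samples)
    print("posterior mean:", samples[5000:].mean(axis=0))  # discard burn-in
    ```

    As the chain converges, the posterior mean should land near m_true; swapping in a non-Gaussian prior or a non-linear forward model only changes log_posterior, which is the generality the abstract emphasizes.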

  13. DIFFERENTIAL ANALYZER

    DOEpatents

    Sorensen, E.G.; Gordon, C.M.

    1959-02-10

    Improvements in analog computing machines of the class capable of evaluating differential equations, commonly termed differential analyzers, are described. In general form, the analyzer embodies a plurality of basic computer mechanisms for performing integration, multiplication, and addition, and means for directing the result of any one operation to another computer mechanism performing a further operation. In the device, numerical quantities are represented by the rotation of shafts, or the electrical equivalent of shafts.

  14. Complex Problem Solving in Radiologic Technology: Understanding the Roles of Experience, Reflective Judgment, and Workplace Culture

    ERIC Educational Resources Information Center

    Yates, Jennifer L.

    2011-01-01

    The purpose of this research study was to explore the process of learning and development of problem solving skills in radiologic technologists. The researcher sought to understand the nature of difficult problems encountered in clinical practice, to identify specific learning practices leading to the development of professional expertise, and to…

  15. Introducing the Hero Complex and the Mythic Iconic Pathway of Problem Gambling

    ERIC Educational Resources Information Center

    Nixon, Gary; Solowoniuk, Jason

    2009-01-01

    Early research into the motivations behind problem gambling reflected separate paradigms of thought, splitting our understanding of the gambler into divergent categories. Over the past 25 years, however, problem gambling has come to be understood as arising from biological, environmental, social, and psychological processes, and is now encapsulated…

  16. The Role of Prior Knowledge and Problem Contexts in Students' Explanations of Complex System

    ERIC Educational Resources Information Center

    Barth-Cohen, Lauren April

    2012-01-01

    The purpose of this dissertation is to study students' competencies in generating scientific explanations within the domain of complex systems, an interdisciplinary area in which students tend to have difficulties. While considering students' developing explanations of how complex systems work, I investigate the role of prior knowledge…

  17. Gas Analyzer

    NASA Astrophysics Data System (ADS)

    1989-01-01

    The M200 originated in the 1970's under an Ames Research Center/Stanford University contract to develop a small, lightweight gas analyzer for Viking Landers. Although the unit was not used on the spacecraft, it was further developed by the National Institute for Occupational Safety and Health (NIOSH). Three researchers from the project later formed Microsensor Technology, Inc. (MTI) to commercialize the analyzer. The original version (Micromonitor 500) was introduced in 1982, and the M200 in 1988. The M200, a more advanced version, features dual gas chromatographs, which separate a gaseous mixture into components and measure the concentration of each gas. It is useful for monitoring gas leaks, chemical spills, etc. Many analyses are completed in less than 30 seconds, and a wide range of mixtures can be analyzed.

  18. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data are then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.

  19. Blood Analyzer

    NASA Technical Reports Server (NTRS)

    1992-01-01

    In the 1970's, NASA provided funding for development of an automatic blood analyzer for Skylab at the Oak Ridge National Laboratory (ORNL). ORNL devised "dynamic loading," which employed a spinning rotor to load, transfer, and analyze blood samples by centrifugal processing. A refined, commercial version of the system was produced by ABAXIS and is marketed as portable ABAXIS MiniLab MCA. Used in a doctor's office, the equipment can perform 80 to 100 chemical blood tests on a single drop of blood and report results in five minutes. Further development is anticipated.

  20. Contextual approach to technology assessment: Implications for one-factor fix solutions to complex social problems

    NASA Technical Reports Server (NTRS)

    Mayo, L. H.

    1975-01-01

    The contextual approach is discussed, which undertakes to demonstrate that technology assessment assists in the identification of the full range of implications of taking a particular action and facilitates the consideration of alternative means by which the total affected social problem context might be changed by available project options. It is found that the social impacts of an application on participants, institutions, processes, and social interests, and the accompanying interactions, may not only induce modifications in the problem context delineated for examination with respect to the design, operations, regulation, and use of the posited application, but may also affect related social problem contexts.

  1. Synthesis of Complex Natural Products as a Vehicle for Student-Centered, Problem-Based Learning

    NASA Astrophysics Data System (ADS)

    Cannon, Kevin J.; Krow, Grant R.

    1998-10-01

    Management strategies for upper-level undergraduate and graduate courses in organic synthesis at Temple University are described, and both student and faculty responsibilities are discussed. Using natural product synthesis as a vehicle, students choose a synthetic problem from the literature, identify the knowledge needed to solve the problem, explore resources for attaining that knowledge, identify the goals and criteria for a successful synthetic plan, and create and do assessments of their work. The method is an example of teacher-guided, student-directed, interdependent, small-group, problem-based learning.

  2. Defibrillator analyzers.

    PubMed

    1999-12-01

    Defibrillator analyzers automate the inspection and preventive maintenance (IPM) testing of defibrillators. They need to be able to test at least four basic defibrillator performance characteristics: discharge energy, synchronized-mode operation, automated external defibrillation, and ECG monitoring. We prefer that they also be able to test a defibrillator's external noninvasive pacing function--but this is not essential if a facility already has a pacemaker analyzer that can perform this testing. In this Evaluation, we tested seven defibrillator analyzers from six suppliers. All seven units accurately measure the energies of a variety of discharge waveforms over a wide range of energy levels--from 1 J for use in a neonatal intensive care unit to 360 J for use on adult patients requiring maximum discharge energy. Most of the analyzers are easy to use. However, only three of the evaluated units could perform the full range of defibrillator tests that we prefer. We rated these units Acceptable--Preferred. Three more units could perform four of the five tests; however, they could not test the pacing feature of a defibrillator. These units were rated Acceptable. The seventh unit could perform only discharge energy testing and synchronized-mode testing and was difficult to use. We rated that unit Acceptable--Not Recommended. PMID:10604089

  3. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting dielectric sensor to determine the nutrient level of, and analyze, plant nutrient solutions in the CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long-duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to changes in the dielectric properties of materials, detecting changes at levels far below those accessible to conventional sensors. The analyzer has exceptional capabilities for predicting the composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications such as agricultural products, petrochemicals, and food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.

  4. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  5. Oxygen analyzer

    DOEpatents

    Benner, William H.

    1986-01-01

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  6. MULTICHANNEL ANALYZER

    DOEpatents

    Kelley, G.G.

    1959-11-10

    A multichannel pulse analyzer having several window amplifiers, each amplifier serving one group of channels, with a single fast pulse-lengthener and a single novel interrogation circuit serving all channels is described. A pulse followed too closely timewise by another pulse is disregarded by the interrogation circuit to prevent errors due to pulse pileup. The window amplifiers are connected to the pulse lengthener output, rather than the linear amplifier output, so need not have the fast response characteristic formerly required.

  7. The Research of Solution to the Problems of Complex Task Scheduling Based on Self-adaptive Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhu, Li; He, Yongxiang; Xue, Haidong; Chen, Leichen

    Traditional genetic algorithms (GAs) display a tendency toward premature convergence when dealing with scheduling problems. To adapt the crossover and mutation operators self-adaptively, this paper proposes a self-adaptive GA aimed at multitask scheduling optimization under limited resources. The experimental results show that the proposed algorithm outperforms the traditional GA in its evolutionary ability to deal with complex task scheduling optimization.
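    The abstract does not spell out the paper's adaptation rule, so the sketch below uses one common self-adaptive scheme: each individual carries its own mutation rate, which is inherited and perturbed along with the solution. The task model (assigning tasks to machines to minimize makespan) and all parameters are stand-ins for illustration, not the cited algorithm.

    ```python
    import random

    # Toy self-adaptive GA for a scheduling-like objective: minimize the
    # makespan of 30 tasks assigned to 4 machines. Each individual is a pair
    # (assignment, mutation_rate); the rate itself evolves, a common
    # self-adaptive scheme (assumed here, not taken from the cited paper).
    TASKS = [random.randint(1, 10) for _ in range(30)]   # task durations
    MACHINES = 4

    def makespan(assign):                 # lower is better
        loads = [0] * MACHINES
        for dur, m in zip(TASKS, assign):
            loads[m] += dur
        return max(loads)

    def mutate(ind):
        assign, rate = ind
        rate = min(0.5, max(0.01, rate * random.choice([0.8, 1.25])))  # self-adapt
        assign = [random.randrange(MACHINES) if random.random() < rate else g
                  for g in assign]
        return (assign, rate)

    def crossover(a, b):                  # one-point crossover; average rates
        cut = random.randrange(1, len(TASKS))
        return (a[0][:cut] + b[0][cut:], (a[1] + b[1]) / 2)

    pop = [([random.randrange(MACHINES) for _ in TASKS], 0.1) for _ in range(50)]
    for _ in range(200):
        pop.sort(key=lambda ind: makespan(ind[0]))
        parents = pop[:25]                # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(25)]
        pop = parents + children

    pop.sort(key=lambda ind: makespan(ind[0]))
    print("best makespan:", makespan(pop[0][0]))
    ```

    Because badly tuned rates tend to produce weak offspring, selection pressure tunes the rates themselves, which is the usual argument for why self-adaptation resists the premature convergence the abstract attributes to traditional GAs.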

  8. Dealing with wicked problems: conducting a causal layered analysis of complex social psychological issues.

    PubMed

    Bishop, Brian J; Dzidic, Peta L

    2014-03-01

    Causal layered analysis (CLA) is an emerging qualitative methodology adopted in the discipline of planning as an approach to deconstruct complex social issues. With psychologists increasingly confronted with complex and "wicked" social and community issues, we argue that the discipline of psychology would benefit from adopting CLA as an analytical method. Until now, the application of CLA for data interpretation has generally been poorly defined and overwhelming for the novice. In this paper we propose an approach to CLA that provides a method for the deconstruction and analysis of complex social psychological issues. We introduce CLA as a qualitative methodology well suited for psychology, introduce the epistemological foundations of CLA, define a space for its adoption within the discipline, and outline the steps for conducting a CLA using an applied example.

  9. Solving Hard Computational Problems Efficiently: Asymptotic Parametric Complexity 3-Coloring Algorithm

    PubMed Central

    Martín H., José Antonio

    2013-01-01

    Many practical problems in almost all scientific and technological disciplines have been classified as computationally hard (NP-hard or even NP-complete). In life sciences, combinatorial optimization problems frequently arise in molecular biology, e.g., genome sequencing, global alignment of multiple genomes, identifying siblings, or discovery of dysregulated pathways. In almost all of these problems, there is the need to prove a hypothesis about a certain property of an object that can be present if and only if the object adopts some particular admissible structure (an NP-certificate) or be absent (no admissible structure). However, none of the standard approaches can discard the hypothesis when no solution can be found, since none can provide a proof that there is no admissible structure. This article presents an algorithm that introduces a novel type of solution method to “efficiently” solve the graph 3-coloring problem, an NP-complete problem. The proposed method provides certificates (proofs) in both cases, present or absent, so it is possible to accept or reject the hypothesis on the basis of a rigorous proof. It provides exact solutions and is polynomial-time (i.e., efficient), albeit parametric: the only requirement is sufficient computational power, which is controlled by a parameter. Nevertheless, it is proved here that the probability of requiring a large value of this parameter to obtain a solution for a random graph decreases exponentially, making almost all problem instances tractable. Thorough experimental analyses were performed. The algorithm was tested on random graphs, planar graphs and 4-regular planar graphs. The obtained experimental results are in accordance with the theoretical expected results. PMID:23349711
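    The article's parametric polynomial-time method cannot be reconstructed from the abstract alone, so for context the sketch below shows the plain backtracking test for 3-colorability (exponential in the worst case) that such methods aim to improve on.

    ```python
    # Plain backtracking test for graph 3-colorability. This standard,
    # exponential-worst-case check only illustrates the decision problem
    # itself; it is not the cited article's parametric algorithm.
    def three_colorable(adj):
        """adj: dict mapping each vertex to the set of its neighbours."""
        vertices = list(adj)
        color = {}

        def assign(i):
            if i == len(vertices):
                return True
            v = vertices[i]
            for c in (0, 1, 2):
                if all(color.get(u) != c for u in adj[v]):
                    color[v] = c
                    if assign(i + 1):
                        return True
                    del color[v]
            return False

        return assign(0)

    # The 5-cycle is 3-colorable; the complete graph K4 is not.
    c5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
    k4 = {i: {j for j in range(4) if j != i} for i in range(4)}
    print(three_colorable(c5), three_colorable(k4))  # True False
    ```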

  10. Contamination Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After the carbon dioxide is ionized, the ion concentration is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and requires minimal maintenance.

  11. Simplified Digital Spectrum Analyzer

    NASA Technical Reports Server (NTRS)

    Cole, Steven W.

    1992-01-01

    This spectrum analyzer computes approximate cross-correlations between a noisy input signal and a reference signal of known frequency, yielding a measure of the amplitude of the sinusoidal component of the input. Its complexity and power consumption are lower than those of other digital spectrum analyzers. It performs no multiplications, and because it processes the data on each frequency independently, it can focus on a narrow spectral range without processing data on the rest of the spectrum.
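    The record does not describe how the analyzer avoids multiplications, so the sketch below shows only the underlying idea: estimating the amplitude of a known-frequency sinusoid buried in noise by correlating the input against quadrature references. Sample rate, frequency, phase, and noise level are arbitrary choices.

    ```python
    import numpy as np

    # Estimate the amplitude of a sinusoidal component of known frequency by
    # correlating a noisy input against sine/cosine references. (The NASA
    # analyzer achieves a similar result without multiplications via means
    # not detailed in the record; this version conveys the basic idea.)
    fs = 1000.0                      # sample rate in Hz (assumed)
    f0 = 50.0                        # frequency of interest in Hz (assumed)
    t = np.arange(0, 1.0, 1 / fs)

    rng = np.random.default_rng(1)
    signal = 0.7 * np.sin(2 * np.pi * f0 * t + 0.3) + rng.normal(0, 1.0, t.size)

    # Correlate with quadrature references; amplitude follows from I/Q parts.
    i_comp = 2 * np.mean(signal * np.cos(2 * np.pi * f0 * t))
    q_comp = 2 * np.mean(signal * np.sin(2 * np.pi * f0 * t))
    amplitude = np.hypot(i_comp, q_comp)
    print(f"estimated amplitude: {amplitude:.2f}")   # close to the true 0.7
    ```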

  12. New ecology education: Preparing students for the complex human-environmental problems of dryland East Asia

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Present-day environmental problems of Dryland East Asia are serious, and future prospects look especially disconcerting owing to current trends in population growth and economic development. Land degradation and desertification, invasive species, biodiversity losses, toxic waste and air pollution, a...

  13. Convergent Validity of the Aberrant Behavior Checklist and Behavior Problems Inventory with People with Complex Needs

    ERIC Educational Resources Information Center

    Hill, Jennie; Powlitch, Stephanie; Furniss, Frederick

    2008-01-01

    The current study aimed to replicate and extend Rojahn et al. [Rojahn, J., Aman, M. G., Matson, J. L., & Mayville, E. (2003). "The aberrant behavior checklist and the behavior problems inventory: Convergent and divergent validity." "Research in Developmental Disabilities", 24, 391-404] by examining the convergent validity of the behavior problems…

  14. Foucault as Complexity Theorist: Overcoming the Problems of Classical Philosophical Analysis

    ERIC Educational Resources Information Center

    Olssen, Mark

    2008-01-01

    This article explores the affinities and parallels between Foucault's Nietzschean view of history and models of complexity developed in the physical sciences in the twentieth century. It claims that Foucault's rejection of structuralism and Marxism can be explained as a consequence of his own approach which posits a radical ontology whereby the…

  15. The Complexities of Participatory Action Research and the Problems of Power, Identity and Influence

    ERIC Educational Resources Information Center

    Hawkins, Karen A.

    2015-01-01

    This article highlights the complexity of participatory action research (PAR) in that the study outlined was carried out with and by, as opposed to on, participants. The project was contextualised in two prior-to-school settings in Australia, with the early childhood professionals and, to some extent, the preschoolers involved in this PAR project…

  16. Analogize This! The Politics of Scale and the Problem of Substance in Complexity-Based Composition

    ERIC Educational Resources Information Center

    Roderick, Noah R.

    2012-01-01

    In light of recent enthusiasm in composition studies (and in the social sciences more broadly) for complexity theory and ecology, this article revisits the debate over how much composition studies can or should align itself with the natural sciences. For many in the discipline, the science debate--which was ignited in the 1970s, both by the…

  17. The Species Problem and the Value of Teaching and the Complexities of Species

    ERIC Educational Resources Information Center

    Chung, Carl

    2004-01-01

    Discussions on species taxa directly refer to a range of complex biological phenomena. Given these phenomena, biologists have developed and continue to appeal to a series of species concepts and do not have a clear definition for it as each species concept tells us part of the story or helps the biologists to explain and understand a subset of…

  18. ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM

    EPA Science Inventory

    ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM. Thomas J. Hughes, QA and Records Manager, Experimental Toxicology Division (ETD), National Health and Environmental Effects Research Laboratory (NHEERL), ORD, U.S. EPA, RTP, NC 27709

    ETD is the largest health divis...

  19. Individual versus Collaborative Problem Solving: Divergent Outcomes Depending on Task Complexity

    ERIC Educational Resources Information Center

    Sears, David A.; Reagin, James Michael

    2013-01-01

    Many studies have tested external supports for promoting productive collaboration, but relatively few have examined what features characterize naturally productive collaborative tasks. Two lines of research have come to distinct conclusions on the primary task feature associated with productive collaboration: demonstrability versus complexity.…

  20. Stress Analyzer

    NASA Technical Reports Server (NTRS)

    1990-01-01

    SPATE 9000 is a dynamic stress analyzer whose name is an acronym for Stress Pattern Analysis by Thermal Emission. It detects stress-induced temperature changes in a structure and indicates the degree of stress. Ometron, Inc.'s SPATE 9000 consists of a scan unit and a data display. The scan unit contains an infrared channel focused on the test structure to collect thermal radiation, and a visual channel used to set up the scan area and interrogate the stress display. Stress data are produced by detecting minute temperature changes, down to one-thousandth of a degree Centigrade, resulting from the application of dynamic loading to the structure. The electronic data processing system correlates the temperature changes with a reference signal to determine the stress level.

  1. Optical analyzer

    DOEpatents

    Hansen, A.D.

    1987-09-28

    An optical analyzer wherein a sample of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter is placed in a combustion tube, and light from a light source is passed through the sample. The temperature of the sample is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample is detected as the temperature is raised. A data processor, differentiator and a two-pen recorder provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample. These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample. Additional information is obtained by repeating the run in different atmospheres and/or at different rates of heating with other samples of the same particulate material collected on other filters. 7 figs.

  2. Studying PubMed usages in the field for complex problem solving: Implications for tool design.

    PubMed

    Mirel, Barbara; Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2013-05-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists' behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists' problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  3. Simulations for Complex Fluid Flow Problems from Berkeley Lab's Center for Computational Sciences and Engineering (CCSE)

    DOE Data Explorer

    The Center for Computational Sciences and Engineering (CCSE) develops and applies advanced computational methodologies to solve large-scale scientific and engineering problems arising in the Department of Energy (DOE) mission areas involving energy, environmental, and industrial technology. The primary focus is the application of structured-grid finite difference methods on adaptive grid hierarchies for compressible, incompressible, and low Mach number flows. The diverse range of scientific applications that drive the research typically involve a large range of spatial and temporal scales (e.g. turbulent reacting flows) and require the use of extremely large computing hardware, such as the 153,000-core computer, Hopper, at NERSC. The CCSE approach to these problems centers on the development and application of advanced algorithms that exploit known separations in scale; for many of the application areas this results in algorithms that are several orders of magnitude more efficient than traditional simulation approaches.

  4. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  5. Dynamic Modeling as a Cognitive Regulation Scaffold for Developing Complex Problem-Solving Skills in an Educational Massively Multiplayer Online Game Environment

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ge, Xun; Ifenthaler, Dirk; Law, Victor

    2011-01-01

    Following a design-based research framework, this article reports two empirical studies with an educational MMOG, called "McLarin's Adventures," on facilitating 9th-grade students' complex problem-solving skill acquisition in interdisciplinary STEM education. The article discusses the nature of complex and ill-structured problem solving and,…

  6. Criteria for assessing problem solving and decision making in complex environments

    NASA Technical Reports Server (NTRS)

    Orasanu, Judith

    1993-01-01

    Training crews to cope with unanticipated problems in high-risk, high-stress environments requires models of effective problem solving and decision making. Existing decision theories use the criteria of logical consistency and mathematical optimality to evaluate decision quality. While these approaches are useful under some circumstances, the assumptions underlying these models frequently are not met in dynamic time-pressured operational environments. Also, applying formal decision models is both labor and time intensive, a luxury often lacking in operational environments. Alternate approaches and criteria are needed. Given that operational problem solving and decision making are embedded in ongoing tasks, evaluation criteria must address the relation between those activities and satisfaction of broader task goals. Effectiveness and efficiency become relevant for judging reasoning performance in operational environments. New questions must be addressed: What is the relation between the quality of decisions and overall performance by crews engaged in critical high risk tasks? Are different strategies most effective for different types of decisions? How can various decision types be characterized? A preliminary model of decision types found in air transport environments will be described along with a preliminary performance model based on an analysis of 30 flight crews. The performance analysis examined behaviors that distinguish more and less effective crews (based on performance errors). Implications for training and system design will be discussed.

  7. Improved parameterized complexity of the maximum agreement subtree and maximum compatible tree problems.

    PubMed

    Berry, Vincent; Nicolas, François

    2006-01-01

    Given a set of evolutionary trees on a same set of taxa, the maximum agreement subtree problem (MAST), respectively, maximum compatible tree problem (MCT), consists of finding a largest subset of taxa such that all input trees restricted to these taxa are isomorphic, respectively compatible. These problems have several applications in phylogenetics such as the computation of a consensus of phylogenies obtained from different data sets, the identification of species subjected to horizontal gene transfers and, more recently, the inference of supertrees, e.g., Trees Of Life. We provide two linear time algorithms to check the isomorphism, respectively, compatibility, of a set of trees or otherwise identify a conflict between the trees with respect to the relative location of a small subset of taxa. Then, we use these algorithms as subroutines to solve MAST and MCT on rooted or unrooted trees of unbounded degree. More precisely, we give exact fixed-parameter tractable algorithms, whose running time is uniformly polynomial when the number of taxa on which the trees disagree is bounded. This improves on a known result for MAST and proves fixed-parameter tractability for MCT.

  8. Outcomes, moderators, and mediators of empathic-emotion recognition training for complex conduct problems in childhood.

    PubMed

    Dadds, Mark Richard; Cauchi, Avril Jessica; Wimalaweera, Subodha; Hawes, David John; Brennan, John

    2012-10-30

    Impairments in emotion recognition skills are a trans-diagnostic indicator of early mental health problems and may be responsive to intervention. We report on a randomized controlled trial of "Emotion-recognition-training" (ERT) versus treatment-as-usual (TAU) with N=195 mixed diagnostic children (mean age 10.52 years) referred for behavioral/emotional problems measured at pre- and 6 months post-treatment. We tested overall outcomes plus moderation and mediation models, whereby diagnostic profile was tested as a moderator of change. ERT had no impact on the group as a whole. Diagnostic status of the child did not moderate outcomes; however, levels of callous-unemotional (CU) traits moderated outcomes such that children with high CU traits responded less well to TAU, while ERT produced significant improvements in affective empathy and conduct problems in these children. Emotion recognition training has potential as an adjunctive intervention specifically for clinically referred children with high CU traits, regardless of their diagnostic status. PMID:22703720

  9. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-02-07

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  10. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-01-01

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  11. Infinite-range exterior complex scaling as a perfect absorber in time-dependent problems

    SciTech Connect

    Scrinzi, Armin

    2010-05-15

    We introduce infinite range exterior complex scaling (irECS) which provides for complete absorption of outgoing flux in numerical solutions of the time-dependent Schroedinger equation with strong infrared fields. This is demonstrated by computing high harmonic spectra and wave-function overlaps with the exact solution for a one-dimensional model system and by three-dimensional calculations for the H atom and an Ne atom model. We lay out the key ingredients for correct implementation and identify criteria for efficient discretization.

  12. OBSESSIVE COMPULSIVE DISORDER: IS IT A PROBLEM OF COMPLEX MOTOR PROGRAMMING?*

    PubMed Central

    Khanna, Sumant; Mukundan, C.R.; Channabasavanna, S.M.

    1987-01-01

    SUMMARY: 44 subjects with obsessive compulsive disorder (OCD) and 40 normals were compared using an experimental paradigm involving recording of the bereitschaftspotential. A decreased onset latency and an increased amplitude were found in the OCD sample as compared to normals. A neurophysiological substrate for the bereitschaftspotential has been proposed. The implications of these findings in OCD, as compared to Gilles de la Tourette syndrome, and for a focal neurophysiological dysfunction have also been discussed. The findings of this study implicate a dysfunction in complex motor programming in OCD, with the possibility of this dysfunction being in the prefrontal area. PMID:21927207

  13. ABSORPTION ANALYZER

    DOEpatents

    Brooksbank, W.A. Jr.; Leddicotte, G.W.; Strain, J.E.; Hendon, H.H. Jr.

    1961-11-14

    A means was developed for continuously computing and indicating the isotopic assay of a process solution and for automatically controlling the process output of isotope separation equipment to provide a continuous output of the desired isotopic ratio. A counter tube is surrounded with a sample to be analyzed so that the tube is exactly in the center of the sample. A source of fast neutrons is provided and is spaced from the sample. The neutrons from the source are thermalized by causing them to pass through a neutron moderator, and the neutrons are allowed to diffuse radially through the sample to actuate the counter. A reference counter in a known sample of pure solvent is also actuated by the thermal neutrons from the neutron source. The number of neutrons which actuate the detectors is a function of the concentration of the elements in solution and their neutron absorption cross sections. The pulses produced by the detectors in response to each neutron passing therethrough are amplified and counted. The respective times required to accumulate a selected number of counts are measured by associated timing devices. The concentration of a particular element in solution may be determined by utilizing the relation T2/T1 = BCR, where B is a constant proportional to the absorption cross sections, T2 is the time of count collection for the unknown solution, T1 is the time of count collection for the pure solvent, R is the isotopic ratio, and C is the molar concentration of the element to be determined. Knowing the slope constant B for any element, the isotopic concentration may be readily determined when the chemical concentration is known, and conversely, when the isotopic ratio is known, the chemical concentration may be determined. (AEC)
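    A worked example of the patent's relation, rearranged to solve for concentration; every numeric value below is hypothetical, chosen only to show the arithmetic.

    ```python
    # The patent's relation T2/T1 = B*C*R, rearranged to solve for the molar
    # concentration C of the element. All values are hypothetical.
    T1 = 40.0    # s, time to accumulate N counts in the pure solvent
    T2 = 52.0    # s, time to accumulate N counts in the unknown solution
    B = 0.12     # slope constant (proportional to absorption cross sections)
    R = 2.5      # known isotopic ratio

    C = (T2 / T1) / (B * R)
    print(f"molar concentration C = {C:.2f}")   # (52/40) / (0.12*2.5) = 4.33
    ```

    The converse use described in the abstract follows the same way: with C known, R = (T2/T1) / (B*C).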

  14. Motion artifacts in MRI: A complex problem with many partial solutions.

    PubMed

    Zaitsev, Maxim; Maclaren, Julian; Herbst, Michael

    2015-10-01

    Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artifacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artifacts, but no single method can be applied in all imaging situations. Instead, a "toolbox" of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artifacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artifacts, with the aim of aiding artifact detection and mitigation in particular clinical situations.

  15. Motion Artefacts in MRI: a Complex Problem with Many Partial Solutions

    PubMed Central

    Zaitsev, Maxim; Maclaren, Julian.; Herbst, Michael

    2015-01-01

    Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artefacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artefacts, but no single method can be applied in all imaging situations. Instead, a ‘toolbox’ of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artefacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artefacts, with the aim of aiding artefact detection and mitigation in particular clinical situations. PMID:25630632

  16. Traveling salesman problems with PageRank Distance on complex networks reveal community structure

    NASA Astrophysics Data System (ADS)

    Jiang, Zhongzhou; Liu, Jing; Wang, Shuai

    2016-12-01

    In this paper, we propose a new algorithm for community detection problems (CDPs) based on traveling salesman problems (TSPs), labeled as TSP-CDA. Since TSPs need to find a tour with minimum cost, cities close to each other are usually clustered in the tour. This inspired us to model CDPs as TSPs by taking each vertex as a city. Then, in the final tour, the vertices in the same community tend to cluster together, and the community structure can be obtained by cutting the tour into a couple of paths. There are two challenges. The first is to define a suitable distance between each pair of vertices which can reflect the probability that they belong to the same community. The second is to design a suitable strategy to cut the final tour into paths which can form communities. In TSP-CDA, we deal with these two challenges by defining a PageRank Distance and an automatic threshold-based cutting strategy. The PageRank Distance is designed with the intrinsic properties of CDPs in mind, and can be calculated efficiently. In the experiments, benchmark networks with 1000-10,000 nodes and varying structures are used to test the performance of TSP-CDA. A comparison is also made between TSP-CDA and two well-established community detection algorithms. The results show that TSP-CDA can find accurate community structure efficiently and outperforms the two existing algorithms.
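    A rough sketch of the TSP-CDA pipeline on a standard test graph, assuming networkx is available. The L1 distance between personalized PageRank vectors and the percentile-based cutting rule are simplified stand-ins for the paper's PageRank Distance and automatic threshold strategy, which the abstract does not fully specify; the tour itself is built with a cheap nearest-neighbour heuristic.

    ```python
    import networkx as nx

    # Sketch of the TSP-CDA idea: derive pairwise distances from personalized
    # PageRank, build a nearest-neighbour TSP tour, then cut the tour where
    # consecutive distances are large. Simplified stand-in, not the paper's
    # exact distance or cutting rule.
    G = nx.karate_club_graph()
    nodes = list(G)

    def ppr(v):
        # personalized PageRank vector seeded at vertex v
        personalization = {u: (1.0 if u == v else 0.0) for u in nodes}
        return nx.pagerank(G, alpha=0.85, personalization=personalization)

    vectors = {v: ppr(v) for v in nodes}

    def distance(u, v):
        # L1 distance between personalized PageRank vectors
        return sum(abs(vectors[u][w] - vectors[v][w]) for w in nodes)

    # Greedy nearest-neighbour tour as a cheap TSP heuristic.
    unvisited = set(nodes)
    tour = [unvisited.pop()]
    while unvisited:
        nxt = min(unvisited, key=lambda v: distance(tour[-1], v))
        unvisited.remove(nxt)
        tour.append(nxt)

    # Cut the tour wherever the step distance is unusually large.
    steps = [distance(a, b) for a, b in zip(tour, tour[1:])]
    threshold = sorted(steps)[int(0.8 * len(steps))]   # crude automatic cutoff
    communities, current = [], [tour[0]]
    for node, step in zip(tour[1:], steps):
        if step > threshold:
            communities.append(current)
            current = []
        current.append(node)
    communities.append(current)
    print(len(communities), "communities found")
    ```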

  17. Content-Adaptive Finite Element Mesh Generation of 3-D Complex MR Volumes for Bioelectromagnetic Problems.

    PubMed

    Lee, W; Kim, T-S; Cho, M; Lee, S

    2005-01-01

    In studying bioelectromagnetic problems, the finite element method offers several advantages over other conventional methods such as the boundary element method. It allows truly volumetric analysis and incorporation of material properties such as anisotropy. Mesh generation is the first requirement in finite element analysis, and there are many different approaches to it. However, conventional approaches offered by commercial packages and various algorithms do not generate content-adaptive meshes, resulting in numerous elements in smaller-volume regions and thereby increasing computational load and demand. In this work, we present an improved content-adaptive mesh generation scheme that is efficient and fast, along with options to change the contents of meshes. For demonstration, mesh models of the head from a volume MRI are presented in 2-D and 3-D.

  18. An unstructured-grid software system for solving complex aerodynamic problems

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Pirzadeh, Shahyar; Parikh, Paresh

    1995-01-01

    A coordinated effort has been underway over the past four years to elevate unstructured-grid methodology to a mature level. The goal of this endeavor is to provide a validated capability to non-expert users for performing rapid aerodynamic analysis and design of complex configurations. The Euler component of the system is well developed, and is impacting a broad spectrum of engineering needs with capabilities such as rapid grid generation and inviscid flow analysis, inverse design, interactive boundary layers, and propulsion effects. Progress is also being made in the more tenuous Navier-Stokes component of the system. A robust grid generator is under development for constructing quality thin-layer tetrahedral grids, along with a companion Navier-Stokes flow solver. This paper presents an overview of this effort, along with a perspective on the present and future status of the methodology.

  19. Review of the Uruguayan Kidney Allocation System: the solution to a complex problem, preliminary data.

    PubMed

    Bengochea, M; Alvarez, I; Toledo, R; Carretto, E; Forteza, D

    2010-01-01

    The National Kidney Transplant Program with cadaveric donors is based on a centralized, unique waitlist, a serum bank, and allocation criteria approved by the Instituto Nacional de Donación y Trasplante (INDT) in agreement with clinical teams. The median donor rate over the last 3 years is 20 per million population, and the median number of waitlist candidates is 450. The increased number of waiting-list patients and the rapid aging of our population demanded strategies for donor acceptance and candidate assignment, and analysis of more efficient and equitable allocation models. The objectives of the new national allocation system were to improve posttransplant patient and graft survival, allow equal access to transplantation, and reduce waitlist times. The objective of this study was to analyze variables in our current allocation system and to create a mathematical/simulation model to evaluate a new allocation system. We compared candidates and transplanted patients by gender, age, ABO blood group, human leukocyte antigens (HLA), percentage of reactive antibodies (PRA), and waiting-list and dialysis times. Only 2 factors showed differences: highly sensitized patients and patients >65 years old (Bernoulli test). An agreement between the INDT and the Engineering Faculty opened a major field of study. During 2008 the data analysis and model building began. The waiting-list data of the last decade of donors and transplants were processed to develop a virtual model. We used inputs of candidates and donors, with the outputs and structure of the simulation system used to evaluate the proposed changes. Currently, the INDT and the Mathematics and Statistics Institute are working to develop a simulation model that is able to analyze our new national allocation system.

  20. Alternative hybrid and staged interventional treatment of congenital heart defects in critically ill children with complex and non-cardiac problems

    PubMed Central

    Chojnicki, Maciej; Jaworski, Radosław; Steffens, Mariusz; Szofer-Sendrowska, Aneta; Paczkowski, Konrad; Kwaśniak, Ewelina; Zieliński, Jacek; Gierat-Haponiuk, Katarzyna; Leszczyńska, Katarzyna

    2015-01-01

    Introduction: An individually designed strategy of comprehensive alternative hybrid and staged interventional treatment (AHASIT) can be a reasonable alternative to conventional treatment of congenital heart defects, reduce the risk of cardiac surgery or interventions performed separately, and give an additional chance for critically ill children.

    Aim: To present our experience and the results of AHASIT of severely ill or borderline children referred for surgery with the diagnosis of congenital heart defects.

    Material and methods: A group of 22 patients with complex cardiac and non-cardiac pathologies was retrospectively selected and analyzed. An individual preoperative severity scale was established for AHASIT patients, with one point for each of the following preoperative complications: prematurity, low body weight, cyanosis, intolerance to drug therapy, failed interventional treatment prior to admission, mechanical ventilation prior to the procedure, chronic respiratory failure and non-cardiac, mainly congenital malformations (congenital diaphragmatic hernia, lower extremity agenesia, duodenal atresia) and acquired problems (newborn edema, necrotic enterocolitis, intracranial hemorrhage, liver and renal failure, anemia and thrombocytopenia, infections or colonization with drug-resistant pathogens).

    Results: The analysis of the postoperative course showed that the patients with 5 AHASIT points or more had a more complicated postoperative course than the patients with 1 to 4 AHASIT points.

    Conclusions: The AHASIT of pediatric congenital heart defects with complex and non-cardiac problems appeared to be an attractive option for selected severely ill patients. The strategy was found to be effective in selected neonates suffering from complex and accompanying non-cardiac pathologies, with positive final results of both cardiological intervention and planned surgery. PMID:26240625

  1. ATSDR evaluation of health effects of chemicals. IV. Polycyclic aromatic hydrocarbons (PAHs): understanding a complex problem.

    PubMed

    Mumtaz, M M; George, J D; Gold, K W; Cibulas, W; DeRosa, C T

    1996-01-01

    Polycyclic Aromatic Hydrocarbons (PAHs) are a group of chemicals that are formed during the incomplete burning of coal, oil, gas, wood, garbage, or other organic substances, such as tobacco and charbroiled meat. There are more than 100 PAHs. PAHs generally occur as complex mixtures (for example, as part of products such as soot), not as single compounds. PAHs are found throughout the environment in the air, water, and soil. As part of its mandate, the Agency for Toxic Substances and Disease Registry (ATSDR) prepares toxicological profiles on hazardous chemicals, including PAHs (ATSDR, 1995), found at facilities on the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) National Priorities List (NPL) and which pose the most significant potential threat to human health, as determined by ATSDR and the Environmental Protection Agency (EPA). These profiles include information on health effects of chemicals from different routes and durations of exposure, their potential for exposure, regulations and advisories, and the adequacy of the existing database. Assessing the health effects of PAHs is a major challenge because environmental exposures to these chemicals are usually to complex mixtures of PAHs with other chemicals. The biological consequences of human exposure to mixtures of PAHs depend on the toxicity, carcinogenic and noncarcinogenic, of the individual components of the mixture, the types of interactions among them, and confounding factors that are not thoroughly understood. Also identified are components of exposure and health effects research needed on PAHs that will allow estimation of realistic human health risks posed by exposures to PAHs. The exposure assessment component of research should focus on (1) development of reliable analytical methods for the determination of bioavailable PAHs following ingestion, (2) estimation of bioavailable PAHs from environmental media, particularly the determination of particle-bound PAHs, (3

  2. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
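    As a minimal illustration of the Bayesian Belief Network mechanics described above, the toy model below propagates a technology-induced reduction in one causal factor through to mishap probability. The structure and every probability here are invented for illustration; the FLAP model's actual network and SME-elicited numbers are not given in this record.

    ```python
    # Toy two-node belief network: one causal factor (automation over-reliance)
    # influences the chance of an automation-related mishap. All numbers are
    # hypothetical, purely to show how a risk-reduction estimate falls out.
    p_overreliance = 0.30                        # P(factor present), baseline
    p_mishap_given = {True: 0.02, False: 0.004}  # P(mishap | factor state)

    def p_mishap(p_factor):
        # marginalize over the factor's two states
        return (p_factor * p_mishap_given[True]
                + (1 - p_factor) * p_mishap_given[False])

    baseline = p_mishap(p_overreliance)
    mitigated = p_mishap(p_overreliance * 0.5)   # a technology halves the factor
    print(f"relative risk reduction: {1 - mitigated / baseline:.1%}")
    ```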

  3. Application of a low order panel method to complex three-dimensional internal flow problems

    NASA Technical Reports Server (NTRS)

    Ashby, D. L.; Sandlin, D. R.

    1986-01-01

    An evaluation of the ability of a low order panel method to predict complex three-dimensional internal flow fields was made. The computer code VSAERO was used as a basis for the evaluation. Guidelines for modeling internal flow geometries were determined, and the effects of varying the boundary conditions and of using numerical approximations on the solution's accuracy were studied. Several test cases were run and the results were compared with theoretical or experimental results. Modeling an internal flow geometry as a closed box with normal velocities specified on an inlet and exit face provided accurate results and gave the user control over the boundary conditions. The values of the boundary conditions greatly influenced the amount of leakage an internal flow geometry suffered and could be adjusted to eliminate leakage. The use of the far-field approximation to reduce computation time influenced the accuracy of a solution and was coupled with the values of the boundary conditions needed to eliminate leakage. The error induced in the influence coefficients by using the far-field approximation was found to depend on the type of influence coefficient, the far-field radius, and the aspect ratio of the panels.

  4. Combination Therapies for Lysosomal Storage Diseases: A Complex Answer to a Simple Problem.

    PubMed

    Macauley, Shannon L

    2016-06-01

    Lysosomal storage diseases (LSDs) are a group of 40-50 rare monogenic disorders that result in disrupted lysosomal function and subsequent lysosomal pathology. Depending on the protein or enzyme deficiency associated with each disease, LSDs affect an array of organ systems and elicit a complex set of secondary disease mechanisms that make many of these disorders difficult to fully treat. The etiology of most LSDs is known and the innate biology of lysosomal enzymes favors therapeutic intervention, yet most attempts at treating LSDs with enzyme replacement strategies fall short of being curative. Even with the advent of more sophisticated approaches, like substrate reduction therapy, pharmacologic chaperones, gene therapy, or stem cell therapy, comprehensive treatments for LSDs have yet to be achieved. Given the limitations of individual therapies, recent research has focused on using a combination approach to treat LSDs. By coupling protein-, cell-, and gene-based therapies with small molecule drugs, researchers have found greater success in eradicating the clinical features of disease. This review discusses the positives and negatives of the individual therapies used to treat LSDs and how, in combination, studies have demonstrated a more holistic benefit on pathological and functional parameters. By optimizing routes of delivery, therapeutic timing, and the targeting of secondary disease mechanisms, combination therapy represents the future of LSD treatment. PMID:27491211

  5. A simple framework for a complex problem? Predicting wildlife-vehicle collisions.

    PubMed

    Visintin, Casey; van der Ree, Rodney; McCarthy, Michael A

    2016-09-01

    Collisions of vehicles with wildlife kill and injure animals and put vehicle occupants at risk, but preventing these collisions is challenging. Surveys to identify problem areas are expensive and logistically difficult. Computer modeling has identified correlates of collisions, yet these can be difficult for managers to interpret in a way that will help them reduce collision risk. We introduce a novel method to predict collision risk by modeling hazard (presence and movement of vehicles) and exposure (animal presence) across geographic space. To estimate the hazard, we predict relative traffic volume and speed along road segments across southeastern Australia using regression models based on human demographic variables. We model exposure by predicting suitable habitat for our case study species (the eastern grey kangaroo, Macropus giganteus) based on existing fauna survey records and geographic and climatic variables. Records of reported kangaroo-vehicle collisions are used to investigate how these factors collectively contribute to collision risk. The species occurrence (exposure) model generated plausible predictions across the study area, reducing the null deviance by 30.4%. The vehicle (hazard) models explained 54.7% of the variance in the traffic volume data and 58.7% in the traffic speed data. Using these as predictors of collision risk explained 23.7% of the deviance in the incidence of collisions. The discrimination ability of the model was good when predicting to an independent dataset. The research demonstrates that collision risk can be modeled across geographic space with a conceptual analytical framework using existing sources of data, reducing the need for expensive or time-consuming field data collection. The framework is novel because it disentangles natural and anthropogenic effects on the likelihood of wildlife-vehicle collisions by representing hazard and exposure with separate, tunable submodels. PMID:27648252
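
    A hedged sketch of the hazard-times-exposure idea described above (not the authors' code): two independent submodel outputs, predicted traffic (hazard) and predicted species occurrence (exposure), feed a logistic collision-risk model. The data, coefficients, and "tuning" exponents are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    traffic = rng.lognormal(mean=6.0, sigma=1.0, size=n)   # vehicles/day (hazard submodel output)
    occupancy = rng.beta(2, 5, size=n)                     # habitat suitability (exposure submodel output)

    def collision_risk(traffic, occupancy, beta0=-12.0, beta1=1.2,
                       hazard_exp=1.0, exposure_exp=1.0):
        # Tunable submodels: the exponents let a manager up- or down-weight
        # hazard versus exposure without refitting either submodel.
        eta = beta0 + beta1 * np.log((traffic ** hazard_exp) *
                                     (occupancy ** exposure_exp) + 1e-12)
        return 1.0 / (1.0 + np.exp(-eta))

    risk = collision_risk(traffic, occupancy)
    print("mean predicted collision risk per segment:", risk.mean().round(4))
    ```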

  6. Social and ethical dimension of the natural sciences, complex problems of the age, interdisciplinarity, and the contribution of education

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2008-09-01

    In view of the complex problems of this age, the question of the socio-ethical dimension of science acquires particular importance. We approach this matter from a philosophical and sociological standpoint, looking at such focal concerns as the motivation, purposes and methods of scientific activity, the ambivalence of scientific research and the concomitant risks, and the conflict between research freedom and external socio-political intervention. We then point out the impediments to the effectiveness of cross-disciplinary or broader meetings for addressing these complex problems and managing the associated risks, given the difficulty in communication between experts in different fields and non-experts, difficulties that education is challenged to help resolve. We find that the social necessity of informed decision-making on the basis of cross-disciplinary collaboration is reflected in the newer curricula, such as that of Greece, in aims like the acquisition of cross-subject knowledge and skills, and the ability to make decisions on controversial issues involving value conflicts. The interest and the reflections of the science education community in these matters increase its—traditionally limited—contribution to the theoretical debate on education and, by extension, the value of science education in the education system.

  7. Improving and validating 3D models for the leaf energy balance in canopy-scale problems with complex geometry

    NASA Astrophysics Data System (ADS)

    Bailey, B.; Stoll, R., II; Miller, N. E.; Pardyjak, E.; Mahaffee, W.

    2014-12-01

    Plants cover the majority of Earth's land surface, and thus play a critical role in the surface energy balance. Within individual plant communities, the leaf energy balance is a fundamental component of most biophysical processes. Absorbed radiation drives the energy balance and provides the means by which plants produce food. Available energy is partitioned into sensible and latent heat fluxes to determine surface temperature, which strongly influences rates of metabolic activity and growth. The energy balance of an individual leaf is coupled with other leaves in the community through longwave radiation emission and advection through the air. This complex coupling can make scaling models from leaves to whole-canopies difficult, specifically in canopies with complex, heterogeneous geometries. We present a new three-dimensional canopy model that simultaneously resolves sub-tree to whole-canopy scales. The model provides spatially explicit predictions of net radiation exchange, boundary-layer and stomatal conductances, evapotranspiration rates, and ultimately leaf surface temperature. The radiation model includes complex physics such as anisotropic emission and scattering. Radiation calculations are accelerated by leveraging graphics processing unit (GPU) technology, which allows canopy-scale problems to be performed on a standard desktop workstation. Since validating the three-dimensional distribution of leaf temperature can be extremely challenging, we used several independent measurement techniques to quantify errors in measured and modeled values. When compared with measured leaf temperatures, the model gave a mean error of about 2°C, which was close to the estimated measurement uncertainty.
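
    A minimal single-leaf energy-balance solve in the spirit of the model above (a sketch, not the authors' GPU code): absorbed radiation is balanced against longwave emission, sensible heat, and latent heat, and the leaf temperature is found by bisection on the residual. The constants and conductances are illustrative.

    ```python
    import math

    SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m-2 K-4
    CP = 29.3            # molar heat capacity of air, J mol-1 K-1
    LAMBDA = 44000.0     # molar latent heat of vaporization, J mol-1

    def esat_kpa(t_c):
        # Tetens saturation vapor pressure (kPa) at temperature t_c (deg C).
        return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

    def residual(t_leaf, r_abs, t_air, rh, g_h, g_v, pressure_kpa=101.3):
        emission = 0.97 * SIGMA * (t_leaf + 273.15) ** 4
        sensible = CP * g_h * (t_leaf - t_air)
        vpd = esat_kpa(t_leaf) - rh * esat_kpa(t_air)
        latent = LAMBDA * g_v * vpd / pressure_kpa
        return r_abs - emission - sensible - latent

    def solve_leaf_temp(r_abs=600.0, t_air=25.0, rh=0.5, g_h=0.2, g_v=0.15):
        lo, hi = t_air - 20.0, t_air + 20.0
        for _ in range(60):                   # bisection on the energy residual
            mid = 0.5 * (lo + hi)
            if residual(mid, r_abs, t_air, rh, g_h, g_v) > 0:
                lo = mid                      # leaf still gaining energy -> warmer
            else:
                hi = mid
        return 0.5 * (lo + hi)

    print(f"leaf temperature: {solve_leaf_temp():.2f} C")
    ```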

  8. Analyzing Bilingual Education Costs.

    ERIC Educational Resources Information Center

    Bernal, Joe J.

    This paper examines the particular problems involved in analyzing the costs of bilingual education and suggests that cost analysis of bilingual education requires a fundamentally different approach than that followed in other recent school finance studies. Focus of the discussion is the Intercultural Development Research Association's (IDRA)…

  9. Eddy covariance measurements in complex terrain with a new fast response, closed-path analyzer: spectral characteristics and cross-system comparisons

    EPA Science Inventory

    In recent years, a new class of enclosed, closed-path gas analyzers suitable for eddy covariance applications has come to market, designed to combine the advantages of traditional closed-path systems (small density corrections, good performance in poor weather) and open-path syst...

  10. Analyzing failures: the problems and the solutions

    SciTech Connect

    Goel, V.S.

    1986-01-01

    Papers are presented on a failure analysis of a large centrifugal blower, field fractures in heavy equipment, a failure analysis of a liquid propane gas cylinder, an analysis of helicopter blade fatigue fracture by digital fractographic imaging analysis, and the influence of failure analyses on materials technology and design. Also considered are an analysis of aircraft component failures, the growth of short cracks in IN718, the improper fabrication of rotating blades (resulting in premature failure), and low cycle thermal fatigue and fracture of reinforced piping. Other topics include a nonlinear finite element analysis of stress concentration at high temperature, the inelastic analysis of a hot spot on a heavy vessel wall, the accuracy and precision of mechanical test data generated using computerized testing systems, and maintenance related failures.

  11. Complex modeling: a strategy and software program for combining multiple information sources to solve ill posed structure and nanostructure inverse problems.

    PubMed

    Juhás, Pavol; Farrow, Christopher L; Yang, Xiaohao; Knox, Kevin R; Billinge, Simon J L

    2015-11-01

    A strategy is described for regularizing ill posed structure and nanostructure scattering inverse problems (i.e. structure solution) from complex material structures. This paper describes both the philosophy and strategy of the approach, and a software implementation, DiffPy Complex Modeling Infrastructure (DiffPy-CMI). PMID:26522405
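
    The core of the "complex modeling" strategy, co-refining one set of model parameters against several information sources at once, can be sketched by stacking the residuals of each source into a single regularized least-squares problem. The toy below refines a two-parameter peak against noisy "scattering" data plus a prior restraint on the width; it illustrates the strategy only and is not the DiffPy-CMI API.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(3)
    r = np.linspace(0, 10, 200)
    true = {"pos": 4.0, "width": 0.5}
    data = (np.exp(-0.5 * ((r - true["pos"]) / true["width"]) ** 2)
            + 0.05 * rng.normal(size=r.size))

    prior_width, prior_sigma = 0.45, 0.05   # second information source (e.g., chemistry)

    def residuals(theta):
        pos, width = theta
        model = np.exp(-0.5 * ((r - pos) / width) ** 2)
        fit_part = (model - data) / 0.05                   # scattering-data term
        restraint = [(width - prior_width) / prior_sigma]  # regularizing restraint
        return np.concatenate([fit_part, restraint])

    sol = least_squares(residuals, x0=[3.0, 1.0])
    print("refined pos=%.3f width=%.3f" % tuple(sol.x))
    ```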

  12. The Cauchy Problem in Local Spaces for the Complex Ginzburg-Landau Equation. II. Contraction Methods

    NASA Astrophysics Data System (ADS)

    Ginibre, J.; Velo, G.

    We continue the study of the initial value problem for the complex Ginzburg-Landau equation $\partial_t u = \gamma u + (a + i\alpha)\Delta u - (b + i\beta)|u|^{2\sigma} u$ (with $a > 0$, $b > 0$, $\gamma \ge 0$) in $\mathbb{R}^n$, initiated in a previous paper [I]. We treat the case where the initial data and the solutions belong to local uniform spaces, more precisely to spaces of functions satisfying local regularity conditions and uniform bounds in local norms, but no decay conditions (or arbitrarily weak decay conditions) at infinity in $\mathbb{R}^n$. In [I] we used compactness methods and an extended version of recent local estimates [3] and proved in particular the existence of solutions globally defined in time with local regularity of the initial data corresponding to the spaces $L^r$ for $r \ge 2$ or $H^1$. Here we treat the same problem by contraction methods. This allows us in particular to prove that the solutions obtained in [I] are unique under suitable subcriticality conditions, and to obtain for them additional regularity properties and uniform bounds. The method extends some of those previously applied to the nonlinear heat equation in global spaces to the framework of local uniform spaces.

  13. A system for measuring complex dielectric properties of thin films at submillimeter wavelengths using an open hemispherical cavity and a vector network analyzer.

    PubMed

    Rahman, Rezwanur; Taylor, P C; Scales, John A

    2013-08-01

    Quasi-optical (QO) methods of dielectric spectroscopy are well established in the millimeter and submillimeter frequency bands. These methods exploit the standing wave structure produced in the sample by a transmitted Gaussian beam to achieve accurate, low-noise measurement of the complex permittivity of the sample [e.g., J. A. Scales and M. Batzle, Appl. Phys. Lett. 88, 062906 (2006); R. N. Clarke and C. B. Rosenberg, J. Phys. E 15, 9 (1982); T. M. Hirvonen, P. Vainikainen, A. Lozowski, and A. V. Raisanen, IEEE Trans. Instrum. Meas. 45, 780 (1996)]. In effect the sample itself becomes a low-Q cavity. On the other hand, for optically thin samples (films of thickness much less than a wavelength) or extremely low loss samples (loss tangents below 10^-5) the QO approach tends to break down due to loss of signal. In such cases it is useful to put the sample in a high-Q cavity and measure the perturbation of the cavity modes. Provided that the average mode frequency divided by the shift in mode frequency is less than the Q (quality factor) of the mode, the perturbation should be resolvable. Cavity perturbation techniques are not new, but there are technological difficulties in working in the millimeter/submillimeter wave region. In this paper we show applications of cavity perturbation to the dielectric characterization of semiconductor thin films of the type used in the manufacture of photovoltaics in the 100-350 GHz range. We measured the complex optical constants of a hot-wire chemical-vapor-deposition-grown 1-μm-thick amorphous silicon (a-Si:H) film on a borosilicate glass substrate. The real part of the refractive index and the dielectric constant of the glass substrate vary from frequency-independent to linearly frequency-dependent. We also see power-law behavior of the frequency-dependent optical conductivity from 316 GHz (9.48 cm^-1) down to 104 GHz (3.12 cm^-1). PMID:24007073
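
    The cavity-perturbation step described above can be sketched with the textbook first-order formulas relating the measured shifts in resonant frequency and quality factor to the complex permittivity. This is a sketch under a uniform-field assumption; the geometric factor for a hemispherical open cavity differs from the simple volume ratio used here.

    ```python
    def permittivity_from_perturbation(f_empty, f_loaded, q_empty, q_loaded,
                                       v_cavity, v_sample):
        # First-order cavity perturbation: real part from the frequency shift,
        # imaginary part from the change in 1/Q.
        fill = v_sample / v_cavity                   # filling factor (assumed uniform field)
        eps_real = 1.0 + (f_empty - f_loaded) / f_loaded / (2.0 * fill)
        eps_imag = (1.0 / q_loaded - 1.0 / q_empty) / (4.0 * fill)
        return eps_real, eps_imag

    # Example: a thin film shifting a 300 GHz mode down by 30 MHz and halving Q.
    eps = permittivity_from_perturbation(f_empty=300.00e9, f_loaded=299.97e9,
                                         q_empty=2.0e5, q_loaded=1.0e5,
                                         v_cavity=1.0, v_sample=1e-4)
    print("eps' = %.3f, eps'' = %.4f" % eps)
    ```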

  14. A system for measuring complex dielectric properties of thin films at submillimeter wavelengths using an open hemispherical cavity and a vector network analyzer

    NASA Astrophysics Data System (ADS)

    Rahman, Rezwanur; Taylor, P. C.; Scales, John A.

    2013-08-01

    Quasi-optical (QO) methods of dielectric spectroscopy are well established in the millimeter and submillimeter frequency bands. These methods exploit the standing wave structure produced in the sample by a transmitted Gaussian beam to achieve accurate, low-noise measurement of the complex permittivity of the sample [e.g., J. A. Scales and M. Batzle, Appl. Phys. Lett. 88, 062906 (2006), doi:10.1063/1.2172403; R. N. Clarke and C. B. Rosenberg, J. Phys. E 15, 9 (1982), doi:10.1088/0022-3735/15/1/002; T. M. Hirvonen, P. Vainikainen, A. Lozowski, and A. V. Raisanen, IEEE Trans. Instrum. Meas. 45, 780 (1996), doi:10.1109/19.516996]. In effect the sample itself becomes a low-Q cavity. On the other hand, for optically thin samples (films of thickness much less than a wavelength) or extremely low loss samples (loss tangents below 10^-5) the QO approach tends to break down due to loss of signal. In such cases it is useful to put the sample in a high-Q cavity and measure the perturbation of the cavity modes. Provided that the average mode frequency divided by the shift in mode frequency is less than the Q (quality factor) of the mode, the perturbation should be resolvable. Cavity perturbation techniques are not new, but there are technological difficulties in working in the millimeter/submillimeter wave region. In this paper we show applications of cavity perturbation to the dielectric characterization of semiconductor thin films of the type used in the manufacture of photovoltaics in the 100-350 GHz range. We measured the complex optical constants of a hot-wire chemical-vapor-deposition-grown 1-μm-thick amorphous silicon (a-Si:H) film on a borosilicate glass substrate. The real part of the refractive index and the dielectric constant of the glass substrate vary from frequency-independent to linearly frequency-dependent. We also see power-law behavior of the frequency-dependent optical conductivity from 316 GHz (9.48 cm^-1) down to 104 GHz (3.12 cm^-1).

  15. Characterization and use of new monoclonal antibodies to CD11c, CD14, and CD163 to analyze the phenotypic complexity of ruminant monocyte subsets.

    PubMed

    Elnaggar, Mahmoud M; Abdellrazeq, Gaber S; Mack, Victoria; Fry, Lindsay M; Davis, William C; Park, Kun Taek

    2016-10-01

    The sequencing of the bovine genome and the development of mass spectrometry, in conjunction with flow cytometry (FC), have afforded an opportunity to complete the characterization of the specificity of monoclonal antibodies (mAbs) that were only partially characterized during previous international workshops focused on antibody development for livestock (1991, Leukocyte Antigens in Cattle, Sheep, and Goats; 1993, Leukocyte Antigens of Cattle and Sheep; 1996, Third Workshop on Ruminant Leukocyte Antigens). The objective of this study was to complete the characterization of twelve mAbs incompletely characterized during the workshops that react with molecules predominantly expressed on bovine monocytes, and to use them to provide further information on the phenotypic complexity of monocyte subsets in ruminants. Analysis revealed that the mAbs can be grouped into three clusters that recognize three different molecules: CD11c, CD14, and CD163. Following characterization, comparison of the patterns of expression of CD14 and CD163 with the expression of CD16, CD172a, and CD209 revealed that the mononuclear cell population comprises multiple subsets with differential expression of these molecules. Further analysis revealed that the epitopes recognized by the mAbs to CD14 and CD163 are conserved on orthologues in sheep and goats. In contrast to CD14, which is also expressed on sheep and goat granulocytes, CD163 is a definitive marker for their monocytes. PMID:27496743

  16. Analyzing Leakage Through Cracks

    NASA Technical Reports Server (NTRS)

    Romine, William D.

    1993-01-01

    Two related computer programs written for use in analyzing leakage through cracks. Leakage flow laminar or turbulent. One program used to determine dimensions of crack under given flow conditions and given measured rate of leakage. Other used to determine rate of leakage of gas through crack of given dimensions under given flow conditions. Programs, written in BASIC language, accelerate and facilitate iterative calculations and parametric analyses. Solve equations of Fanno flow. Enables rapid solution of leakage problem.
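
    A sketch of the Fanno-flow leakage calculation the programs above perform, ported to Python for illustration (the originals are in BASIC): given a crack's hydraulic diameter, length, and friction factor, solve the Fanno relation for the inlet Mach number, then compute the leakage rate. The gas properties and crack dimensions are illustrative.

    ```python
    import math

    GAMMA = 1.4
    R = 287.0            # gas constant for air, J kg-1 K-1

    def fanno(m):
        # 4 f L* / D as a function of Mach number (subsonic branch).
        g = GAMMA
        return ((1 - m**2) / (g * m**2)
                + (g + 1) / (2 * g) * math.log((g + 1) * m**2 / (2 + (g - 1) * m**2)))

    def mach_from_fanno(target, lo=1e-4, hi=0.9999):
        # fanno(m) decreases monotonically toward 0 as m -> 1: bisect.
        for _ in range(80):
            mid = 0.5 * (lo + hi)
            if fanno(mid) > target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Crack: 10 mm long, 20 um hydraulic diameter, friction factor f = 0.01,
    # fed by air at 300 K and 10 bar (flow choked at the crack exit).
    f, length, dia = 0.01, 0.010, 20e-6
    m_in = mach_from_fanno(4 * f * length / dia)
    t0, p0, area = 300.0, 1.0e6, 1e-9          # stagnation state, crack area (m^2)
    t = t0 / (1 + (GAMMA - 1) / 2 * m_in**2)   # static temperature at the inlet
    rho = p0 / (R * t0) * (t / t0) ** (1 / (GAMMA - 1))
    mdot = rho * m_in * math.sqrt(GAMMA * R * t) * area
    print(f"inlet Mach {m_in:.4f}, leakage {mdot * 1e6:.3f} mg/s")
    ```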

  17. Promoting Experimental Problem-Solving Ability in Sixth-Grade Students through Problem-Oriented Teaching of Ecology: Findings of an Intervention Study in a Complex Domain

    ERIC Educational Resources Information Center

    Roesch, Frank; Nerb, Josef; Riess, Werner

    2015-01-01

    Our study investigated whether problem-oriented designed ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of "experimental problem-solving ability" better than conventional lessons in science. We used a paper-and-pencil test to assess…

  18. Advancing our knowledge of the complexity and management of intimate partner violence and co-occurring mental health and substance abuse problems in women

    PubMed Central

    Du Mont, Janice

    2015-01-01

    Globally, intimate partner violence (IPV) is a pervasive and insidious human rights problem with significant adverse physical health outcomes for women. Intimate partner violence has also been closely associated with poor mental health and substance use problems. However, little is known about the relationship among these co-occurring problems and how to best intervene or manage them. Here, we present findings from recent systematic reviews and meta-analyses (where available) to highlight developments in understanding and managing the complex co-occurring problems of intimate partner violence and mental health and substance use in women. PMID:26097738

  19. Structural factoring approach for analyzing stochastic networks

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
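
    For intuition, the quantity the factoring algorithm computes can be obtained on a tiny network by the complete enumeration it is designed to avoid: enumerate all joint realizations of the discrete arc-length distributions and tally the shortest s-t path length of each. The network below is illustrative.

    ```python
    from itertools import product
    from collections import defaultdict

    # Arcs of a small directed network, each with a discrete length
    # distribution given as a list of (length, probability) pairs.
    arcs = {
        ("s", "a"): [(1, 0.5), (3, 0.5)],
        ("s", "b"): [(2, 0.7), (4, 0.3)],
        ("a", "t"): [(2, 0.6), (5, 0.4)],
        ("b", "t"): [(1, 0.5), (2, 0.5)],
        ("a", "b"): [(1, 1.0)],
    }

    def shortest(lengths):
        # Shortest s-t path for one fixed realization (tiny network, so the
        # three s-t routes are enumerated explicitly).
        sa, sb = lengths[("s", "a")], lengths[("s", "b")]
        at, bt, ab = lengths[("a", "t")], lengths[("b", "t")], lengths[("a", "b")]
        return min(sa + at, sb + bt, sa + ab + bt)

    dist = defaultdict(float)
    keys = list(arcs)
    for combo in product(*(arcs[k] for k in keys)):
        lengths = {k: v[0] for k, v in zip(keys, combo)}
        prob = 1.0
        for v in combo:
            prob *= v[1]
        dist[shortest(lengths)] += prob       # exponential in arc count

    for length in sorted(dist):
        print(f"P(shortest path = {length}) = {dist[length]:.4f}")
    ```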

  20. Towards efficient uncertainty quantification in complex and large-scale biomechanical problems based on a Bayesian multi-fidelity scheme.

    PubMed

    Biehler, Jonas; Gee, Michael W; Wall, Wolfgang A

    2015-06-01

    Additionally, the employed approach results in a tremendous reduction of computational cost, rendering uncertainty quantification with complex patient-specific nonlinear biomechanical models practical for the first time. Second, we also analyze the impact of the uncertainty in the input parameter on mechanical quantities typically related to abdominal aortic aneurysm rupture potential, such as von Mises stress, von Mises strain, and strain energy, thus providing first estimates of the variability of these mechanical quantities due to an uncertain constitutive parameter, and revealing the potential error made by assuming population-averaged mean values in patient-specific simulations of abdominal aortic aneurysms. Moreover, the influence of the correlation length of the random field is investigated in a parameter study using MC.
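
    The multi-fidelity idea above can be sketched with a simple two-level Monte Carlo control-variate estimator: a cheap low-fidelity model evaluated many times, corrected by a few expensive high-fidelity evaluations. Both "models" below are toy functions of a random constitutive parameter; the paper's Bayesian scheme is more sophisticated than this.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    high = lambda x: np.exp(0.3 * x) + 0.05 * np.sin(5 * x)   # expensive model (toy)
    low = lambda x: 1.0 + 0.3 * x + 0.045 * x**2              # cheap surrogate (toy)

    n_low, n_high = 100_000, 50
    x_low = rng.normal(size=n_low)      # samples of the uncertain input parameter
    x_high = x_low[:n_high]             # nested design: reuse the first samples

    # Multi-fidelity estimate: cheap mean plus a high-fidelity correction term.
    estimate = low(x_low).mean() + (high(x_high) - low(x_high)).mean()
    reference = high(rng.normal(size=n_low)).mean()
    print(f"multi-fidelity: {estimate:.4f}   plain MC reference: {reference:.4f}")
    ```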

  1. Fractional channel multichannel analyzer

    DOEpatents

    Brackenbush, Larry W.; Anderson, Gordon A.

    1994-01-01

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels, thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse-height-to-pulse-width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board, achieving the effect of radically expanding the number of channels without changing the circuit board.

  2. Promoting Experimental Problem-solving Ability in Sixth-grade Students Through Problem-oriented Teaching of Ecology: Findings of an intervention study in a complex domain

    NASA Astrophysics Data System (ADS)

    Roesch, Frank; Nerb, Josef; Riess, Werner

    2015-03-01

    Our study investigated whether problem-oriented designed ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of experimental problem-solving ability better than conventional lessons in science. We used a paper-and-pencil test to assess students' abilities in a quasi-experimental intervention study utilizing a pretest/posttest control-group design (N = 340; average-performing sixth-grade students). The treatment group received lessons on forest ecosystems consistent with the principle of education for sustainable development. This learning environment was expected to help students enhance their ecological knowledge and their theoretical and methodological experimental competencies. Two control groups received either the teachers' usual lessons on forest ecosystems or non-specific lessons on other science topics. We found that the treatment promoted specific components of experimental problem-solving ability (generating epistemic questions, planning two-factorial experiments, and identifying correct experimental controls). However, the observed effects were small, and awareness of aspects of higher ecological experimental validity was not promoted by the treatment.

  3. The Benefit of Being Naïve and Knowing It: The Unfavourable Impact of Perceived Context Familiarity on Learning in Complex Problem Solving Tasks

    ERIC Educational Resources Information Center

    Beckmann, Jens F.; Goode, Natassia

    2014-01-01

    Previous research has found that embedding a problem into a familiar context does not necessarily confer an advantage over a novel context in the acquisition of new knowledge about a complex, dynamic system. In fact, it has been shown that a semantically familiar context can be detrimental to knowledge acquisition. This has been described as the…

  4. The Computer-Based Assessment of Complex Problem Solving and How It Is Influenced by Students' Information and Communication Technology Literacy

    ERIC Educational Resources Information Center

    Greiff, Samuel; Kretzschmar, André; Müller, Jonas C.; Spinath, Birgit; Martin, Romain

    2014-01-01

    The 21st-century work environment places strong emphasis on nonroutine transversal skills. In an educational context, complex problem solving (CPS) is generally considered an important transversal skill that includes knowledge acquisition and its application in new and interactive situations. The dynamic and interactive nature of CPS requires a…

  5. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1982-01-01

    FORTRAN Static Source Code Analyzer program (SAP) automatically gathers and reports statistics on occurrences of statements and structures within FORTRAN program. Provisions are made for weighting each statistic, providing user with overall figure of complexity. Statistics, as well as figures of complexity, are gathered on module-by-module basis. Overall summed statistics are accumulated for complete input source file.
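
    A toy version of what a static source analyzer like SAP does, counting FORTRAN statement types and combining the counts into a single weighted complexity figure. The statement set and the weights are illustrative assumptions, not SAP's actual statistics or weighting scheme.

    ```python
    import re

    # Illustrative statement weights (SAP's real weighting is configurable).
    WEIGHTS = {"IF": 2.0, "GOTO": 3.0, "DO": 2.0, "CALL": 1.5, "OTHER": 1.0}

    def complexity(source: str):
        """Count statement types in fixed-form FORTRAN and weight them."""
        counts = dict.fromkeys(WEIGHTS, 0)
        for line in source.upper().splitlines():
            if not line.strip():
                continue                     # blank line
            if line[:1] in ("C", "*"):
                continue                     # fixed-form comment (column 1)
            stmt = line[6:].lstrip()         # skip label/continuation columns 1-6
            for kw in ("IF", "GOTO", "DO", "CALL"):
                if re.match(rf"{kw}\b", stmt):
                    counts[kw] += 1
                    break
            else:
                counts["OTHER"] += 1
        score = sum(WEIGHTS[k] * n for k, n in counts.items())
        return counts, score

    demo = """C     SIMPLE DEMO MODULE
          DO 10 I = 1, N
          IF (A(I) .GT. AMAX) AMAX = A(I)
       10 CONTINUE
          CALL REPORT(AMAX)
    """
    counts, score = complexity(demo)
    print(counts, "weighted complexity =", score)
    ```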

  6. The ESPAT tool: a general-purpose DSS shell for solving stochastic optimization problems in complex river-aquifer systems

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury

    2015-04-01

    Stochastic programming methods are better suited to dealing with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles to their practical implementation is the lack of generalized Decision Support System (DSS) shells, which are usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named the Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded in GAMS and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results; therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPAT_R module, which performs stochastic optimization in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization in small surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer, and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments. Its size makes it possible to use either the SDP or
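
    A minimal stochastic dynamic programming (SDP) sketch of the kind of problem the ESPAT_SDP module addresses: one reservoir, discretized storage, a two-scenario inflow distribution, and a concave benefit on releases, solved by backward induction. All numbers are illustrative, and the real tool is written in GAMS, not Python.

    ```python
    import numpy as np

    STORAGES = np.arange(0, 11)            # feasible storage levels, hm^3
    INFLOWS = [(2, 0.5), (5, 0.5)]         # (inflow, probability) per stage
    N_STAGES, CAPACITY = 12, 10

    def benefit(release):
        return np.sqrt(release)            # concave water-use benefit (toy)

    value = np.zeros(len(STORAGES))        # terminal value function
    policy = np.zeros((N_STAGES, len(STORAGES)), dtype=int)

    for t in reversed(range(N_STAGES)):    # backward induction over stages
        new_value = np.empty_like(value)
        for i, s in enumerate(STORAGES):
            best, best_r = -np.inf, 0
            for r in range(s + max(q for q, _ in INFLOWS) + 1):
                # Expected immediate benefit plus value-to-go over scenarios.
                total, feasible = 0.0, True
                for q, p in INFLOWS:
                    s_next = s + q - r
                    if s_next < 0:
                        feasible = False   # release infeasible in this scenario
                        break
                    s_next = min(s_next, CAPACITY)   # spill above capacity
                    total += p * (benefit(r) + value[s_next])
                if feasible and total > best:
                    best, best_r = total, r
            new_value[i], policy[t, i] = best, best_r
        value = new_value

    print("optimal first-stage release by storage level:", policy[0])
    ```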

  7. Training Preschool Children to Use Visual Imagining as a Problem-Solving Strategy for Complex Categorization Tasks

    ERIC Educational Resources Information Center

    Kisamore, April N.; Carr, James E.; LeBlanc, Linda A.

    2011-01-01

    It has been suggested that verbally sophisticated individuals engage in a series of precurrent behaviors (e.g., covert intraverbal behavior, grouping stimuli, visual imagining) to solve problems such as answering questions (Palmer, 1991; Skinner, 1953). We examined the effects of one problem solving strategy--visual imagining--on increasing…

  8. Solving Problems.

    ERIC Educational Resources Information Center

    Hale, Norman; Lindelow, John

    Chapter 12 in a volume on school leadership, this chapter cites the work of several authorities concerning problem-solving or decision-making techniques based on the belief that group problem-solving effort is preferable to individual effort. The first technique, force-field analysis, is described as a means of dissecting complex problems into…

  9. The management of cognitive load during complex cognitive skill acquisition by means of computer-simulated problem solving.

    PubMed

    Kester, Liesbeth; Kirschner, Paul A; van Merriënboer, Jeroen J G

    2005-03-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition was integrated in the circuit diagram. It was hypothesized that learners in the integrated format would achieve better test results than learners in the split-source format. Equivalent-test problem and transfer-test problem performance were studied. Transfer-test scores confirmed the hypothesis, though no differences were found on the equivalent-test scores.

  10. Uniqueness of self-similar solutions to the Riemann problem for the Hopf equation with complex nonlinearity

    NASA Astrophysics Data System (ADS)

    Kulikovskii, A. G.; Chugainova, A. P.; Shargatov, V. A.

    2016-07-01

    Solutions of the Riemann problem for a generalized Hopf equation are studied. The solutions are constructed using a sequence of non-overturning Riemann waves and shock waves with stable stationary and nonstationary structures.

  11. Making Visible the Complexities of Problem Solving: An Ethnographic Study of a General Chemistry Course in a Studio Learning Environment

    NASA Astrophysics Data System (ADS)

    Kalainoff, Melinda Zapata

    Studio classrooms, designed such that laboratory and lecture functions can occur in the same physical space, have been recognized as a promising contributing factor in promoting collaborative learning in the sciences (NRC, 2011). Moreover, in designing for instruction, a critical goal, especially in the sciences and engineering, is to foster an environment where students have opportunities for learning problem solving practices (NRC, 2012a). However, few studies show how this type of innovative learning environment shapes opportunities for learning in the sciences, which is critical to informing future curricular and instructional designs for these environments. Even fewer studies show how studio environments shape opportunities to develop problem solving practices specifically. In order to make visible how the learning environment promotes problem solving practices, this study explores problem solving phenomena in the daily life of an undergraduate General Chemistry studio class using an ethnographic perspective. By exploring problem solving as a sociocultural process, this study shows how the instructor and students co-construct opportunities for learning in whole class and small group interactional spaces afforded in this studio environment and how the differential demands on students in doing problems requires re-conceptualizing what it means to "apply a concept".

  12. System performance analyzer

    NASA Technical Reports Server (NTRS)

    Helbig, H. R.

    1981-01-01

    The System Performance Analyzer (SPA), designed to provide accurate real-time information about the operation of complex systems and developed for use on the Airborne Data Analysis/Monitor System (ADAMS), a ROLM 1666 based system, is described. The system uses an external processor to operate an intelligent, simulated control panel. Functions are also provided to trace operations, determine the frequency of use of memory areas, and time or count user tasks in a multitask environment. This augments the information available from the standard debugger and control panel, and reduces the time and effort needed by ROLM 1666 users in optimizing their systems, as well as providing documentation of the effect of any changes. The operation and state of the system are evaluated.

  13. Bourbaki's structure theory in the problem of complex systems simulation models synthesis and model-oriented programming

    NASA Astrophysics Data System (ADS)

    Brodsky, Yu. I.

    2015-01-01

    The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. An application of Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems: model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the model-component. A model-component is endowed with a more complicated structure than, for example, the object of object-oriented analysis. This structure provides the model-component with an independent behavior: the ability to respond in standard ways to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of model-components into complexes. This fact allows one, firstly, to construct fractal models of any complexity and, secondly, to implement the computational process of such constructions uniformly, by a single universal program. In addition, the proposed paradigm allows one to exclude imperative programming and to generate computer code with a high degree of parallelism.
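
    A hedged sketch of the model-component idea described above: every component exposes the same small behavioral protocol, and a complex of components is itself a model-component, so composition nests to any depth ("fractal" models) and a single universal routine runs any construction. The protocol below is our illustration, not Brodsky's formal definition.

    ```python
    from typing import List

    class ModelComponent:
        """Uniform behavior: respond to a standard 'step' request."""
        def step(self, t: float, dt: float) -> None:
            raise NotImplementedError

    class Integrator(ModelComponent):
        def __init__(self, rate: float):
            self.state, self.rate = 0.0, rate
        def step(self, t, dt):
            self.state += self.rate * dt      # trivial internal dynamics

    class Complex(ModelComponent):
        """A complex of components is again a component: same interface."""
        def __init__(self, parts: List[ModelComponent]):
            self.parts = parts
        def step(self, t, dt):
            for part in self.parts:           # parts interact only via the
                part.step(t, dt)              # protocol, so this could run in parallel

    def run(model: ModelComponent, t_end: float, dt: float):
        # The single universal program: it never inspects what the model is.
        t = 0.0
        while t < t_end:
            model.step(t, dt)
            t += dt

    leaf = Integrator(rate=1.0)
    nested = Complex([leaf, Complex([Integrator(0.5), Integrator(2.0)])])
    run(nested, t_end=1.0, dt=0.1)
    print(f"leaf state after run: {leaf.state:.2f}")
    ```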

  14. Simple Solutions to Complex Problems: Moral Panic and the Fluid Shift from "Equity" to "Quality" in Education

    ERIC Educational Resources Information Center

    Mockler, Nicole

    2014-01-01

    Education is increasingly conceptualised by governments and policymakers in western democracies in terms of productivity and human capital, emphasising elements of individualism and competition over concerns around democracy and equity. More and more, solutions to intransigent educational problems related to equity are seen in terms of quality and…

  15. The Management of Cognitive Load During Complex Cognitive Skill Acquisition by Means of Computer-Simulated Problem Solving

    ERIC Educational Resources Information Center

    Kester, Liesbeth; Kirschner, Paul A.; van Merrienboer, Jeroen J.G.

    2005-01-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition…

  16. Tangled Narratives and Wicked Problems: A Complex Case of Positioning and Politics in a Diverse School Community

    ERIC Educational Resources Information Center

    Nguyen, Thu Suong Thi; Scribner, Samantha M. Paredes; Crow, Gary M.

    2012-01-01

    The case of Allen Elementary School presents tangled narratives and wicked problems describing the multidimensionality of school community work. Using multiple converging and diverging vignettes, the case points to the distinctiveness of individual experience in schools; the ways institutionalized organizational narratives become cultural…

  17. Advising a Bus Company on Number of Needed Buses: How High-School Physics Students' Deal With a "Complex Problem"?

    ERIC Educational Resources Information Center

    Balukovic, Jasmina; Slisko, Josip; Hadzibegovic, Zalkida

    2011-01-01

    Since 2003, international project PISA evaluates 15-year old students in solving problems that include "decision taking", "analysis and design of systems" and "trouble-shooting". This article presents the results of a pilot research conducted with 215 students from first to fourth grade of a high school in Sarajevo…

  18. The Use of the Solihull Approach with Children with Complex Neurodevelopmental Difficulties and Sleep Problems: A Case Study

    ERIC Educational Resources Information Center

    Williams, Laura; Newell, Reetta

    2013-01-01

    The following article introduces the Solihull Approach, a structured framework for intervention work with families (Douglas, "Solihull resource pack; the first five years." Cambridge: Jill Rogers Associates, 2001) and aims to demonstrate the usefulness of this approach in working with school-age children with complex neurodevelopmental…

  19. A Case-Based, Problem-Based Learning Approach to Prepare Master of Public Health Candidates for the Complexities of Global Health

    PubMed Central

    Winskell, Kate; McFarland, Deborah A.; del Rio, Carlos

    2015-01-01

    Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013–2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health–Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned. PMID:25706029

  20. Disentangling the complex association between childhood sexual abuse and alcohol-related problems: a review of methodological issues and approaches.

    PubMed

    Sartor, Carolyn E; Agrawal, Arpana; McCutcheon, Vivia V; Duncan, Alexis E; Lynskey, Michael T

    2008-09-01

    This review describes and evaluates methodological approaches aimed at unraveling the association between childhood sexual abuse (CSA) and later misuse of alcohol, which is complicated by the significant overlap between factors that elevate risk for CSA exposure and those that increase risk for problem alcohol use. We critique methods used to distinguish direct effects of CSA events on alcohol-related outcomes from the effects of risk factors frequently present in families in which CSA exposure occurs (e.g., parental alcohol-related problems). These methods include measurement and adjustment for potentially confounding factors and the use of co-twin designs. The findings across methodological approaches provide support for a CSA-specific risk for alcohol misuse, despite the significant contribution of family background factors to overall risk, but much work remains to be done before a comprehensive model for this association can be proposed. Additional directions for research, including the incorporation of measured genes and the use of longitudinal designs, are proposed to further efforts to model the pathways from CSA to alcohol-related problems.

  1. Experiences with explicit finite-difference schemes for complex fluid dynamics problems on STAR-100 and CYBER-203 computers

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.

    1982-08-01

    Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were then solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and database structures for three-dimensional computer codes, intended to eliminate or reduce page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data at each step. As a result, the number of in-core grid points was increased by 50% to 150,000, with a 10% increase in execution time. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage-limited than compute-rate-limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flow.

  2. Experiences with explicit finite-difference schemes for complex fluid dynamics problems on STAR-100 and CYBER-203 computers

    NASA Technical Reports Server (NTRS)

    Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.

    1982-01-01

    Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were then solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and database structures for three-dimensional computer codes, intended to eliminate or reduce page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data at each step. As a result, the number of in-core grid points was increased by 50% to 150,000, with a 10% increase in execution time. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage-limited than compute-rate-limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flow.

  3. Megacities in the coastal zone: Using a driver-pressure-state-impact-response framework to address complex environmental problems

    NASA Astrophysics Data System (ADS)

    Sekovski, Ivan; Newton, Alice; Dennison, William C.

    2012-01-01

    The purpose of this study was to elaborate on the role of coastal megacities in environmental degradation and their contribution to global climate change. Although less than 4 percent of the total world population resides in coastal megacities, their impact on the environment is significant due to their rapid development, high population densities, and the high consumption rates of their residents. The study was carried out by implementing a Drivers-Pressures-States-Impacts-Responses (DPSIR) framework. This analytical framework was chosen for its potential to link existing data, gathered from various previous studies, in causal relationships. In this text, coastal megacities are defined as cities exceeding 10 million inhabitants situated in the "near-coastal zone". Their high rates of consumption of food, water, space, and energy were observed and linked to the high activity rates of related economic sectors (industry, transportation, power generation, agriculture, and water extraction). In many of the studied coastal megacities, deteriorated air and water quality was observed, which, in combination with global warming, can lead to health problems and economic and social disturbance among residents. The extent of the problems varied between developing and developed countries, with higher rates of population growth and of certain harmful emissions in megacities of developing countries, as well as more problems regarding food and water shortages, sanitation, and health care support. Although certain projections predict a slowdown of growth in most coastal megacities, their future impact on the environment remains unclear due to uncertainties regarding future climate change and trajectories of consumption patterns.

  4. An integrated in silico approach to analyze the involvement of single amino acid polymorphisms in FANCD1/BRCA2-PALB2 and FANCD1/BRCA2-RAD51 complex.

    PubMed

    Doss, C George Priya; Nagasundaram, N

    2014-11-01

    Fanconi anemia (FA) is an autosomal recessive human disease characterized by genomic instability and a marked increase in cancer risk. The importance of the FANCD1 gene is manifested by the fact that deleterious amino acid substitutions in it were found to confer susceptibility to hereditary breast and ovarian cancers. Attaining experimental knowledge about possible disease-associated substitutions is laborious and time consuming. Recently introduced in silico tools for analyzing genome variation can identify deleterious variants efficiently. In this study, we conducted an in silico analysis of deleterious non-synonymous SNPs at both the functional and structural levels in the breast cancer and FA susceptibility gene BRCA2/FANCD1. To identify and characterize deleterious mutations, five in silico tools based on two different prediction methods were used, namely pathogenicity prediction (SIFT, PolyPhen, and PANTHER) and protein stability prediction (I-Mutant 2.0 and MuStab). Based on the deleterious scores that overlap in these in silico approaches, and the availability of three-dimensional structures, structure analysis was carried out for the major mutations that occur in the native protein coded by the FANCD1/BRCA2 gene. In this work, we report the results of the first molecular dynamics (MD) simulation study performed to analyze the structural-level changes over time with respect to the native and mutated protein complexes (G25R, W31C, W31R in FANCD1/BRCA2-PALB2, and F1524V, V1532F in FANCD1/BRCA2-RAD51). Analysis of the MD trajectories indicated that the predicted deleterious variants alter the structural behavior of the BRCA2-PALB2 and BRCA2-RAD51 protein complexes. In addition, statistical analysis was employed to test the significance of the in silico tool predictions. Based on these predictions, we conclude that the identification of disease-related SNPs by in silico methods, in combination with MD

  5. Self-adaptive difference method for the effective solution of computationally complex problems of boundary layer theory

    NASA Technical Reports Server (NTRS)

    Schoenauer, W.; Daeubler, H. G.; Glotz, G.; Gruening, J.

    1986-01-01

    An implicit difference procedure for the solution of equations for a chemically reacting hypersonic boundary layer is described. Difference forms of arbitrary error order in the x and y coordinate plane were used to derive estimates for discretization error. Computational complexity and time were minimized by the use of this difference method and the iteration of the nonlinear boundary layer equations was regulated by discretization error. Velocity and temperature profiles are presented for Mach 20.14 and Mach 18.5; variables are velocity profiles, temperature profiles, mass flow factor, Stanton number, and friction drag coefficient; three figures include numeric data.

  6. Inverse problem of the multislice method in retrieving projected complex potentials from the exit-wave function.

    PubMed

    Lin, Fang; Jin, Chuanhong

    2014-03-01

    We propose a new algorithm that retrieves the projected potentials of an object from its exit wave (EW). The algorithm is based on the traditional multislice method, which involves a convolution operation in the calculation. The retrieved potential is complex, including both electrostatic and absorptive components. Tests with simulated exit waves of a 200 K InP crystal prove the algorithm effective for objects over a wide thickness range. For thick specimens, where dynamical electron diffraction prevails, the retrieved potential can present structural and chemical information about the object by completely mapping an atom's scattering potential during its interaction with the incident electrons. PMID:24361232
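
    For orientation, here is one forward pass of the conventional multislice method that the inversion above runs in reverse: transmit the wave through a thin slice (phase grating), then propagate to the next slice with the Fresnel propagator. The grid, the approximate 200 kV interaction constant, and the toy slice potential are illustrative assumptions, not the authors' code.

    ```python
    import numpy as np

    n, px = 256, 0.2e-10          # grid size and pixel size (m): ~5 nm field of view
    wavelength = 2.51e-12         # electron wavelength at ~200 kV (m)
    sigma = 7.29e7                # interaction constant at ~200 kV (rad / (V m)), approx.
    dz = 2.0e-10                  # slice thickness (m)

    x = (np.arange(n) - n // 2) * px
    X, Y = np.meshgrid(x, x)
    # Toy projected potential of one slice (V m): a single Gaussian "atom column".
    v_slice = 3e-9 * np.exp(-(X**2 + Y**2) / (0.5e-10) ** 2)

    k = np.fft.fftfreq(n, d=px)                   # spatial frequencies (1/m)
    KX, KY = np.meshgrid(k, k)
    propagator = np.exp(-1j * np.pi * wavelength * dz * (KX**2 + KY**2))

    psi = np.ones((n, n), dtype=complex)          # plane-wave illumination
    for _ in range(20):                           # 20 identical slices
        psi *= np.exp(1j * sigma * v_slice)       # transmit (phase grating)
        psi = np.fft.ifft2(np.fft.fft2(psi) * propagator)   # propagate by dz

    print("exit-wave intensity min/max:",
          np.abs(psi).min() ** 2, np.abs(psi).max() ** 2)
    ```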

  7. Downhole Fluid Analyzer Development

    SciTech Connect

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas, and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water, and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components (light sources, sensors, detection electronics, and software) will be located at the surface and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water, and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  8. Analyzing Software Piracy in Education.

    ERIC Educational Resources Information Center

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  9. You Can't Get There From Here! Problems and Potential Solutions in Developing New Classes of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    The explosion of capabilities and new products within the sphere of Information Technology (IT) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions about the quality and correctness of software. These assumptions are encouraged by software producers and vendors, who at this late date have not succeeded in finding an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts that involve large swarms of small spacecraft engaging cooperatively to achieve science goals. Such missions entail levels of complexity that beg for system development methods far beyond today's, which are inadequate for ensuring correct behavior of large numbers of interacting intelligent mission elements. New system development techniques recently devised through NASA-led research offer innovative approaches to achieving correctness in complex system development, including autonomous swarm missions that exhibit emergent behavior, as well as general software products created by the computing industry.

  10. By way of introduction: modelling living systems, their diversity and their complexity: some methodological and theoretical problems.

    PubMed

    Pavé, Alain

    2006-01-01

    Some principles of a current methodology for modelling biological systems are presented. It seems possible to promote a model-centred approach to these complex systems. Among present questions, the role of mechanisms producing random or quasi-random outcomes is underlined, because they are implicated in biological diversification and in the resulting complexity of living systems. Biodiversity is now one of the main concerns of our societies and of scientific research. Basically, it can be interpreted as a way for Life to resist environmental hazards. Thus, one may assume that the mechanisms producing biodiversity could have been selected during evolution to confront the corresponding risks of disappearance: a necessity of chance? Therefore, analysing and modelling these 'biological and ecological roulettes' would be important, and not only their outputs, as is done nowadays using the theory of probabilities. It is then suggested that chaotic behaviours generated by deterministic dynamical systems could mimic random processes, and that 'biological and ecological roulettes' could be represented by such models. Practical consequences can be envisaged in terms of biodiversity management and, more generally, in terms of controlling these 'roulettes' to generate selected distributions of biological and ecological events.

  11. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within FORTRAN program and provides reports of those statistics. Provisions made for weighting each statistic and provide an overall figure of complexity.

  12. A quasi-optimal coarse problem and an augmented Krylov solver for the variational theory of complex rays

    NASA Astrophysics Data System (ADS)

    Kovalevsky, Louis; Gosselet, Pierre

    2016-09-01

    The Variational Theory of Complex Rays (VTCR) is an indirect Trefftz method designed to study systems governed by Helmholtz-like equations. It uses wave functions to represent the solution inside elements, which reduces the dispersion error compared to classical polynomial approaches but the resulting system is prone to be ill conditioned. This paper gives a simple and original presentation of the VTCR using the discontinuous Galerkin framework and it traces back the ill-conditioning to the accumulation of eigenvalues near zero for the formulation written in terms of wave amplitude. The core of this paper presents an efficient solving strategy that overcomes this issue. The key element is the construction of a search subspace where the condition number is controlled at the cost of a limited decrease of attainable precision. An augmented LSQR solver is then proposed to solve efficiently and accurately the complete system. The approach is successfully applied to different examples.
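
    The conditioning cure described above can be illustrated in miniature (this is not the authors' solver: the paper uses an augmented LSQR, while a truncated SVD below shows the same search-subspace idea). A Helmholtz-like plane-wave basis yields a nearly singular system; restricting the solve to the subspace of singular vectors above a threshold controls the condition number at the cost of a limited loss of attainable precision.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 40
    angles = np.linspace(0, 2 * np.pi, n, endpoint=False)
    pts = rng.uniform(-1, 1, size=(n, 2))
    k = 30.0
    # Plane-wave (Trefftz-style) basis evaluated at random collocation points.
    A = np.exp(1j * k * (pts @ np.array([np.cos(angles), np.sin(angles)])))
    b = A @ (rng.normal(size=n) + 1j * rng.normal(size=n))

    U, s, Vh = np.linalg.svd(A)
    print("full condition number: %.1e" % (s[0] / s[-1]))

    tol = 1e-8 * s[0]                 # keep only well-conditioned directions
    r = int(np.sum(s > tol))
    x = (Vh[:r].conj().T / s[:r]) @ (U[:, :r].conj().T @ b)
    print("kept %d of %d modes, residual %.1e" % (r, n, np.linalg.norm(A @ x - b)))
    ```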

  13. Approximate solutions to a nonintegrable problem of propagation of elliptically polarised waves in an isotropic gyrotropic nonlinear medium, and periodic analogues of multisoliton complexes

    SciTech Connect

    Makarov, V A; Petnikova, V M; Potravkin, N N; Shuvalov, V V

    2014-02-28

    Using the linearization method, we obtain approximate solutions to a one-dimensional nonintegrable problem of propagation of elliptically polarised light waves in an isotropic gyrotropic medium with local and nonlocal components of the Kerr nonlinearity and group-velocity dispersion. The consistent evolution of two orthogonal circularly polarised components of the field is described analytically in the case when their phases vary linearly during propagation. The conditions are determined for the excitation of waves with a regular and 'chaotic' change in the polarisation state. The character of the corresponding nonlinear solutions, i.e., periodic analogues of multisoliton complexes, is analysed. (nonlinear optical phenomena)

  14. Blood Gas Analyzers.

    PubMed

    Gonzalez, Anthony L; Waddell, Lori S

    2016-03-01

    Acid-base and respiratory disturbances are common in sick and hospitalized veterinary patients; therefore, blood gas analyzers have become integral diagnostic and monitoring tools. This article will discuss uses of blood gas analyzers, types of samples that can be used, sample collection methods, potential sources of error, and potential alternatives to blood gas analyzers and their limitations. It will also discuss the types of analyzers that are available, logistical considerations that should be taken into account when purchasing an analyzer, and the basic principles of how these analyzers work. PMID:27451046

  15. Attention-deficit hyperactivity disorder (ADHD), substance use disorders, and criminality: a difficult problem with complex solutions.

    PubMed

    Knecht, Carlos; de Alvaro, Raquel; Martinez-Raga, Jose; Balanza-Martinez, Vicent

    2015-05-01

    The association between attention-deficit hyperactivity disorder (ADHD) and criminality has been increasingly recognized as an important societal concern. Studies conducted in different settings have revealed high rates of ADHD among adolescent offenders. The risk for criminal behavior among individuals with ADHD is increased when there is psychiatric comorbidity, particularly conduct disorder and substance use disorder. The present report aims to systematically review the literature on the epidemiological, neurobiological, and other risk factors contributing to this association, as well as the key aspects of the assessment, diagnosis, and treatment of ADHD among offenders. A systematic literature search of electronic databases (PubMed, EMBASE, and PsycINFO) was conducted to identify potentially relevant studies published in English, in peer-reviewed journals. Studies conducted in various settings within the judicial system and in many different countries suggest that the rate of adolescent and adult inmates with ADHD far exceeds that reported in the general population; however, underdiagnosis is common. Similarly, follow-up studies of children with ADHD have revealed high rates of criminal behaviors, arrests, convictions, and imprisonment in adolescence and adulthood. Assessment of ADHD and comorbid conditions requires an ongoing and careful process. When treating offenders or inmates with ADHD, who commonly present with other comorbid psychiatric disorders, complex, comprehensive, and tailored interventions combining pharmacological and psychosocial strategies are likely to be needed. PMID:25411986

  17. Wideband digital spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Morris, G. A., Jr.; Wilck, H. C.

    1979-01-01

    Modular spectrum analyzer consisting of RF receiver, fast fourier transform spectrum analyzer, and data processor samples stochastic signals in 220 channels. Construction reduces design and fabrication costs of assembled unit.

  18. Image quality analyzer

    NASA Astrophysics Data System (ADS)

    Lukin, V. P.; Botugina, N. N.; Emaleev, O. N.; Antoshkin, L. V.; Konyaev, P. A.

    2012-07-01

    An image quality analyzer (IQA), used to assess the efficiency of adaptive optics systems, is described. The analyzer estimates image quality according to three different criteria: contrast, sharpness, and a spectral criterion. The analyzer is currently installed on the Big Solar Vacuum Telescope for routine operation, where it allows the most contrasted solar images to be selected during observations. It is further planned to use the analyzer as part of the ANGARA adaptive correction system.

  19. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence

    PubMed Central

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H.

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students’ CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence. PMID:26283992

  1. A Study To Identify and Analyze the Perceived Nature and Causes of the English Language-Based Problems and the Coping Strategies of the Indonesian and Malaysian Students Studying in an American University.

    ERIC Educational Resources Information Center

    Ali, M. Solaiman

    A study investigated the nature and causes of the English language-based problems and the coping strategies of 44 Indonesian and 57 Malaysian students studying at Indiana University, Bloomington. The Indonesian and Malaysian student groups represented non-Commonwealth and Commonwealth students sharing the same native language roots but differed in…

  2. Explorations of the Concept of Local Capacity for Problem Solving: An Introduction to a Series of Papers Analyzing Nine School Improvement Projects. Draft. Documentation and Technical Assistance in Urban Schools.

    ERIC Educational Resources Information Center

    Wilson, Stephen H.

    A model for enhancing the local capacity of urban schools for solving problems by restructuring school settings is the subject of this paper. In identifying the strengths and weaknesses of such a concept, the paper reviews data from nine sites studied by the Documentation and Technical Assistance (DTA) project for their applicability to other…

  3. Imbalance problem in community detection

    NASA Astrophysics Data System (ADS)

    Sun, Peng Gang

    2016-09-01

    Community detection gives us a simple way to understand the structure of complex networks. However, there is an imbalance problem in community detection. This paper first introduces the imbalance problem and then proposes a new measure to alleviate it. In addition, we study two variants of the measure and further analyze the resolution scale of community detection. Finally, we compare our approach with some state-of-the-art methods for community detection on random networks as well as real-world networks. Both the theoretical analysis and the experimental results show that our approach achieves better performance for community detection. We also find that our approach tends to preferentially separate densely connected subgroups.
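
    The paper's measure is not reproduced here; as a generic point of reference, this scores a balanced and a lopsided partition of a small two-group graph with standard modularity, the kind of quality function such proposals are compared against.

      import networkx as nx
      from networkx.algorithms.community import modularity

      G = nx.barbell_graph(5, 0)             # two 5-cliques joined by one edge
      balanced = [set(range(5)), set(range(5, 10))]
      lopsided = [set(range(9)), {9}]

      print(modularity(G, balanced))         # higher: respects the dense groups
      print(modularity(G, lopsided))         # lower: the imbalanced split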

  4. Collab-Analyzer: An Environment for Conducting Web-Based Collaborative Learning Activities and Analyzing Students' Information-Searching Behaviors

    ERIC Educational Resources Information Center

    Wu, Chih-Hsiang; Hwang, Gwo-Jen; Kuo, Fan-Ray

    2014-01-01

    Researchers have found that students might get lost or feel frustrated while searching for information on the Internet to deal with complex problems without real-time guidance or supports. To address this issue, a web-based collaborative learning system, Collab-Analyzer, is proposed in this paper. It is not only equipped with a collaborative…

  5. Analyzing Peace Pedagogies

    ERIC Educational Resources Information Center

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  6. Portable automatic blood analyzer

    NASA Technical Reports Server (NTRS)

    Coleman, R. L.

    1975-01-01

    Analyzer employs chemical-sensing electrodes for determination of blood gas and ion concentrations. It is rugged, easily serviced, and comparatively simple to operate. System can analyze up to eight parameters and can be modified to measure other blood constituents, including nonionic species such as urea, glucose, and oxygen.

  7. Analyzing Costs of Services.

    ERIC Educational Resources Information Center

    Cox, James O.; Black, Talbot

    A simplified method to gather and analyze cost data is presented for administrators of Handicapped Children's Early Education Programs, and specifically for members of the Technical Assistance Development System, North Carolina. After identifying benefits and liabilities associated with analyzing program costs, attention is focused on the internal…

  8. Rigged or rigorous? Partnerships for research and evaluation of complex social problems: Lessons from the field of violence against women and girls.

    PubMed

    Zimmerman, Cathy; Michau, Lori; Hossain, Mazeda; Kiss, Ligia; Borland, Rosilyne; Watts, Charlotte

    2016-09-01

    There is growing demand for robust evidence to address complex social phenomena such as violence against women and girls (VAWG). Research partnerships between scientists and non-governmental or international organizations (NGO/IO) are increasingly popular, but can pose challenges, including concerns about potential conflicts of interest. Drawing on our experience collaborating on VAWG research, we describe challenges and contributions that NGO/IO and academic partners can make at different stages of the research process and the effects that collaborations can have on scientific inquiry. Partners may struggle with differing priorities and misunderstandings about roles, limitations, and intentions. Benefits of partnerships include a shared vision of study goals, differing and complementary expertise, mutual respect, and a history of constructive collaboration. Our experience suggests that when investigating multi-faceted social problems, instead of 'rigging' study results, research collaborations can strengthen scientific rigor and offer the greatest potential for impact in the communities we seek to serve. PMID:27638245

  9. Analyzing water resources

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Report on water resources discusses problems in water measurement, demand, use, and availability. Also discussed are sensing accuracies, parameter monitoring, and status of forecasting, modeling, and future measurement techniques.

  10. The "Performance of Rotavirus and Oral Polio Vaccines in Developing Countries" (PROVIDE) study: description of methods of an interventional study designed to explore complex biologic problems.

    PubMed

    Kirkpatrick, Beth D; Colgate, E Ross; Mychaleckyj, Josyf C; Haque, Rashidul; Dickson, Dorothy M; Carmolli, Marya P; Nayak, Uma; Taniuchi, Mami; Naylor, Caitlin; Qadri, Firdausi; Ma, Jennie Z; Alam, Masud; Walsh, Mary Claire; Diehl, Sean A; Petri, William A

    2015-04-01

    Oral vaccines appear less effective in children in the developing world. Proposed biologic reasons include concurrent enteric infections, malnutrition, breast milk interference, and environmental enteropathy (EE). Rigorous study design and careful data management are essential to begin to understand this complex problem while assuring research subject safety. Herein, we describe the methodology and lessons learned in the PROVIDE study (Dhaka, Bangladesh). A randomized clinical trial platform evaluated the efficacy of delayed-dose oral rotavirus vaccine as well as the benefit of an injectable polio vaccine replacing one dose of oral polio vaccine. This rigorous infrastructure supported the additional examination of hypotheses of vaccine underperformance. Primary and secondary efficacy and immunogenicity measures for rotavirus and polio vaccines were measured, as well as the impact of EE and additional exploratory variables. Methods for the enrollment and 2-year follow-up of a 700 child birth cohort are described, including core laboratory, safety, regulatory, and data management practices. Intense efforts to standardize clinical, laboratory, and data management procedures in a developing world setting provide clinical trials rigor to all outcomes. Although this study infrastructure requires extensive time and effort, it allows optimized safety and confidence in the validity of data gathered in complex, developing country settings.

  11. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input…
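
    A small sketch of the "monitor" category: a dynamic analyzer that watches an actual run and counts line executions, information a static analyzer cannot obtain without input and execution.

      import sys

      def count_lines(func, *args):
          """Run func under a tracer and count executions of each line."""
          counts = {}
          def tracer(frame, event, arg):
              if event == "line":
                  key = (frame.f_code.co_name, frame.f_lineno)
                  counts[key] = counts.get(key, 0) + 1
              return tracer
          sys.settrace(tracer)
          try:
              func(*args)
          finally:
              sys.settrace(None)
          return counts

      def demo(n):
          total = 0
          for i in range(n):
              total += i
          return total

      for (name, line), hits in sorted(count_lines(demo, 5).items()):
          print(f"{name}:{line} ran {hits}x")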

  12. Solvent dependent photosensitized singlet oxygen production from an Ir(III) complex: pointing to problems in studies of singlet-oxygen-mediated cell death.

    PubMed

    Takizawa, Shin-ya; Breitenbach, Thomas; Westberg, Michael; Holmegaard, Lotte; Gollmer, Anita; Jensen, Rasmus L; Murata, Shigeru; Ogilby, Peter R

    2015-10-01

    A cationic cyclometallated Ir(III) complex with 1,10-phenanthroline and 2-phenylpyridine ligands photosensitizes the production of singlet oxygen, O2(a(1)Δ(g)), with yields that depend appreciably on the solvent. In water, the quantum yield of photosensitized O2(a(1)Δ(g)) production is small (ϕ(Δ) = 0.036 ± 0.008), whereas in less polar solvents, the quantum yield is much larger (ϕ(Δ) = 0.54 ± 0.05 in octan-1-ol). A solvent effect on ϕ(Δ) of this magnitude is rarely observed and, in this case, is attributed to charge-transfer-mediated processes of non-radiative excited state deactivation that are more pronounced in polar solvents and that kinetically compete with energy transfer to produce O2(a(1)Δ(g)). A key component of this non-radiative deactivation process, electronic-to-vibrational energy transfer, is also manifested in pronounced H2O/D2O isotope effects that indicate appreciable coupling between the Ir(III) complex and water. This Ir(III) complex is readily incorporated into HeLa cells and, upon irradiation, is cytotoxic as a consequence of the O2(a(1)Δ(g)) thus produced. The data reported herein point to a pervasive problem in mechanistic studies of photosensitized O2(a(1)Δ(g))-mediated cell death: care must be exercised when interpreting the effective cytotoxicity of O2(a(1)Δ(g)) photosensitizers whose photophysical properties depend strongly on the local environment. Specifically, the photophysics of the sensitizer in bulk solutions may not accurately reflect its intracellular behavior, and the control and quantification of the O2(a(1)Δ(g)) "dose" can be difficult in vivo.

  13. Software Design Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows for expression of design as picture of program.

  14. Automatic amino acid analyzer

    NASA Technical Reports Server (NTRS)

    Berdahl, B. J.; Carle, G. C.; Oyama, V. I.

    1971-01-01

    Analyzer operates unattended for up to 15 hours. It has an automatic sample injection system and can be programmed. All fluid-flow valve switching is accomplished pneumatically from miniature three-way solenoid pilot valves.

  15. Analyzing binding data.

    PubMed

    Motulsky, Harvey J; Neubig, Richard R

    2010-07-01

    Measuring the rate and extent of radioligand binding provides information on the number of binding sites and on the affinity and accessibility of these binding sites for various drugs. This unit explains how to design and analyze such experiments.

  16. Soil Rock Analyzer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A redesigned version of a soil/rock analyzer developed by Martin Marietta under a Langley Research Center contract is being marketed by Aurora Tech, Inc. Known as the Aurora ATX-100, it has self-contained power, an oscilloscope, a liquid crystal readout, and a multichannel spectrum analyzer. It measures energy emissions to determine what elements in what percentages a sample contains. It is lightweight and may be used for mineral exploration, pollution monitoring, etc.

  17. Managing healthcare information: analyzing trust.

    PubMed

    Söderström, Eva; Eriksson, Nomie; Åhlfeldt, Rose-Mharie

    2016-08-01

    Purpose - The purpose of this paper is to analyze two case studies with a trust matrix tool, to identify trust issues related to electronic health records. Design/methodology/approach - A qualitative research approach is applied using two case studies. The data analysis of these studies generated a problem list, which was mapped to a trust matrix. Findings - Results demonstrate flaws in current practices and point to achieving balance between organizational, person and technology trust perspectives. The analysis revealed three challenge areas, to: achieve higher trust in patient-focussed healthcare; improve communication between patients and healthcare professionals; and establish clear terminology. By taking trust into account, a more holistic perspective on healthcare can be achieved, where trust can be obtained and optimized. Research limitations/implications - A trust matrix is tested and shown to identify trust problems on different levels and relating to trusting beliefs. Future research should elaborate and more fully address issues within three identified challenge areas. Practical implications - The trust matrix's usefulness as a tool for organizations to analyze trust problems and issues is demonstrated. Originality/value - Healthcare trust issues are captured to a greater extent and from previously unchartered perspectives. PMID:27477934

  18. Total organic carbon analyzer

    NASA Technical Reports Server (NTRS)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    1991-01-01

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  19. Electrosurgical unit analyzers.

    PubMed

    1998-07-01

    Electrosurgical unit (ESU) analyzers automate the testing and inspection of the output circuits and safety features of ESUs. They perform testing that would otherwise require several other pieces of equipment, as well as considerably more time and greater technician expertise. They are used largely by clinical engineering departments for routine inspection and preventive maintenance (IPM) procedures and, less often, for accident investigations and troubleshooting. In this Evaluation, we tested three ESU analyzers from three suppliers. We rated all three analyzers Acceptable and ranked them in two groupings. In ranking the units, we placed the greatest weight on ease of use for routine ESU inspections, and gave additional consideration to versatility for advanced applications such as ESU research. The unit in Group 1 was the easiest to use, especially for infrequent users. The units in Group 2 were satisfactory but require more frequent use to maintain proficiency and to avoid user errors. PMID:9689540

  20. Analyzing radioligand binding data.

    PubMed

    Motulsky, Harvey; Neubig, Richard

    2002-08-01

    Radioligand binding experiments are easy to perform, and provide useful data in many fields. They can be used to study receptor regulation, discover new drugs by screening for compounds that compete with high affinity for radioligand binding to a particular receptor, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling, via measurements of agonist binding and its regulation by ions, nucleotides, and other allosteric modulators. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.
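
    A minimal sketch of the nonlinear-regression step this unit describes, fitting the standard one-site saturation model Y = Bmax*X/(Kd + X); the binding data below are synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      def one_site(x, bmax, kd):
          """Specific binding for a single class of sites."""
          return bmax * x / (kd + x)

      conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])    # radioligand, nM
      bound = np.array([9, 24, 52, 80, 98, 106, 111])   # bound, fmol/mg

      (bmax, kd), cov = curve_fit(one_site, conc, bound, p0=(120, 5))
      print(f"Bmax = {bmax:.1f} fmol/mg, Kd = {kd:.2f} nM")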

  1. Using Networks to Visualize and Analyze Process Data for Educational Assessment

    ERIC Educational Resources Information Center

    Zhu, Mengxiao; Shu, Zhan; von Davier, Alina A.

    2016-01-01

    New technology enables interactive and adaptive scenario-based tasks (SBTs) to be adopted in educational measurement. At the same time, it is a challenging problem to build appropriate psychometric models to analyze data collected from these tasks, due to the complexity of the data. This study focuses on process data collected from SBTs. We…

  2. List mode multichannel analyzer

    SciTech Connect

    Archer, Daniel E.; Luke, S. John; Mauger, G. Joseph; Riot, Vincent J.; Knapp, David A.

    2007-08-07

    A digital list mode multichannel analyzer (MCA) built around a programmable FPGA device for onboard data analysis and on-the-fly modification of system detection/operating parameters, and capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or in list mode as a list mode MCA.
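
    A numpy sketch of the two modes in data terms: list mode keeps the raw (timestamp, channel) event pairs, while histogramming mode bins them into one spectrum per sub-millisecond time bin; the events are synthetic.

      import numpy as np

      rng = np.random.default_rng(1)
      n_events = 10_000
      timestamps = np.sort(rng.uniform(0.0, 1.0, n_events))  # seconds
      channels = rng.integers(0, 1024, n_events)             # ADC channels

      bin_s = 0.5e-3                                         # 0.5 ms time bins
      t_edges = np.arange(0.0, 1.0 + bin_s, bin_s)
      spectra, _, _ = np.histogram2d(timestamps, channels,
                                     bins=(t_edges, np.arange(1025)))
      print(spectra.shape, int(spectra.sum()))  # (2000, 1024), all 10000 events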

  3. Analyzing Workforce Education. Monograph.

    ERIC Educational Resources Information Center

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  4. Electronic sleep analyzer

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.

    1970-01-01

    Electronic instrument automatically monitors the stages of sleep of a human subject. The analyzer provides a series of discrete voltage steps with each step corresponding to a clinical assessment of level of consciousness. It is based on the operation of an EEG and requires very little telemetry bandwidth or time.

  5. Micro acoustic spectrum analyzer

    DOEpatents

    Schubert, W. Kent; Butler, Michael A.; Adkins, Douglas R.; Anderson, Larry F.

    2004-11-23

    A micro acoustic spectrum analyzer for determining the frequency components of a fluctuating sound signal comprises a microphone to pick up the fluctuating sound signal and produce an alternating current electrical signal; at least one microfabricated resonator, each resonator having a different resonant frequency, that vibrates in response to the alternating current electrical signal; and at least one detector to detect the vibration of the microfabricated resonators. The micro acoustic spectrum analyzer can further comprise a mixer to mix a reference signal with the alternating current electrical signal from the microphone to shift the frequency spectrum to a frequency range that is better matched to the resonant frequencies of the microfabricated resonators. The micro acoustic spectrum analyzer can be designed specifically for portability, size, cost, accuracy, speed, power requirements, and use in a harsh environment. The micro acoustic spectrum analyzer is particularly suited for applications where size, accessibility, and power requirements are limited, such as the monitoring of industrial equipment and processes, detection of security intrusions, or evaluation of military threats.
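
    A numpy sketch of the mixer stage described above: multiplying the input by a reference tone produces sum and difference frequencies, shifting the spectrum toward the resonators' band. All frequencies are illustrative.

      import numpy as np

      fs = 200_000                                  # sample rate, Hz
      t = np.arange(0, 0.05, 1 / fs)
      signal = np.sin(2 * np.pi * 40_000 * t)       # 40 kHz input tone
      reference = np.sin(2 * np.pi * 35_000 * t)    # local oscillator

      mixed = signal * reference                    # tones at 5 kHz and 75 kHz
      spectrum = np.abs(np.fft.rfft(mixed))
      freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
      print(sorted(freqs[np.argsort(spectrum)[-2:]]))  # ~[5000.0, 75000.0]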

  6. PULSE AMPLITUDE ANALYZER

    DOEpatents

    Greenblatt, M.H.

    1958-03-25

    This patent pertains to pulse amplitude analyzers for sorting and counting a series of pulses, and specifically discloses an analyzer which is simple in construction and presents the pulse height distribution visually on an oscilloscope screen. According to the invention, the pulses are applied to the vertical deflection plates of an oscilloscope and trigger the horizontal sweep. Each pulse starts at the same point on the screen and has a maximum amplitude substantially along the same vertical line. A mask is placed over the screen except for a slot running along the line where the maximum amplitudes of the pulses appear. After the slot has been scanned by a photocell in combination with a slotted rotating disk, the photocell signal is displayed on an auxiliary oscilloscope as vertical deflection along a horizontal time base to portray the pulse amplitude distribution.

  7. Analyzing radioligand binding data.

    PubMed

    Motulsky, H; Neubig, R

    2001-05-01

    A radioligand is a radioactively labeled drug that can associate with a receptor, transporter, enzyme, or any protein of interest. Measuring the rate and extent of binding provides information on the number of binding sites, and their affinity and accessibility for various drugs. Radioligand binding experiments are easy to perform, and provide useful data in many fields. For example, radioligand binding studies are used to study receptor regulation, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.

  8. Analyzing Optical Communications Links

    NASA Technical Reports Server (NTRS)

    Marshall, William K.; Burk, Brian D.

    1990-01-01

    Optical Communication Link Analysis Program, OPTI, analyzes optical and near-infrared communication links using pulse-position modulation (PPM) and direct detection. Link margins and design-control tables generated from input parameters supplied by user. Enables user to save sets of input parameters that define given link and read them back into program later. Alters automatically any of input parameters to achieve desired link margin. Written in FORTRAN 77.
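
    The flavor of arithmetic behind such link design-control tables: with everything in decibels, gains add and losses subtract. The numbers below are illustrative placeholders, not OPTI defaults.

      tx_power_dbm = 30.0      # 1 W laser
      tx_gain_db = 114.0       # transmit telescope gain
      path_loss_db = -290.0    # free-space loss over the link
      rx_gain_db = 110.0       # receive telescope gain
      pointing_loss_db = -3.0
      required_dbm = -42.0     # sensitivity for the target PPM error rate

      received_dbm = (tx_power_dbm + tx_gain_db + path_loss_db
                      + rx_gain_db + pointing_loss_db)
      margin_db = received_dbm - required_dbm
      print(f"received {received_dbm:.1f} dBm, margin {margin_db:.1f} dB")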

  9. Magnetoresistive emulsion analyzer.

    PubMed

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening. PMID:23989504

  11. Gauge cooling for the singular-drift problem in the complex Langevin method — a test in Random Matrix Theory for finite density QCD

    NASA Astrophysics Data System (ADS)

    Nagata, Keitaro; Nishimura, Jun; Shimasaki, Shinji

    2016-07-01

    Recently, the complex Langevin method has been applied successfully to finite density QCD either in the deconfinement phase or in the heavy dense limit with the aid of a new technique called gauge cooling. In the confinement phase with light quarks, however, convergence to wrong limits occurs due to the singularity in the drift term caused by small eigenvalues of the Dirac operator including the mass term. We propose that this singular-drift problem can also be overcome by gauge cooling with different criteria for choosing the complexified gauge transformation. The idea is tested in chiral Random Matrix Theory for finite density QCD, where exact results are reproduced at zero temperature with light quarks. It is shown that the gauge cooling indeed drastically changes the eigenvalue distribution of the Dirac operator measured during the Langevin process. Despite its non-holomorphic nature, this eigenvalue distribution has a universal diverging behavior at the origin in the chiral limit due to a generalized Banks-Casher relation, as we confirm explicitly.
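
    A one-variable toy showing only the complex Langevin update itself (no gauge cooling): for the Gaussian action S = sigma*z**2/2 with complex sigma, the variable is complexified, the drift is -sigma*z, and <z**2> should approach 1/sigma. This shares nothing with the paper's QCD setting beyond the update rule.

      import numpy as np

      rng = np.random.default_rng(2)
      sigma = 1.0 + 1.0j
      eps, n_steps, n_therm = 1e-3, 400_000, 50_000

      z, samples = 0.0 + 0.0j, []
      for step in range(n_steps):
          # drift term plus real Gaussian noise of variance 2*eps
          z = z - sigma * z * eps + np.sqrt(2 * eps) * rng.standard_normal()
          if step >= n_therm:
              samples.append(z * z)

      print(np.mean(samples), "expected", 1 / sigma)   # ~0.5-0.5j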

  12. Design Problems for Secondary Students

    ERIC Educational Resources Information Center

    Jonassen, David H.

    2011-01-01

    Are there different kinds of design problems? Jonassen (2011) argued that problems vary in terms of structuredness, complexity, and context. On the structuredness and complexity continua, design problems tend to be the most ill-structured and complex. Brown and Chandrasekaran suggest that design problems may vary along a continuum from…

  13. Portable Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The Michromonitor M500 universal gas analyzer contains a series of miniature modules, each of which is a complete gas chromatograph, an instrument which separates a gaseous mixture into its components and measures the concentrations of each gas in the mixture. The system is manufactured by Microsensor Technology, and is used for environmental analysis, monitoring for gas leaks and chemical spills, compliance with pollution laws, etc. The technology is based on a Viking attempt to detect life on Mars. Ames/Stanford miniaturized the system and NIOSH funded further development. Three Stanford researchers commercialized the technology, which can be operated by unskilled personnel.

  14. RELAP5 desktop analyzer

    SciTech Connect

    Beelman, R.J.; Grush, W.H.; Mortensen, G.A.; Snider, D.M.; Wagner, K.L.

    1989-01-01

    The previously mainframe-bound RELAP5 reactor safety computer code has been installed on a microcomputer. A simple color-graphic display driver has been developed to enable the user to view the code results as the calculation advances. In order to facilitate future interactive desktop applications, the Nuclear Plant Analyzer (NPA), also previously mainframe-bound, is being redesigned to encompass workstation applications. The marriage of RELAP5 simulation capabilities with NPA interactive graphics on a desktop workstation promises to revolutionize reactor safety analysis methodology. 8 refs.

  15. Fluorescence analyzer for lignin

    DOEpatents

    Berthold, John W.; Malito, Michael L.; Jeffers, Larry

    1993-01-01

    A method and apparatus for measuring lignin concentration in a sample of wood pulp or black liquor comprises a light emitting arrangement for emitting an excitation light through optical fiber bundles into a probe which has an undiluted sensing end facing the sample. The excitation light causes the lignin to produce fluorescent emission light, which is then conveyed through the probe to analyzing equipment that measures the intensity of the emission light. This invention was made with Government support under Contract Number DE-FC05-90CE40905 awarded by the Department of Energy (DOE). The Government has certain rights in this invention.

  16. Exploiting phase transitions for fusion optimization problems

    NASA Astrophysics Data System (ADS)

    Svenson, Pontus

    2005-05-01

    Many optimization problems that arise in multi-target tracking and fusion applications are known to be NP-complete, i.e., believed to have worst-case complexities that are exponential in problem size. Recently, many such NP-complete problems have been shown to display threshold phenomena: it is possible to define a parameter such that the probability of a random problem instance having a solution jumps from 1 to 0 at a specific value of the parameter. It is also found that the amount of resources needed to solve the problem instance peaks at the transition point. Among the problems found to display this behavior are graph coloring (aka clustering, relevant for multi-target tracking), satisfiability (which occurs in resource allocation and planning problems), and the travelling salesperson problem. Physicists studying these problems have found intriguing similarities to phase transitions in spin models of statistical mechanics. Many methods previously used to analyze spin glasses have been used to explain some of the properties of the behavior at the transition point. It turns out that the transition happens because the fitness landscape of the problem changes as the parameter is varied. Some algorithms have been introduced that exploit this knowledge of the structure of the fitness landscape. In this paper, we review some of the experimental and theoretical work on threshold phenomena in optimization problems and indicate how optimization problems from tracking and sensor resource allocation could be analyzed using these results.
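
    A small experiment reproducing the threshold behaviour described here for random 3-SAT, whose satisfiability probability drops sharply near a clause-to-variable ratio of about 4.3. Brute force keeps the sketch self-contained, so the sizes are deliberately tiny.

      import itertools, random

      def random_3sat(n_vars, n_clauses, rng):
          # literals are +/-(v+1); each clause uses three distinct variables
          return [tuple(rng.choice([v + 1, -(v + 1)])
                        for v in rng.sample(range(n_vars), 3))
                  for _ in range(n_clauses)]

      def satisfiable(n_vars, clauses):
          for bits in itertools.product([False, True], repeat=n_vars):
              if all(any(bits[abs(l) - 1] == (l > 0) for l in c)
                     for c in clauses):
                  return True
          return False

      rng, n = random.Random(0), 10
      for ratio in (3.0, 4.3, 5.5):
          sat = sum(satisfiable(n, random_3sat(n, int(ratio * n), rng))
                    for _ in range(20))
          print(f"m/n = {ratio}: {sat}/20 satisfiable")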

  17. Analyzing Aeroelasticity in Turbomachines

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Srivastava, R.

    2003-01-01

    ASTROP2-LE is a computer program that predicts flutter and forced responses of blades, vanes, and other components of such turbomachines as fans, compressors, and turbines. ASTROP2-LE is based on the ASTROP2 program, developed previously for analysis of stability of turbomachinery components. In developing ASTROP2-LE, ASTROP2 was modified to include a capability for modeling forced responses. The program was also modified to add a capability for analysis of aeroelasticity with mistuning and unsteady aerodynamic solutions from another program, LINFLX2D, that solves the linearized Euler equations of unsteady two-dimensional flow. Using LINFLX2D to calculate unsteady aerodynamic loads, it is possible to analyze effects of transonic flow on flutter and forced response. ASTROP2-LE can be used to analyze subsonic, transonic, and supersonic aerodynamics and structural mistuning for rotors with blades of differing structural properties. It calculates the aerodynamic damping of a blade system operating in airflow so that stability can be assessed. The code also predicts the magnitudes and frequencies of the unsteady aerodynamic forces on the airfoils of a blade row from incoming wakes. This information can be used in high-cycle fatigue analysis to predict the fatigue lives of the blades.

  18. Ring Image Analyzer

    NASA Technical Reports Server (NTRS)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software has been developed for the needs of an R&TD-funded project and has become an important asset for the future research proposal to NASA as well as other agencies.
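
    A numpy sketch of the core recognition step: fit a general conic to candidate fringe points by least squares, then recover the centroid, tilt angle, and axis ratio the program reports. The points are synthetic and the robustness machinery of the real code is omitted.

      import numpy as np

      rng = np.random.default_rng(3)
      th = rng.uniform(0, 2 * np.pi, 200)     # ellipse: center (5, 2), tilt 0.4 rad
      x = 5.0 + 3.0 * np.cos(th) * np.cos(0.4) - 1.5 * np.sin(th) * np.sin(0.4)
      y = 2.0 + 3.0 * np.cos(th) * np.sin(0.4) + 1.5 * np.sin(th) * np.cos(0.4)

      # Fit a*x^2 + b*xy + c*y^2 + d*x + e*y = 1, normalized so that a > 0.
      M = np.column_stack([x * x, x * y, y * y, x, y])
      coef = np.linalg.lstsq(M, np.ones_like(x), rcond=None)[0]
      a, b, c, d, e = coef if coef[0] > 0 else -coef

      center = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
      evals, evecs = np.linalg.eigh([[a, b / 2], [b / 2, c]])
      tilt = np.arctan2(evecs[1, 0], evecs[0, 0]) % np.pi  # major-axis angle
      ratio = np.sqrt(evals[1] / evals[0])                 # long/short axes
      print(center, np.degrees(tilt), ratio)  # ~[5 2], ~22.9 deg, ~2.0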

  19. Plutonium solution analyzer

    SciTech Connect

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  20. Multiple capillary biochemical analyzer

    DOEpatents

    Dovichi, Norman J.; Zhang, Jian Z.

    1995-01-01

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibres to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands.

  2. Field Deployable DNA analyzer

    SciTech Connect

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step in making an autonomous deployable instrument. We perform sample clean-up and concentration in a flow-through packed bed. For small initial samples, whole genome amplification is performed in the packed bed, resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we set out to learn whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof-of-principle assay.

  3. Analyzing crime scene videos

    NASA Astrophysics Data System (ADS)

    Cunningham, Cindy C.; Peloquin, Tracy D.

    1999-02-01

    Since late 1996 the Forensic Identification Services Section of the Ontario Provincial Police has been actively involved in state-of-the-art image capture and the processing of video images extracted from crime scene videos. The benefits and problems of this technology for video analysis are discussed. All analysis is being conducted on SUN Microsystems UNIX computers, networked to a digital disk recorder that is used for video capture. The primary advantage of this system over traditional frame grabber technology is reviewed. Examples from actual cases are presented and the successes and limitations of this approach are explored. Suggestions to companies implementing security technology plans for various organizations (banks, stores, restaurants, etc.) will be made. Future directions for this work and new technologies are also discussed.

  4. Analyzing nonrenewable resource supply

    SciTech Connect

    Bohi, D.R.; Toman, M.A.

    1984-01-01

    Starting with their vision of a useful model of supply behavior as dynamic and market oriented, the authors examine the literature to see what it offers, to fill in some of the missing elements, and to direct attention to the research that is required. Following an introduction, separate chapters deal with the basic theory of supply behavior; joint products, externalities, and technical change; uncertainty, expectations, and supply behavior; aggregate supply and market behavior; and empirical methods and problems. The authors argue that practical understanding of nonrenewable resource supply is hampered by gaps among theory, methodology, and data, and offer a standard designed to achieve consistency among theory, data, and estimation methods. Their recommendations for additional research focus on general specification issues, uncertainty and expectations, market-level analysis, and strategic behavioral issues. 151 references, 9 figures.

  5. Analyzing Water's Optical Absorption

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  6. Motion detector and analyzer

    DOEpatents

    Unruh, W.P.

    1987-03-23

    Method and apparatus are provided for deriving positive and negative Doppler spectra to enable analysis of objects in motion, particularly objects having rotary motion. First and second returned radar signals are mixed with internal signals to obtain an in-phase process signal and a quadrature process signal. A broad-band phase shifter shifts the quadrature signal through 90 degrees relative to the in-phase signal over a predetermined frequency range. A pair of signals is output from the broad-band phase shifter, which are then combined to provide a first sideband signal which is functionally related to a negative Doppler shift spectrum. The distinct positive and negative Doppler spectra may then be analyzed for the motion characteristics of the object being examined.
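
    A numpy sketch of why the quadrature combination works: with the second channel shifted 90 degrees, forming I + jQ puts approaching and receding Doppler components on opposite sides of the spectrum. The two targets are synthetic.

      import numpy as np

      fs = 2000
      t = np.arange(0, 1, 1 / fs)
      # approaching target: +300 Hz Doppler; receding target: -120 Hz
      i_ch = np.cos(2 * np.pi * 300 * t) + np.cos(2 * np.pi * 120 * t)
      q_ch = np.sin(2 * np.pi * 300 * t) - np.sin(2 * np.pi * 120 * t)

      analytic = i_ch + 1j * q_ch              # quadrature combination
      spec = np.abs(np.fft.fft(analytic))
      freqs = np.fft.fftfreq(len(t), 1 / fs)
      print(sorted(freqs[np.argsort(spec)[-2:]]))   # ~[-120.0, 300.0]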

  7. Analyzing geographic clustered response

    SciTech Connect

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and improve the algorithm. 21 refs., 15 figs., 2 tabs.

  8. Analyzing a Cometary 'Sneeze'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Figure 1: Analyzing a Cometary 'Sneeze'

    This display shows highly processed images of the outburst of comet Tempel 1 between June 22 and 23, 2005. The pictures were taken by Deep Impact's medium-resolution camera. An average image of the comet has been subtracted from each picture to provide an enhanced view of the outburst. The intensity has also been stretched to show the faintest parts. This processing enables measurement of the outflow speed and the details of the dissipation of the outburst. The left image was taken when the comet was very close to its normal, non-bursting state, so almost nothing is visible.

  9. Residual gas analyzer calibration

    NASA Technical Reports Server (NTRS)

    Lilienkamp, R. H.

    1972-01-01

    A technique which employs known gas mixtures to calibrate the residual gas analyzer (RGA) is described. The mass spectra from the RGA are recorded for each gas mixture. These mass spectra data and the mixture composition data each form a matrix. From the two matrices the calibration matrix may be computed. The matrix mathematics requires that the number of calibration gas mixtures be equal to or greater than the number of gases included in the calibration. This technique was evaluated using a mathematical model of an RGA to generate the mass spectra. This model included shot-noise errors in the mass spectra. Errors in the gas concentrations were also included in the evaluation. The effects of these errors were studied by varying their magnitudes and comparing the resulting calibrations. Several methods of evaluating an actual calibration are presented. The effects of the number of gases in them, the composition of the calibration mixtures, and the number of mixtures used are discussed.
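
    The matrix step in numpy terms: with spectra S (one row per mixture) and known compositions C, the calibration K solves S = C K in the least-squares sense, which is why at least as many mixtures as gases are required. All values are synthetic.

      import numpy as np

      rng = np.random.default_rng(4)
      n_mix, n_gas, n_peaks = 6, 4, 10            # mixtures >= gases
      K_true = rng.uniform(0, 1, (n_gas, n_peaks))
      C = rng.uniform(0, 1, (n_mix, n_gas))       # known compositions
      S = C @ K_true + 0.01 * rng.standard_normal((n_mix, n_peaks))

      K, *_ = np.linalg.lstsq(C, S, rcond=None)   # calibration matrix
      print(np.max(np.abs(K - K_true)))           # small calibration error

      # An unknown sample's composition follows from its spectrum s = K.T c:
      s = np.array([0.3, 0.1, 0.4, 0.2]) @ K_true
      print(np.linalg.lstsq(K.T, s, rcond=None)[0])  # ~[0.3 0.1 0.4 0.2]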

  10. ROBOT TASK SCENE ANALYZER

    SciTech Connect

    William R. Hamel; Steven Everett

    2000-08-01

    Environmental restoration and waste management (ER and WM) challenges in the United States Department of Energy (DOE), and around the world, involve radiation or other hazards which will necessitate the use of remote operations to protect human workers from dangerous exposures. Remote operations carry the implication of greater costs since remote work systems are inherently less productive than contact human work due to the inefficiencies/complexities of teleoperation. To reduce costs and improve quality, much attention has been focused on methods to improve the productivity of combined human operator/remote equipment systems; the achievements to date are modest at best. The most promising avenue in the near term is to supplement conventional remote work systems with robotic planning and control techniques borrowed from manufacturing and other domains where robotic automation has been used. Practical combinations of teleoperation and robotic control will yield telerobotic work systems that outperform currently available remote equipment. It is believed that practical telerobotic systems may increase remote work efficiencies significantly. Increases of 30% to 50% have been conservatively estimated for typical remote operations. It is important to recognize that the basic hardware and software features of most modern remote manipulation systems can readily accommodate the functionality required for telerobotics. Further, several of the additional system ingredients necessary to implement telerobotic control--machine vision, 3D object and workspace modeling, automatic tool path generation and collision-free trajectory planning--already exist.

  11. Monte Carlo techniques for analyzing deep penetration problems

    SciTech Connect

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1985-01-01

    A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications. 29 refs.
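
    For readers unfamiliar with the weight-based techniques named above, the toy sketch below shows splitting and Russian roulette acting on particle weights. The thresholds and particle representation are invented; none of the reviewed codes is reproduced.

    ```python
    import random

    def adjust_weight(particle, importance_ratio, w_min=0.1, w_max=2.0):
        """Split heavy particles, roulette light ones; returns surviving particles."""
        w = particle["w"] * importance_ratio
        if w > w_max:                      # splitting: n copies carrying weight w/n
            n = int(w / w_max) + 1
            return [dict(particle, w=w / n) for _ in range(n)]
        if w < w_min:                      # Russian roulette: survive with p = w/w_min
            if random.random() < w / w_min:
                return [dict(particle, w=w_min)]
            return []                      # killed; expected weight is conserved
        return [dict(particle, w=w)]

    bank = adjust_weight({"x": 0.0, "w": 1.0}, importance_ratio=3.0)
    print(len(bank), bank[0]["w"])         # two fragments of weight 1.5: total preserved
    ```

    Both operations leave the expected weight unchanged, which is what makes them unbiased variance-reduction tools.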

  12. Monte Carlo techniques for analyzing deep-penetration problems

    SciTech Connect

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1986-02-01

    Current methods and difficulties in Monte Carlo deep-penetration calculations are reviewed, including statistical uncertainty and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multigroup Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications.

  13. Analyzing and Teaching English: The Problem of the Norm.

    ERIC Educational Resources Information Center

    James, C. Vaughan

    1992-01-01

    Argues that, although the academic study of English language development concentrates on the norm, definitions of the norm are not clear, and many cited examples betray a lack of knowledge of historical aspects of English. It is suggested that emphasis on communication in teaching practice may lead to acceptance of nonnormative language use. (JL)

  14. Analyzing Human Communication Networks in Organizations: Applications to Management Problems.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Danowski, James A.

    Investigating the networks of communication in organizations leads to an understanding of efficient and inefficient information dissemination as practiced in large systems. Most important in organizational communication is the role of the "liaison person"--the coordinator of intercommunication. When functioning efficiently, coordinators maintain…

  15. Analyzing Performance Problems; or "You Really Oughta Wanna".

    ERIC Educational Resources Information Center

    Mager, Robert F.; Pipe, Peter

    When faced with a discrepancy between the actual and the desired performance of a student, employee, or acquaintance, the usual course of action is to "train, transfer, or terminate" the individual. The authors believe that while these may sometimes be appropriate solutions appropriately applied, more often they are not. They offer a procedure for…

  16. Lorentz force particle analyzer

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Thess, André; Moreau, René; Tan, Yanqing; Dai, Shangjun; Tao, Zhen; Yang, Wenzhi; Wang, Bo

    2016-07-01

    A new contactless technique is presented for the detection of micron-sized insulating particles in the flow of an electrically conducting fluid. A transverse magnetic field brakes this flow and tends to be entrained in the flow direction by a Lorentz force, whose reaction force on the magnetic-field-generating system can be measured. Insulating particles suspended in the fluid produce changes in this Lorentz force, generating pulses that enable the particles to be counted and sized. A two-dimensional numerical model that employs a moving-mesh method demonstrates the measurement principle when such a particle is present. Two prototypes and a three-dimensional numerical model are used to demonstrate the feasibility of a Lorentz force particle analyzer (LFPA). The findings of this study indicate that such an LFPA, which offers contactless and on-line quantitative measurements, can be applied to an extensive range of applications, including measurements of the cleanliness of high-temperature and aggressive molten metals, such as aluminum and steel alloys, and the clean manufacturing of semiconductors.
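
    A hedged sketch of the counting-and-sizing step: pulses in a synthetic force trace are detected and measured. The trace, noise level, and thresholds are invented for illustration; the paper's prototypes and numerical models are not reproduced.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    t = np.linspace(0.0, 1.0, 10_000)
    rng = np.random.default_rng(1)
    f = 1.0 + 0.002 * rng.normal(size=t.size)        # steady braking force + noise (a.u.)
    for t0, amp in [(0.2, 0.02), (0.55, 0.05), (0.8, 0.03)]:   # three particle transits
        f += amp * np.exp(-((t - t0) / 0.005) ** 2)  # each transit adds a force pulse

    # Detect pulses above the noise floor; pulse height is a proxy for particle size.
    peaks, props = find_peaks(f - 1.0, height=0.01, distance=200)
    print(len(peaks), np.round(props["peak_heights"], 3))
    ```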

  17. Analyzing nocturnal noise stratification.

    PubMed

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel; Gómez Escobar, Valentín

    2014-05-01

    Pollution associated with traffic can be considered one of the most relevant pollution sources in our cities; noise is one of the major components of traffic pollution; thus, efforts are necessary to find adequate noise assessment methods and low-pollution city designs. Different methods have been proposed for the evaluation of noise in cities, including the categorization method, which is based on the functionality concept. Until now, this method has only been studied (with encouraging results) for short-term, diurnal measurements, but nocturnal noise behaves clearly differently from diurnal noise. In this work, 45 continuous measurements of approximately one week each in duration are statistically analyzed to identify differences between the proposed categories. The results show that the five proposed categories highlight the noise stratification of the studied city in each period of the day (day, evening, and night). A comparison of the continuous measurements with previous short-term measurements indicates that the latter can be a good approximation of the former in the diurnal period, reducing the resource expenditure for noise evaluation. Annoyance estimated from the measured noise levels was compared with the response of the population obtained from a questionnaire, with good agreement. The categorization method can yield good information about the distribution of a traffic-associated pollutant in our cities in each period of the day and is, therefore, a powerful tool for town planning and the design of pollution prevention policies.

  18. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. To reduce the time each TEAMS modeler spends manually preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  19. PULSE HEIGHT ANALYZER

    DOEpatents

    Johnstone, C.W.

    1958-01-21

    An anticoincidence device is described for a pair of adjacent channels of a multi-channel pulse height analyzer for preventing the lower channel from generating a count pulse in response to an input pulse when the input pulse has sufficient magnitude to reach the upper level channel. The anticoincidence circuit comprises a window amplifier, upper and lower level discriminators, and a biased-off amplifier. The output of the window amplifier is coupled to the inputs of the discriminators, the output of the upper level discriminator is connected to the resistance end of a series R-C network, the output of the lower level discriminator is coupled to the capacitance end of the R-C network, and the grid of the biased-off amplifier is coupled to the junction of the R-C network. In operation each discriminator produces a negative pulse output when the input pulse traverses its voltage setting. As a result of the connections to the R-C network, a trigger pulse will be sent to the biased-off amplifier when the incoming pulse level is sufficient to trigger only the lower level discriminator.
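
    Stripped of its vacuum-tube implementation, the anticoincidence scheme reduces to window logic: a count is produced only when a pulse crosses the lower level but not the upper one. A sketch with arbitrary example thresholds (not values from the patent):

    ```python
    def in_window(peak_volts, lower=1.0, upper=2.0):
        """True if the pulse peak lies inside the channel window."""
        return lower <= peak_volts < upper

    pulses = [0.4, 1.3, 1.9, 2.5, 1.1]
    print(sum(in_window(v) for v in pulses))   # 3 pulses fall in the 1.0-2.0 V channel
    ```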

  20. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
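
    MMTAT is C code with its own parameterized models and API; purely to illustrate the kind of link calculation such a tool automates, here is a free-space link budget with placeholder values (these are not MMTAT inputs or its interface).

    ```python
    import math

    def free_space_loss_db(freq_hz, range_m):
        """Free-space path loss: 20*log10(4*pi*d/lambda)."""
        lam = 3.0e8 / freq_hz
        return 20.0 * math.log10(4.0 * math.pi * range_m / lam)

    eirp_dbw = 43.0                                 # transmitter EIRP (assumed)
    rx_gain_db = 68.0                               # ground antenna gain (assumed)
    loss_db = free_space_loss_db(8.4e9, 2.0e11)     # X band at roughly 1.3 AU
    print(f"path loss {loss_db:.1f} dB, "
          f"received {eirp_dbw + rx_gain_db - loss_db:.1f} dBW")
    ```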

  1. PULSE HEIGHT ANALYZER

    DOEpatents

    Goldsworthy, W.W.

    1958-06-01

    A differential pulse-height discriminator circuit is described which is readily adaptable for operation in a single-channel pulse-height analyzer. The novel aspect of the circuit lies in the specific arrangement of the differential pulse-height discriminator, which includes two pulse-height discriminators having a common input and an anticoincidence circuit having two interconnected vacuum tubes with a common cathode resistor. Pulses from the output of one discriminator circuit are delayed and coupled to the grid of one of the anticoincidence tubes by a resistor. The output pulses from the other discriminator circuit are coupled through a cathode follower circuit, which has a cathode resistor of such value as to provide a long time constant with the interelectrode capacitance of the tube, to lengthen the output pulses. The pulses are then fed to the grid of the other anticoincidence tube. With such connections of the circuits, an output pulse occurs from the anticoincidence circuit only when the incoming pulse has a peak value between the operating levels of the two discriminators.

  2. Analyzing Atmospheric Neutrino Oscillations

    SciTech Connect

    Escamilla, J.; Ernst, D. J.; Latimer, D. C.

    2007-10-26

    We provide a pedagogic derivation of the formula needed to analyze atmospheric data and then derive, for the subset of the data that are fully-contained events, an analysis tool that is quantitative and numerically efficient. Results for the full set of neutrino oscillation data are then presented. We find the following preliminary results: 1.) the sub-dominant approximation provides reasonable values for the best-fit parameters for δ₃₂, θ₂₃, and θ₁₃ but does not quantitatively provide the errors for these three parameters; 2.) the size of the MSW effect is suppressed in the sub-dominant approximation; 3.) the MSW effect reduces somewhat the extracted error for δ₃₂, more so for θ₂₃ and θ₁₃; 4.) atmospheric data alone constrain the allowed values of θ₁₃ only in the sub-dominant approximation; the full three-neutrino calculation requires CHOOZ to get a clean constraint; 5.) the terms linear in θ₁₃ are not negligible; and 6.) the minimum value of θ₁₃ is found to be negative, but at a statistically insignificant level.
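
    The textbook two-flavor survival probability that underlies the sub-dominant approximation is compact enough to state directly; this is the generic formula, not the authors' full three-neutrino code.

    ```python
    import numpy as np

    def p_mumu(L_km, E_GeV, dm2_eV2=2.4e-3, sin2_2theta=1.0):
        """P(nu_mu -> nu_mu) = 1 - sin^2(2*theta23) * sin^2(1.27 * dm2 * L / E)."""
        return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    # Down-going vs up-going atmospheric neutrinos at 1 GeV:
    print(p_mumu(L_km=15.0, E_GeV=1.0))       # short baseline: essentially unoscillated
    print(p_mumu(L_km=12800.0, E_GeV=1.0))    # through the Earth: strongly oscillated
    ```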

  3. Pseudostupidity and analyzability.

    PubMed

    Cohn, L S

    1989-01-01

    This paper seeks to heighten awareness of pseudostupidity and the potential analyzability of patients who manifest it by defining and explicating it, reviewing the literature, and presenting in detail the psychoanalytic treatment of a pseudostupid patient. Pseudostupidity is caused by an inhibition of the integration and synthesis of thoughts resulting in a discrepancy between intellectual capacity and apparent intellect. The patient's pseudostupidity was determined in part by his need to prevent his being more successful than father, i.e., defeating his oedipal rival. Knowing and learning were instinctualized. The patient libidinally and defensively identified with father's passive, masochistic position. He needed to frustrate the analyst as he had felt excited and frustrated by his parents' nudity and thwarted by his inhibitions. He wanted to cause the analyst to feel as helpless as he, the patient, felt. Countertransference frustration was relevant and clinically useful in the analysis. Interpretation of evolving relevant issues led to more anxiety and guilt, less pseudostupidity, a heightened alliance, and eventual working through. Negative therapeutic reactions followed the resolution of pseudostupidity. PMID:2708771

  4. Nonlinear Single Spin Spectrum Analyzer

    NASA Astrophysics Data System (ADS)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2014-03-01

    Qubits have been used as linear spectrum analyzers of their environments, through the use of decoherence spectroscopy. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compare the performance of equidistant vs. Uhrig modulation schemes for spectral analysis. Phys. Rev. Lett. 110, 110503 (2013). Synopsis at http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.110.110503

  5. Analyzing large biological datasets with association networks

    SciTech Connect

    Karpinets, T. V.; Park, B. H.; Uberbacher, E. C.

    2012-05-25

    Due to advances in high-throughput biotechnologies, biological information is being collected in databases at an amazing rate, requiring novel computational approaches for timely processing of the collected data into new knowledge. In this study we address this problem by developing a new approach for discovering modular structure, relationships, and regularities in complex data. These goals are achieved by converting records of biological annotations of an object, like an organism, gene, chemical, or sequence, into networks (Anets) and rules (Arules) of the associated annotations. Anets are based on similarity of annotation profiles of objects and can be further analyzed and visualized, providing a compact bird's-eye view of the most significant relationships in the collected data and a way of clustering and classifying them. Arules are generated by the Apriori algorithm, considering each record of annotations as a transaction and augmenting each annotation item by its type. Arules provide a way to validate relationships discovered by Anets, producing comprehensive statistics on frequently associated annotations and specific confident relationships among them. A combination of Anets and Arules represents condensed information on associations among the collected data, helping to discover new knowledge and generate hypotheses. As an example, we have applied the approach to analyze bacterial metadata from the Genomes OnLine Database. The analysis allowed us to produce a map of sequenced bacterial and archaeal organisms based on their genomic, metabolic, and physiological characteristics, with three major clusters of metadata representing bacterial pathogens, environmental isolates, and plant symbionts. A signature profile of clustered annotations of environmental bacteria, when compared with pathogens, linked aerobic respiration, high GC content, and large genome size to the diversity of metabolic activities and physiological features of these organisms.
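
    To make the Arules idea concrete, the sketch below treats each organism's annotation set as a transaction and reports confident pairwise rules. The metadata are invented, and the Center's pipeline uses the full Apriori algorithm; this shows only support/confidence on item pairs.

    ```python
    from itertools import combinations
    from collections import Counter

    transactions = [
        {"pathogen", "low_GC", "small_genome"},
        {"pathogen", "low_GC", "host_associated"},
        {"environmental", "high_GC", "large_genome", "aerobic"},
        {"environmental", "high_GC", "large_genome", "aerobic"},
        {"symbiont", "high_GC", "aerobic"},
    ]

    item_count = Counter(i for t in transactions for i in t)
    pair_count = Counter(p for t in transactions for p in combinations(sorted(t), 2))

    # Report rules a -> b (alphabetical direction only, for brevity).
    for (a, b), n in pair_count.items():
        support = n / len(transactions)
        confidence = n / item_count[a]
        if support >= 0.4 and confidence >= 0.9:
            print(f"{a} -> {b}  support={support:.1f} confidence={confidence:.2f}")
    ```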

  6. Digital Microfluidics Sample Analyzer

    NASA Technical Reports Server (NTRS)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  7. Soft Decision Analyzer

    NASA Technical Reports Server (NTRS)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different data techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the…
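
    As an illustration of the first, data-content technique (and not the SDA's actual correlators), the sketch below recovers a bit slip by correlating sliced symbols against the reference stream over a small search window.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    ref = rng.integers(0, 2, 4096) * 2 - 1            # reference bits as +/-1
    soft = ref * 0.8 + 0.3 * rng.normal(size=ref.size)   # noisy soft decisions
    soft = np.roll(soft, 2)                           # receiver slipped by 2 bits

    def best_offset(rx, ref, search=4):
        """Offset within +/-search bits that maximizes correlation with the reference."""
        scores = {k: np.dot(np.roll(rx, -k), ref) for k in range(-search, search + 1)}
        return max(scores, key=scores.get)

    print(best_offset(np.sign(soft), ref))            # reports the 2-bit slip
    ```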

  8. Crew Activity Analyzer

    NASA Technical Reports Server (NTRS)

    Murray, James; Kirillov, Alexander

    2008-01-01

    The crew activity analyzer (CAA) is a system of electronic hardware and software for automatically identifying patterns of group activity among crew members working together in an office, cockpit, workshop, laboratory, or other enclosed space. The CAA synchronously records multiple streams of data from digital video cameras, wireless microphones, and position sensors, then plays back and processes the data to identify activity patterns specified by human analysts. The processing greatly reduces the amount of time that the analysts must spend in examining large amounts of data, enabling the analysts to concentrate on subsets of data that represent activities of interest. The CAA has potential for use in a variety of governmental and commercial applications, including planning for crews for future long space flights, designing facilities wherein humans must work in proximity for long times, improving crew training and measuring crew performance in military settings, human-factors and safety assessment, development of team procedures, and behavioral and ethnographic research. The data-acquisition hardware of the CAA (see figure) includes two video cameras: an overhead one aimed upward at a paraboloidal mirror on the ceiling and one mounted on a wall aimed in a downward slant toward the crew area. As many as four wireless microphones can be worn by crew members. The audio signals received from the microphones are digitized, then compressed in preparation for storage. Approximate locations of as many as four crew members are measured by use of a Cricket indoor location system. [The Cricket indoor location system includes ultrasonic/radio beacon and listener units. A Cricket beacon (in this case, worn by a crew member) simultaneously transmits a pulse of ultrasound and a radio signal that contains identifying information. Each Cricket listener unit measures the difference between the times of reception of the ultrasound and radio signals from an identified beacon

  9. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  10. Some easily analyzable convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.

    1989-01-01

    Convolutional codes have played and will play a key role in the downlink telemetry systems on many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is the notorious difficulty of analyzing them. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms in its power series expansion. This step is quite hard, and for many codes of relatively short constraint lengths, it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.
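
    The "exponential in constraint length" first step can be made concrete: the sketch below finds the free distance of the classic rate-1/2, constraint-length-3 (7,5) code by a shortest-path search over the code trellis. This is a generic illustration, not an algorithm from the memorandum.

    ```python
    import heapq

    G = (0b111, 0b101)   # generator polynomials (octal 7 and 5)
    K = 3                # constraint length; the state holds K-1 = 2 past bits

    def step(state, bit):
        reg = (bit << (K - 1)) | state            # shift the input bit into the register
        out = sum(bin(reg & g).count("1") % 2 for g in G)   # Hamming weight of outputs
        return reg >> 1, out                      # next state, branch weight

    def free_distance():
        # Dijkstra over trellis states: cheapest nonzero path that leaves
        # state 0 (forced input 1) and first returns to state 0.
        s0, w0 = step(0, 1)
        heap, best = [(w0, s0)], {}
        while heap:
            w, s = heapq.heappop(heap)
            if s == 0:
                return w
            if best.get(s, 1 << 30) <= w:
                continue
            best[s] = w
            for bit in (0, 1):
                s2, dw = step(s, bit)
                heapq.heappush(heap, (w + dw, s2))

    print(free_distance())   # -> 5 for the (7,5) code
    ```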

  11. PROBLEM OF COMPLEX EIGENSYSTEMS IN THE SEMIANALYTICAL SOLUTION FOR ADVANCEMENT OF TIME IN SOLUTE TRANSPORT SIMULATIONS: A NEW METHOD USING REAL ARITHMETIC.

    USGS Publications Warehouse

    Umari, Amjad M. J.; Gorelick, Steven M.

    1986-01-01

    In the numerical modeling of groundwater solute transport, explicit solutions may be obtained for the concentration field at any future time without computing concentrations at intermediate times. The spatial variables are discretized and time is left continuous in the governing differential equation. These semianalytical solutions have been presented in the literature and involve the eigensystem of a coefficient matrix. This eigensystem may be complex (i.e., have imaginary components) due to the asymmetry created by the advection term in the governing advection-dispersion equation. It is shown here that the error due to ignoring the imaginary components of complex eigenvalues is large for small dispersivity values. A new algorithm that represents the complex eigensystem by converting it to a real eigensystem is presented. The method requires only real arithmetic.
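
    A modern stand-in for the same idea (not the authors' 1986 algorithm): the real Schur form keeps the whole time-advancement computation in real arithmetic even when the nonsymmetric coefficient matrix has complex-conjugate eigenvalue pairs. The toy operator below is invented.

    ```python
    import numpy as np
    from scipy.linalg import schur, expm

    A = np.array([[-2.0, -0.5, 0.0],     # toy advection-dominated transport operator
                  [1.0, -2.0, -0.5],
                  [0.0, 1.0, -2.0]])

    print(np.linalg.eigvals(A))          # a complex-conjugate pair appears

    T, Z = schur(A, output="real")       # A = Z @ T @ Z.T, T real quasi-triangular
    assert T.dtype.kind == "f"           # no imaginary components anywhere

    c0 = np.array([1.0, 0.0, 0.0])       # initial concentrations
    t = 0.75
    c_t = Z @ expm(T * t) @ Z.T @ c0     # field at a future time, real arithmetic only
    print(c_t)
    ```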

  12. Problems of Indian Children.

    ERIC Educational Resources Information Center

    Linton, Marigold

    Previous approaches to the learning problems of American Indian children are viewed as inadequate. An alternative is suggested which emphasizes the problem solution strategies which these children bring to the school situation. Solutions were analyzed in terms of: (1) their probability; (2) their efficiency at permitting a present problem to be…

  13. Modular thermal analyzer routine, volume 1

    NASA Technical Reports Server (NTRS)

    Oren, J. A.; Phillips, M. A.; Williams, D. R.

    1972-01-01

    The Modular Thermal Analyzer Routine (MOTAR) is a general thermal analysis routine with strong capabilities for performing thermal analysis of systems containing flowing fluids, fluid system controls (valves, heat exchangers, etc.), life support systems, and thermal radiation situations. Its modular organization permits the analysis of a very wide range of thermal problems, from simple problems containing a few conduction nodes to those requiring complicated flow and radiation analysis, with each problem type analyzed with peak computational efficiency and maximum ease of use. The organization and programming methods applied to MOTAR achieved a high degree of computer utilization efficiency in terms of computer execution time and storage space required for a given problem. The computer time required to run a given problem on MOTAR is approximately 40 to 50 percent of that required by the currently existing, widely used routines. The computer storage requirement for MOTAR is approximately 25 percent more than the most commonly used routines for the simplest problems, but the data storage techniques for the more complicated options should save a considerable amount of space.
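
    As a flavor of the simplest problem class mentioned (a few conduction nodes), here is an explicit integration of a three-node network. Capacities, conductances, and loads are invented, and MOTAR's actual solvers are far more elaborate.

    ```python
    import numpy as np

    C = np.array([500.0, 800.0, 300.0])      # nodal heat capacities, J/K
    G = np.array([[0.0, 2.0, 0.0],           # conductances between nodes, W/K
                  [2.0, 0.0, 1.5],
                  [0.0, 1.5, 0.0]])
    Q = np.array([25.0, 0.0, -25.0])         # applied heat loads, W (net zero)
    T = np.array([290.0, 290.0, 290.0])      # initial temperatures, K

    dt = 1.0                                 # s; below the explicit stability limit
    for _ in range(36000):                   # march 10 hours to steady state
        flows = G * (T[None, :] - T[:, None])    # heat into each node from neighbors
        T = T + dt * (flows.sum(axis=1) + Q) / C

    print(np.round(T, 2))                    # steady-state temperature gradient
    ```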

  14. Classroom Learning and Achievement: How the Complexity of Classroom Interaction Impacts Students' Learning

    ERIC Educational Resources Information Center

    Podschuweit, Sören; Bernholt, Sascha; Brückmann, Maja

    2016-01-01

    Background: Complexity models have provided a suitable framework in various domains to assess students' educational achievement. Complexity is often used as the analytical focus when regarding learning outcomes, i.e. when analyzing written tests or problem-centered interviews. Numerous studies reveal negative correlations between the complexity of…

  15. An approach to 1,3,4-dioxaphospholane complexes through an acid-induced ring expansion of an oxaphosphirane complex: the problem of construction and deconstruction of O,P-heterocycles.

    PubMed

    Pérez, Janaina Marinas; Helten, Holger; Schnakenburg, Gregor; Streubel, Rainer

    2011-06-01

    Treatment of oxaphosphirane complex 1, triflic acid (TfOH), and various aldehydes yielded 1,3,4-dioxaphospholane complexes 5a,b-7a,b after deprotonation with NEt(3). In addition to NMR spectroscopy, IR spectroscopy, and MS data, the X-ray structures of complexes 5a and 7a were determined. (31)P NMR spectroscopic monitoring and DFT calculations provided insight into the reaction course and revealed the transient TfOH 1,3,4-dioxaphospholanium association complex TfOH-5a,b and/or TfOH-5a,b' as key reactive intermediates. Furthermore, it was observed that the five-membered ring system was cleaved upon warming and yielded side-on (E,Z)-methylenephosphonium complexes 8a,b if deprotonation did not occur at low temperature. Overall, a novel temperature- and acid-dependent construction and deconstruction process of the 1,3,4-dioxaphospholane ring system is described. PMID:21433300

  16. Method for analyzing signaling networks in complex cellular systems.

    PubMed

    Plavec, Ivan; Sirenko, Oksana; Privat, Sylvie; Wang, Yuker; Dajee, Maya; Melrose, Jennifer; Nakao, Brian; Hytopoulos, Evangelos; Berg, Ellen L; Butcher, Eugene C

    2004-02-01

    Now that the human genome has been sequenced, the challenge of assigning function to human genes has become acute. Existing approaches using microarrays or proteomics frequently generate very large volumes of data not directly related to biological function, making interpretation difficult. Here, we describe a technique for integrative systems biology in which: (i) primary cells are cultured under biologically meaningful conditions; (ii) a limited number of biologically meaningful readouts are measured; and (iii) the results obtained under several different conditions are combined for analysis. Studies of human endothelial cells overexpressing different signaling molecules under multiple inflammatory conditions show that this system can capture a remarkable range of functions by a relatively small number of simple measurements. In particular, measurement of seven different protein levels by ELISA under four different conditions is capable of reconstructing pathway associations of 25 different proteins representing four known signaling pathways, implicating additional participants in the NF-kappaB or RAS/mitogen-activated protein kinase pathways and defining additional interactions between these pathways. PMID:14745015

  17. Method for analyzing signaling networks in complex cellular systems

    PubMed Central

    Plavec, Ivan; Sirenko, Oksana; Privat, Sylvie; Wang, Yuker; Dajee, Maya; Melrose, Jennifer; Nakao, Brian; Hytopoulos, Evangelos; Berg, Ellen L.; Butcher, Eugene C.

    2004-01-01

    Now that the human genome has been sequenced, the challenge of assigning function to human genes has become acute. Existing approaches using microarrays or proteomics frequently generate very large volumes of data not directly related to biological function, making interpretation difficult. Here, we describe a technique for integrative systems biology in which: (i) primary cells are cultured under biologically meaningful conditions; (ii) a limited number of biologically meaningful readouts are measured; and (iii) the results obtained under several different conditions are combined for analysis. Studies of human endothelial cells overexpressing different signaling molecules under multiple inflammatory conditions show that this system can capture a remarkable range of functions by a relatively small number of simple measurements. In particular, measurement of seven different protein levels by ELISA under four different conditions is capable of reconstructing pathway associations of 25 different proteins representing four known signaling pathways, implicating additional participants in the NF-κB or RAS/mitogen-activated protein kinase pathways and defining additional interactions between these pathways. PMID:14745015

  18. TECHNIQUES FOR ANALYZING COMPLEX MIXTURES OF DRINKING WATER DBPS

    EPA Science Inventory

    Although chlorine has been used to disinfect drinking water for approximately 100 years, there have been concerns raised over its use, due to the formation of potentially hazardous by-products. Trihalomethanes (THMs) were the first disinfection by-products (DBPs) identified and ...

  19. L.E.A.D.: A Framework for Evidence Gathering and Use for the Prevention of Obesity and Other Complex Public Health Problems

    ERIC Educational Resources Information Center

    Chatterji, Madhabi; Green, Lawrence W.; Kumanyika, Shiriki

    2014-01-01

    This article summarizes a comprehensive, systems-oriented framework designed to improve the use of a wide variety of evidence sources to address population-wide obesity problems. The L.E.A.D. framework (for "Locate" the evidence, "Evaluate" the evidence, "Assemble" the evidence, and inform "Decisions"),…

  20. Tourette Syndrome: Overview and Classroom Interventions. A Complex Neurobehavioral Disorder Which May Involve Learning Problems, Attention Deficit Hyperactivity Disorder, Obsessive Compulsive Symptoms, and Stereotypical Behaviors.

    ERIC Educational Resources Information Center

    Fisher, Ramona A.; Collins, Edward C.

    Tourette Syndrome is conceptualized as a neurobehavioral disorder, with behavioral aspects that are sometimes difficult for teachers to understand and deal with. The disorder has five layers of complexity: (1) observable multiple motor, vocal, and cognitive tics and sensory involvement; (2) Attention Deficit Hyperactivity Disorder; (3)…

  1. Boundary force method for analyzing two-dimensional cracked bodies

    NASA Technical Reports Server (NTRS)

    Tan, P. W.; Raju, I. S.; Newman, J. C., Jr.

    1986-01-01

    The Boundary Force Method (BFM) was formulated for the two-dimensional stress analysis of complex crack configurations. In this method, only the boundaries of the region of interest are modeled. The boundaries are divided into a finite number of straight-line segments, and at the center of each segment, concentrated forces and a moment are applied. This set of unknown forces and moments is calculated to satisfy the prescribed boundary conditions of the problem. The elasticity solution for the stress distribution due to concentrated forces and a moment applied at an arbitrary point in a cracked infinite plate is used as the fundamental solution. Thus, the crack need not be modeled as part of the boundary. The formulation of the BFM is described, and the accuracy of the method is established by analyzing several crack configurations for which accepted stress-intensity factor solutions are known. The crack configurations investigated include mode I and mixed-mode (modes I and II) problems. The results obtained are, in general, within ±0.5 percent of accurate numerical solutions. The versatility of the method is demonstrated through the analysis of complex crack configurations for which limited or no solutions are known.
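
    Structurally, the BFM solve step is an influence-matrix problem: assemble the response of each boundary point to each unknown segment force, then solve for the forces that reproduce the prescribed boundary values. The sketch below shows only that structure; the kernel is a crude placeholder, not the cracked-plate elasticity solution the paper uses.

    ```python
    import numpy as np

    n = 40                                          # boundary segments
    theta = 2 * np.pi * (np.arange(n) + 0.5) / n
    pts = np.c_[np.cos(theta), np.sin(theta)]       # segment midpoints on a unit circle

    def influence(src, obs):
        """Placeholder kernel: response at obs to a unit force at src."""
        r = np.linalg.norm(obs - src) + 1e-9        # regularize the self-term
        return -np.log(r) / (2 * np.pi)

    A = np.array([[influence(s, o) for s in pts] for o in pts])
    b = np.cos(theta)                               # prescribed boundary values
    forces, *_ = np.linalg.lstsq(A, b, rcond=None)  # forces satisfying the BCs
    print(forces[:5].round(3))
    ```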

  2. Analyzes Data from Semiconductor Wafers

    2002-07-23

    This program analyzes reflectance data from semiconductor wafers taken during the deposition or evolution of a thin film, typically via chemical vapor deposition (CVD) or molecular beam epitaxy (MBE). It is used to determine the growth rate and optical constants of the deposited thin films using a virtual interface concept. Growth rates and optical constants of multiple-layer structures can be obtained by selecting appropriate sections of the reflectance vs. time waveform. No prior information or estimates of growth rates and material properties are required if an absolute reflectance waveform is used. If the optical constants of a thin film are known, then the growth rate may be extracted from a relative reflectance data set. The analysis is valid for either s- or p-polarized light at any incidence angle and wavelength. The analysis package is contained within an easy-to-use graphical user interface. The program is based on the algorithm described in the following two publications: W.G. Breiland and K.P. Killen, J. Appl. Phys. 78 (1995) 6726, and W.G. Breiland, H.Q. Hou, B.E. Hammons, and J.F. Klem, Proc. XXVIII SOTAPOCS Symp. Electrochem. Soc. San Diego, May 3-8, 1998. It relies on the fact that any multiple-layer system has a reflectance spectrum that is mathematically equivalent to a single-layer thin film on a virtual substrate. The program fits the thin film reflectance with five adjustable parameters: 1) growth rate, 2) real part of complex refractive index, 3) imaginary part of refractive index, 4) amplitude of virtual interface reflectance, 5) phase of virtual interface reflectance.
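
    The program fits the full waveform with the five parameters above; as a minimal cross-check of the physics it rests on, one interference fringe at normal incidence corresponds to a thickness change of lambda/(2n), so the growth rate follows from the fringe period alone. Wavelength, index, and rate below are assumed example values.

    ```python
    import numpy as np

    lam = 633e-9            # probe wavelength, m (assumed)
    n_film = 3.5            # real refractive index of the growing film (assumed)
    g_true = 1.0e-9         # m/s, used only to synthesize the data

    t = np.linspace(0.0, 2000.0, 4000)                   # s
    phase = 4.0 * np.pi * n_film * g_true * t / lam      # round-trip optical phase
    R = 0.3 + 0.05 * np.cos(phase)                       # idealized reflectance fringes

    # Fringe period from successive maxima of the oscillating signal:
    maxima = t[1:-1][(R[1:-1] > R[:-2]) & (R[1:-1] > R[2:])]
    fringe_period = np.diff(maxima).mean()
    print(lam / (2 * n_film * fringe_period))            # recovered growth rate, m/s
    ```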

  3. Soft Decision Analyzer and Method

    NASA Technical Reports Server (NTRS)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2016-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  4. Soft Decision Analyzer and Method

    NASA Technical Reports Server (NTRS)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2015-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  5. Droplet actuator analyzer with cartridge

    NASA Technical Reports Server (NTRS)

    Smith, Gregory F. (Inventor); Sturmer, Ryan A. (Inventor); Paik, Philip Y. (Inventor); Srinivasan, Vijay (Inventor); Pollack, Michael G. (Inventor); Pamula, Vamsee K. (Inventor); Brafford, Keith R. (Inventor); West, Richard M. (Inventor)

    2011-01-01

    A droplet actuator with cartridge is provided. According to one embodiment, a sample analyzer is provided that includes an analyzer unit comprising electronic or optical receiving means and a cartridge comprising self-contained droplet handling capabilities, wherein the cartridge is coupled to the analyzer unit by a means which aligns electronic and/or optical outputs from the cartridge with the electronic or optical receiving means on the analyzer unit. According to another embodiment, a sample analyzer is provided that comprises a cartridge coupled thereto and a means of electrical and/or optical interface between the cartridge and the analyzer, whereby electrical and/or optical signals may be transmitted from the cartridge to the analyzer.

  6. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information. Part 2—Application to crosshole GPR tomography

    NASA Astrophysics Data System (ADS)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline; Mosegaard, Klaus

    2013-03-01

    We present an application of the SIPPI Matlab toolbox, to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray based forward models that rely on the high frequency approximation of the wave-equation and 'fat' ray based forward models relying on finite frequency theory. In order to sample the a posteriori probability density function we make use of both least squares based inversion, for linear Gaussian inverse problems, and the extended Metropolis sampler, for non-linear non-Gaussian inverse problems. To illustrate the applicability of the SIPPI toolbox to a tomographic field data set we use a cross-borehole traveltime data set from Arrenæs, Denmark. Both the computer code and the data are released in the public domain using open source and open data licenses. The code has been developed to facilitate inversion of 2D and 3D travel time tomographic data using a wide range of possible a priori models and choices of forward models.
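
    A stripped-down random-walk Metropolis loop of the kind SIPPI wraps, here with a flat prior and an invented linear forward operator standing in for the travel-time physics (this is not the Arrenæs configuration or the toolbox's extended Metropolis sampler).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    G = rng.random((30, 10))                 # linear forward operator (rays x cells)
    m_true = rng.normal(size=10)
    d_obs = G @ m_true + 0.05 * rng.normal(size=30)

    def log_likelihood(m, sigma=0.05):
        r = d_obs - G @ m
        return -0.5 * np.sum((r / sigma) ** 2)

    m, logL, samples = np.zeros(10), log_likelihood(np.zeros(10)), []
    for _ in range(20000):
        m_prop = m + 0.02 * rng.normal(size=10)      # symmetric random-walk proposal
        logL_prop = log_likelihood(m_prop)
        if np.log(rng.random()) < logL_prop - logL:  # Metropolis acceptance rule
            m, logL = m_prop, logL_prop
        samples.append(m.copy())

    print(np.mean(samples[5000:], axis=0).round(2))  # posterior-mean estimate
    ```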

  7. Analyzing stochastic dependence of cognitive processes in multidimensional source recognition.

    PubMed

    Meiser, Thorsten

    2014-01-01

    Stochastic dependence among cognitive processes can be modeled in different ways, and the family of multinomial processing tree models provides a flexible framework for analyzing stochastic dependence among discrete cognitive states. This article presents a multinomial model of multidimensional source recognition that specifies stochastic dependence by a parameter for the joint retrieval of multiple source attributes together with parameters for stochastically independent retrieval. The new model is equivalent to a previous multinomial model of multidimensional source memory for a subset of the parameter space. An empirical application illustrates the advantages of the new multinomial model of joint source recognition. The new model allows for a direct comparison of joint source retrieval across conditions; it avoids statistical problems due to inflated confidence intervals and does not imply a conceptual imbalance between source dimensions. Model selection criteria that take model complexity into account corroborate the new model of joint source recognition.

  8. Balance Problems

    MedlinePlus

    ... often, it could be a sign of a balance problem. Balance problems can make you feel unsteady or as ... fall-related injuries, such as hip fracture. Some balance problems are due to problems in the inner ...

  9. Use of single-molecule spectroscopy to tackle fundamental problems in biochemistry: using studies on purple bacterial antenna complexes as an example.

    PubMed

    Cogdell, Richard J; Köhler, Jürgen

    2009-08-13

    Optical single-molecule techniques can be used in two modes to investigate fundamental questions in biochemistry, namely single-molecule detection and single-molecule spectroscopy. This review provides an overview of how single-molecule spectroscopy can be used to gain detailed information on the electronic structure of purple bacterial antenna complexes and to draw conclusions about the underlying physical structure. This information can be used to understand the energy-transfer reactions that are responsible for the earliest reactions in photosynthesis.

  10. The problems associated with the monitoring of complex workplace radiation fields at European high-energy accelerators and thermonuclear fusion facilities.

    PubMed

    Bilski, P; Blomgren, J; d'Errico, F; Esposito, A; Fehrenbacher, G; Fernàndez, F; Fuchs, A; Golnik, N; Lacoste, V; Leuschner, A; Sandri, S; Silari, M; Spurny, F; Wiegel, B; Wright, P

    2007-01-01

    The European Commission is funding within its Sixth Framework Programme a three-year project (2005-2007) called CONRAD, COordinated Network for RAdiation Dosimetry. The organisational framework for this project is provided by the European Radiation Dosimetry Group EURADOS. One task within the CONRAD project, Work Package 6 (WP6), was to provide a report outlining research needs and research activities within Europe to develop new and improved methods and techniques for the characterisation of complex radiation fields at workplaces around high-energy accelerators, as well as at the next generation of thermonuclear fusion facilities. The paper provides an overview of the report, which will be available as a CERN Yellow Report.

  11. Going beyond the hero in leadership development: the place of healthcare context, complexity and relationships: Comment on "Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?".

    PubMed

    Ford, Jackie

    2015-04-01

    There remains a conviction that the torrent of publications and the financial outlay on leadership development will create managers with the skills and characters of perfect leaders, capable of guiding healthcare organisations through the challenges and crises of the 21st century. The focus of much attention continues to be the search for the (illusory) core set of heroic qualities, abilities or competencies that will enable the development of leaders to achieve levels of supreme leadership and organisational performance. This brief commentary adds support to McDonald's (1) call for recognition of the complexity of the undertaking. PMID:25844391

  13. Interpolation Errors in Spectrum Analyzers

    NASA Technical Reports Server (NTRS)

    Martin, J. L.

    1996-01-01

    To obtain the proper measurement amplitude with a spectrum analyzer, the correct frequency-dependent transducer factor must be added to the voltage measured by the transducer. This report examines how entering transducer factors into a spectrum analyzer can cause significant errors in field amplitude due to the misunderstanding of the analyzer's interpolation methods. It also discusses how to reduce these errors to obtain a more accurate field amplitude reading.
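
    The pitfall is easy to reproduce: interpolating the same two transducer-factor entries linearly in frequency versus linearly in log-frequency gives corrections that differ by several dB in between. The values below are arbitrary examples, not those of the report.

    ```python
    import numpy as np

    f_pts = np.array([100e6, 1000e6])        # Hz, table entries
    tf_db = np.array([10.0, 30.0])           # dB, transducer factors

    f = 300e6                                # measurement frequency in between
    lin = np.interp(f, f_pts, tf_db)                          # linear in f
    log = np.interp(np.log10(f), np.log10(f_pts), tf_db)      # linear in log10(f)

    print(f"linear: {lin:.1f} dB, log-frequency: {log:.1f} dB, "
          f"difference: {abs(lin - log):.1f} dB")              # ~5 dB apart
    ```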

  14. Accuracy considerations of portable electrochemical NOX analyzers

    SciTech Connect

    Capetanopoulos, C.; Hobbs, B.

    1996-12-31

    Two key components contributing to measurement errors of electrochemical analyzers are discussed: the sample conditioning system and the electrochemical nitric oxide and nitrogen dioxide sensors. The problems associated with various types of conditioning systems are discussed, and some experimental results are presented using analyte spiking methods. Permeation-drier-based systems are shown to cause the smallest loss of the analyte. Two major problems of the NO and NO₂ sensors are examined. The first is the significant effect of temperature on the sensor and its associated interference rejection filter; the requirement for maintaining sensor and filter temperature below 30 °C is demonstrated. The second deals with saturation and drift caused by overexposure to the gas; the significance of capillary size in minimizing drift for diffusion sensors is discussed. Experimental results are presented and discussed with a view to the recently published EPA CTM-022 Method. 2 refs., 7 figs.

  15. Problem-Based Learning

    ERIC Educational Resources Information Center

    Allen, Deborah E.; Donham, Richard S.; Bernhardt, Stephen A.

    2011-01-01

    In problem-based learning (PBL), students working in collaborative groups learn by resolving complex, realistic problems under the guidance of faculty. There is some evidence of PBL effectiveness in medical school settings where it began, and there are numerous accounts of PBL implementation in various undergraduate contexts, replete with…

  16. The possibility of generating focal regions of complex configurations in application to the problems of stimulation of human receptor structures by focused ultrasound

    NASA Astrophysics Data System (ADS)

    Gavrilov, L. R.

    2008-03-01

    Studies of the stimulating effect of ultrasound on human receptor structures have recently become more intensive in connection with the development of promising robotic techniques and systems, sensors, and automated control systems, as well as with the use of taction in the design of a human-machine interface. One of the promising fields of research is the development of tactile displays for transmission of sensory data to a human by an acoustic method based on the effect of radiation pressure. In this case, it is necessary to generate rapidly changing patterns on a display (symbols, letters, digits, etc.), which may often have a complex shape. It is demonstrated that such patterns can be created by the generation of multiple-focus ultrasonic fields with the help of two-dimensional phased arrays whose elements are randomly positioned on the surface. The parameters for such an array are presented. It is shown that the arrays make it possible to form the regions of action by focused ultrasound with various necessary shapes and the sidelobe (or other secondary peak) intensity level acceptable for practical purposes. Using these arrays, it is possible to move the set of foci off the array axis to a distance of at least ±5 mm, which corresponds to the display dimensions. It is possible, on the screen of a tactile display, to generate the regions of action with a very complex shape, for example, Latin letters. This opportunity may be of interest, for example, for the development of systems that enable a blind person to perceive the displayed text information by using the sense of touch.
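
    The core of the focusing scheme is classical: each element is driven with a phase that cancels its propagation delay to the desired focus, so contributions add coherently there; multi-focus patterns superpose such drive vectors. The sketch below shows single-focus phasing for an invented geometry and frequency, not the paper's array.

    ```python
    import numpy as np

    c, f = 1500.0, 1.0e6                    # sound speed (m/s) and frequency (Hz), assumed
    k = 2 * np.pi * f / c                   # wavenumber

    rng = np.random.default_rng(3)
    elems = np.c_[0.04 * (rng.random((256, 2)) - 0.5),   # randomized element positions
                  np.zeros(256)]                          # in the z = 0 aperture plane
    focus = np.array([0.003, 0.0, 0.06])    # focal point 60 mm away, 3 mm off axis

    d = np.linalg.norm(elems - focus, axis=1)
    phases = (-k * d) % (2 * np.pi)         # drive phases canceling propagation delay

    # Field amplitude at the focus (spherical-wave superposition): fully coherent.
    p_focus = np.abs(np.sum(np.exp(1j * (k * d + phases)) / d))
    print(p_focus)                          # equals sum(1/d), the coherent maximum
    ```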

  17. Problem Solving in the Professions.

    ERIC Educational Resources Information Center

    Jackling, Noel; And Others

    1990-01-01

    It is proposed that algorithms and heuristics are useful in improving professional problem-solving abilities when contextualized within the academic discipline. A basic algorithm applied to problem solving in undergraduate engineering education and a similar algorithm applicable to legal problems are used as examples. Problem complexity and…

  18. Problem-Based Learning Tools

    ERIC Educational Resources Information Center

    Chin, Christine; Chia, Li-Gek

    2008-01-01

    One way of implementing project-based science (PBS) is to use problem-based learning (PBL), in which students formulate their own problems. These problems are often ill-structured, mirroring complex real-life problems where data are often messy and inconclusive. In this article, the authors describe how they used PBL in a ninth-grade biology class in…

  19. Implementation of a multidisciplinary approach to solve complex nano EHS problems by the UC Center for the Environmental Implications of Nanotechnology.

    PubMed

    Xia, Tian; Malasarn, Davin; Lin, Sijie; Ji, Zhaoxia; Zhang, Haiyuan; Miller, Robert J; Keller, Arturo A; Nisbet, Roger M; Harthorn, Barbara H; Godwin, Hilary A; Lenihan, Hunter S; Liu, Rong; Gardea-Torresdey, Jorge; Cohen, Yoram; Mädler, Lutz; Holden, Patricia A; Zink, Jeffrey I; Nel, Andre E

    2013-05-27

    UC CEIN was established with funding from the US National Science Foundation and the US Environmental Protection Agency in 2008 with the mission to study the impact of nanotechnology on the environment, including the identification of hazard and exposure scenarios that take into consideration the unique physicochemical properties of engineered nanomaterials (ENMs). Since its inception, the Center has made great progress in assembling a multidisciplinary team to develop the scientific underpinnings, research, knowledge acquisition, education, and outreach that are required for assessing the safe implementation of nanotechnology in the environment. This essay outlines the development of the infrastructure, protocols, and decision-making tools required to integrate complementary scientific disciplines and to gather knowledge in a complex study area that goes beyond the traditional safety and risk assessment protocols of the 20th century. UC CEIN's streamlined approach, premised on predictive hazard and exposure assessment methods, high-throughput discovery platforms, and environmental decision-making tools that consider a wide range of nano/bio interfaces in terrestrial and aquatic ecosystems, demonstrates a 21st-century approach to the safe implementation of nanotechnology in the environment.

  20. Balance Problems

    MedlinePlus

    About Balance Problems: Have you ever felt dizzy, lightheaded, or ... dizziness problem during the past year. Why Good Balance is Important: Having good balance means being able ...

  1. Nuclear fuel microsphere gamma analyzer

    DOEpatents

    Valentine, Kenneth H.; Long, Jr., Ernest L.; Willey, Melvin G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample in one station; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or according to other properties.

  2. A Stochastic Employment Problem

    ERIC Educational Resources Information Center

    Wu, Teng

    2013-01-01

    The Stochastic Employment Problem (SEP) is a variation of the Stochastic Assignment Problem which analyzes the scenario in which one assigns balls to boxes. Balls arrive sequentially with each one having a binary vector X = (X[subscript 1], X[subscript 2],...,X[subscript n]) attached, with the interpretation being that if X[subscript i] = 1 the ball…

  3. Market study: Whole blood analyzer

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A market survey was conducted to develop findings relative to the commercialization potential and key market factors of the whole blood analyzer which is being developed in conjunction with NASA's Space Shuttle Medical System.

  4. Molecular wake shield gas analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, J. H.

    1980-01-01

    Techniques for measuring and characterizing the ultrahigh vacuum in the wake of an orbiting spacecraft are studied. A high sensitivity mass spectrometer that contains a double mass analyzer consisting of an open source miniature magnetic sector field neutral gas analyzer and an identical ion analyzer is proposed. These are configured to detect and identify gas and ion species of hydrogen, helium, nitrogen, oxygen, nitric oxide, and carbon dioxide and any other gas or ion species in the 1 to 46 amu mass range. This range covers the normal atmospheric constituents. The sensitivity of the instrument is sufficient to measure ambient gases and ions with a particle density of the order of one per cc. A chemical pump, or getter, mounted near the entrance aperture of the neutral gas analyzer integrates the absorption of ambient gases for a selectable period of time for subsequent release and analysis. The sensitivity is realizable for all but rare gases using this technique.

  5. Six Questions on Complex Systems

    NASA Astrophysics Data System (ADS)

    Symons, John F.; Sanayei, Ali

    2011-09-01

    This paper presents an interview with John F. Symons on some important questions in "complex systems" and "complexity". In addition, he states some important open problems concerning complex systems in his research area from a philosophical point of view.

  6. Analyzing Media: Metaphors as Methodologies.

    ERIC Educational Resources Information Center

    Meyrowitz, Joshua

    Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…

  7. On-Demand Urine Analyzer

    NASA Technical Reports Server (NTRS)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimens that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  8. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, N.E.

    1984-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  9. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, Norman E.

    1985-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  10. Development of a Computational Platform for Complex Chemical Equilibria and Its Adaptation to Electrochemical and Constrained-Equilibrium Problems

    NASA Astrophysics Data System (ADS)

    Neron, Alex

    With the environment becoming a global issue, the energy efficiency sector is taking on an increasingly important role for companies, both economically and for the company's image. As a result, energy technology is a research niche in which ongoing projects are multiplying. One problem that frequently arises in some companies is measuring the composition of materials under hard-to-access conditions. This is the case, for example, of aluminum electrolysis, which takes place at very high temperatures. To overcome this problem, mathematical models must be created and validated to compute the equilibrium composition and properties of the chemical system. The overall objective of this research project is thus to develop a tool for computing complex chemical equilibria (several reactions and several phases) and to adapt it to electrochemical and constrained-equilibrium problems. More specifically, the computational platform must account for the temperature variation due to a gain or loss of energy by the system. It must also consider the limitation of equilibrium due to a reaction rate and, finally, solve electrochemical equilibrium problems. To achieve this, thermodynamic properties such as Gibbs free energy, fugacity and activity are first studied to better understand the molecular interactions that govern chemical equilibria. Next, an energy balance is incorporated into the computational platform, which makes it possible to compute the temperature at which the system is most stable as a function of an initial temperature and an amount of exchanged energy. Then, a kinetic constraint is added to the system in order to compute pseudo-stationary equilibria evolving over time. Moreover, the
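
    The core computation such a platform generalizes is equilibrium by Gibbs free energy minimization subject to element balances. A minimal hedged sketch, assuming an ideal-gas N2/H2/NH3 toy system at 298 K and 1 bar with textbook standard-state data and an invented feed (none of it drawn from the thesis itself):

```python
import numpy as np
from scipy.optimize import minimize

# Toy Gibbs minimization: find mole numbers n minimizing
# G(n) = sum_i n_i * (g0_i + R*T*ln(n_i/n_tot)) subject to element balances.
R, T = 8.314, 298.15
g0 = np.array([0.0, 0.0, -16450.0])          # G_f of N2, H2, NH3 [J/mol], standard tables
E = np.array([[2, 0, 1],                     # N atoms per molecule
              [0, 2, 3]])                    # H atoms per molecule
b = E @ np.array([1.0, 3.0, 0.0])            # element totals from a 1 N2 + 3 H2 feed

def gibbs(n):
    ntot = n.sum()
    return np.sum(n * (g0 + R * T * np.log(n / ntot)))   # ideal gas, P = 1 bar

res = minimize(gibbs, x0=np.array([0.5, 1.5, 1.0]),      # feasible starting guess
               bounds=[(1e-8, None)] * 3,                # keep mole numbers positive
               constraints={"type": "eq", "fun": lambda n: E @ n - b},
               method="SLSQP")
print("equilibrium moles [N2, H2, NH3]:", res.x.round(4))
```

    The thesis platform goes well beyond this sketch (multiple phases, energy balances, kinetic constraints, electrochemistry), but the constrained-minimization core is the same.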

  11. Real time infrared aerosol analyzer

    DOEpatents

    Johnson, Stanley A.; Reedy, Gerald T.; Kumar, Romesh

    1990-01-01

    Apparatus for analyzing aerosols in essentially real time includes a virtual impactor which separates coarse particles from fine and ultrafine particles in an aerosol sample. The coarse and ultrafine particles are captured in PTFE filters, and the fine particles impact onto an internal light reflection element. The composition and quantity of the particles on the PTFE filter and on the internal reflection element are measured by alternately passing infrared light through the filter and the internal light reflection element, and analyzing the light through infrared spectrophotometry to identify the particles in the sample.

  12. Analyzing epithelial and endothelial kisses in Merida

    PubMed Central

    Nusrat, Asma; Quiros, Miguel; González-Mariscal, Lorenza

    2013-01-01

    Last November a group of principal investigators, postdoctoral fellows and PhD students from around the world got together in the city of Merida in Southeastern Mexico at a State of the Art meeting on the "Molecular structure and function of the apical junctional complex in epithelia and endothelia." They analyzed diverse tissue barriers, including those in the gastrointestinal tract, the blood brain barrier, and the blood neural and blood retinal barriers. The talks revealed exciting new findings in the field, novel technical approaches and unpublished data, and highlighted the importance of studying junctional complexes to better understand the pathogenesis of several diseases and to develop therapeutic approaches that can be utilized for drug delivery. This meeting report has the purpose of highlighting the results and advances discussed by the speakers at the Merida Meeting.

  13. Structural qualia: a solution to the hard problem of consciousness

    PubMed Central

    Loorits, Kristjan

    2014-01-01

    The hard problem of consciousness has often been claimed to be unsolvable by the methods of traditional empirical sciences. It has been argued that all the objects of empirical sciences can be fully analyzed in structural terms but that consciousness is (or has) something over and above its structure. However, modern neuroscience has introduced a theoretical framework in which even the apparently non-structural aspects of consciousness, namely the so-called qualia or qualitative properties, can be analyzed in structural terms. That framework allows us to see qualia as something compositional, with internal structures that fully determine their qualitative nature. Moreover, those internal structures can be identified with certain neural patterns. Thus consciousness as a whole can be seen as a complex neural pattern that misperceives some of its own highly complex structural properties as monadic and qualitative. Such a neural pattern is analyzable in fully structural terms, and thereby the hard problem is solved. PMID:24672510

  14. Nonlinear Single-Spin Spectrum Analyzer

    NASA Astrophysics Data System (ADS)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-01

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compare the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  15. Using SCR methods to analyze requirements documentation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. This formal and systematic application of the SCR requirements methods yields insights as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the OC (Operations Concept) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.
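
    To make the mode-chart idea concrete: a chart pairs modes with triggering events, and two of the mechanical checks SCR-style tables enable are completeness (every mode/event pair has a successor) and consistency (no pair has more than one). A toy sketch in Python, with an invented transition table rather than anything from the ECS documents:

```python
# Toy SCR-style mode chart check: flag mode/event pairs with no successor
# (incompleteness) or more than one (nondeterminism/inconsistency).
transitions = {
    ("Standby", "start_cmd"): ["Acquiring"],
    ("Acquiring", "data_ok"): ["Operational"],
    ("Acquiring", "timeout"): ["Standby", "SafeHold"],   # two successors: inconsistent
    ("Operational", "fault"): ["SafeHold"],
}
events = {"start_cmd", "data_ok", "timeout", "fault"}

for mode in sorted({m for m, _ in transitions}):
    for ev in sorted(events):
        nxt = transitions.get((mode, ev), [])
        if len(nxt) == 0:
            print(f"incomplete: no transition for ({mode}, {ev})")
        elif len(nxt) > 1:
            print(f"inconsistent: ({mode}, {ev}) -> {nxt}")
```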

  16. Software-Design-Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, an update of CRISP-80, is a set of computer programs constituting a software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  17. Strategies for Analyzing Tone Languages

    ERIC Educational Resources Information Center

    Coupe, Alexander R.

    2014-01-01

    This paper outlines a method of auditory and acoustic analysis for determining the tonemes of a language starting from scratch, drawing on the author's experience of recording and analyzing tone languages of north-east India. The methodology is applied to a preliminary analysis of tone in the Thang dialect of Khiamniungan, a virtually undocumented…

  18. Converting general nonlinear programming problems into separable programming problems with feedforward neural networks.

    PubMed

    Lu, Bao-Liang; Ito, Koji

    2003-09-01

    In this paper we present a method for converting general nonlinear programming (NLP) problems into separable programming (SP) problems by using feedforward neural networks (FNNs). The basic idea behind the method is to use two useful features of FNNs: their ability to approximate arbitrary continuous nonlinear functions with a desired degree of accuracy and their ability to express nonlinear functions in terms of parameterized compositions of functions of single variables. According to these two features, any nonseparable objective functions and/or constraints in NLP problems can be approximately expressed as separable functions with FNNs. Therefore, any NLP problems can be converted into SP problems. The proposed method has three prominent features. (a) It is more general than existing transformation techniques; (b) it can be used to formulate optimization problems as SP problems even when their precise analytic objective function and/or constraints are unknown; (c) the SP problems obtained by the proposed method may highly facilitate the selection of grid points for piecewise linear approximation of nonlinear functions. We analyze the computational complexity of the proposed method and compare it with an existing transformation approach. We also present several examples to demonstrate the method and the performance of the simplex method with the restricted basis entry rule for solving SP problems.
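
    The conversion rests on the fact that a one-hidden-layer network computes f(x) ≈ Σ_j v_j σ(Σ_i w_ij x_i + b_j): the inner sums are linear (hence separable) in the original variables, and σ acts on single variables. A hedged sketch of this decomposition in Python, assuming a toy tanh network trained by plain gradient descent on the nonseparable target x1·x2 (the paper's own experiments and training details are not reproduced here):

```python
import numpy as np

# Fit a one-hidden-layer tanh network to the nonseparable target x1*x2,
# then evaluate it in its separable form v . tanh(W^T x + b).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 2))           # training inputs
t = X[:, 0] * X[:, 1]                            # nonseparable target

H = 16                                           # hidden units
W = rng.normal(0, 1.0, size=(2, H))              # input -> hidden weights
b = np.zeros(H)
v = rng.normal(0, 0.1, size=H)                   # hidden -> output weights

lr = 0.05
for _ in range(5000):
    Z = X @ W + b                                # z_j = sum_i w_ij x_i + b_j (linear, separable)
    A = np.tanh(Z)                               # sigma(z_j): functions of single variables
    err = A @ v - t
    gv = A.T @ err / len(X)                      # backprop for mean squared error
    gA = np.outer(err, v) * (1 - A**2)
    gW = X.T @ gA / len(X)
    gb = gA.mean(axis=0)
    v -= lr * gv; W -= lr * gW; b -= lr * gb

# Separable form: f(x) ~ sum_j v_j * tanh(z_j), each z_j linear in x.
x = np.array([0.3, -0.7])
print("separable approx:", np.tanh(x @ W + b) @ v, " true:", x[0] * x[1])
```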

  19. Using Problem-Based Learning in Accounting

    ERIC Educational Resources Information Center

    Hansen, James D.

    2006-01-01

    In this article, the author describes the process of writing a problem-based learning (PBL) problem and shows how a typical end-of-chapter accounting problem can be converted to a PBL problem. PBL uses complex, real-world problems to motivate students to identify and research the concepts and principles they need to know to solve these problems.…

  20. Clustering-led complex brain networks approach.

    PubMed

    Liu, Dazhong; Zhong, Ning

    2014-01-01

    This paper reviews the meaning of the statistical indices and the properties of complex network models and their physiological explanation. By analyzing existing problems and construction strategies, it attempts to construct complex brain networks from a different point of view: that of clustering first and constructing the brain network second. A clustering-guided (or clustering-led) construction strategy for complex brain networks is proposed. The research focuses on the task-induced brain network. To discover different networks in a single run, a combined-clusters method is applied. Afterwards, a complex local brain network is formed with a complex network method on voxels. In a real test dataset, it was found that the network had small-world characteristics and no significant scale-free properties. Meanwhile, some key bridge nodes and their characteristics were identified in the local network by calculating the betweenness centrality.
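
    A hedged sketch of the graph-theoretic measurements the paper relies on (clustering coefficient, path length, betweenness centrality), using the networkx library on a toy correlation graph built from synthetic "voxel" time series; the paper's actual clustering-led pipeline and fMRI data are not reproduced:

```python
import numpy as np
import networkx as nx

# Build a toy network: threshold pairwise correlations of fake voxel series.
rng = np.random.default_rng(1)
ts = rng.normal(size=(30, 200))                 # 30 voxels x 200 time points
ts[:10] += rng.normal(size=200) * 0.8           # shared signal -> one "cluster"

C = np.corrcoef(ts)
adj = (np.abs(C) > 0.3) & ~np.eye(30, dtype=bool)
G = nx.from_numpy_array(adj.astype(int))

if nx.is_connected(G):
    # Small-world flavor: high clustering with short average paths.
    print("clustering:", nx.average_clustering(G))
    print("avg path length:", nx.average_shortest_path_length(G))

# "Bridge" nodes: highest betweenness centrality.
bc = nx.betweenness_centrality(G)
print("top bridge nodes:", sorted(bc, key=bc.get, reverse=True)[:3])
```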

  1. Walking Problems

    MedlinePlus

    ... daily activities, get around, and exercise. Having a problem with walking can make daily life more difficult. ... walk is called your gait. A variety of problems can cause an abnormal gait and lead to ...

  2. Breathing Problems

    MedlinePlus

    ... re not getting enough air. Sometimes mild breathing problems are from a stuffy nose or hard exercise. ... emphysema or pneumonia cause breathing difficulties. So can problems with your trachea or bronchi, which are part ...

  3. Joint Problems

    MedlinePlus

    ... ankles and toes. Other types of arthritis include gout or pseudogout. Sometimes, there is a mechanical problem ... for more information on osteoarthritis, rheumatoid arthritis and gout. How Common are Joint Problems? Osteoarthritis, which affects ...

  4. The Statistical Loop Analyzer (SLA)

    NASA Technical Reports Server (NTRS)

    Lindsey, W. C.

    1985-01-01

    The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal drop out tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.

  5. Portable imaging polarized light analyzer

    NASA Astrophysics Data System (ADS)

    Shashar, Nadav; Cronin, Thomas W.; Johnson, George; Wolff, Lawrence B.

    1995-06-01

    Many animals, both marine and terrestrial, are sensitive to the orientation of the e-vector of partially linearly polarized light (PLPL). This sensitivity is used for navigation, spatial orientation, and detection of large bodies of water. However, it is not clear what other information animals may receive from polarized light. Natural light fields, both in the sky and underwater, are known to be partially polarized. Additionally, natural objects reflect light that is polarized at specific orientations. Sensors capable of measuring the characteristics of PLPL, namely partial polarization and orientation, throughout an image are not yet available. By placing 2 twisted nematic liquid crystals (TNLCs) and a fixed polarizing filter in series in front of a video camera, and by controlling the angles of rotation of the orientation of polarization produced by the TNLCs, we are able to fully analyze PLPL throughout a full image on a single pixel basis. As a recording device we use a small camcorder. The sensor can be operated autonomously, with the images analyzed at a later stage, or it can be connected (in a future phase) via a frame grabber to a personal computer which analyzes the information online. The analyzed image can be presented as a false color image, where hue represents orientation of polarization and saturation represents partial polarization. Field measurements confirm that PLPL is a characteristic distributed both under water and on land. Marine background light is strongly horizontally polarized. Light reflected from leaves is polarized mainly according to their spatial orientation. Differences between PLPL reflected from objects or animals and their background can be used to enhance contrast and break color camouflage. Our sensor presents a new approach for answering questions related to the ecology of vision and is a new tool for remote sensing.
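
    The false-color rendering described above is straightforward to express in code. A hedged sketch, assuming synthetic Stokes-style images rather than TNLC sensor output: hue encodes the e-vector orientation, saturation the partial polarization, and value the intensity.

```python
import numpy as np
from matplotlib.colors import hsv_to_rgb

# Synthetic placeholder Stokes images (S0 = intensity, S1/S2 = linear polarization).
h, w = 64, 64
S0 = np.ones((h, w))
S1 = np.fromfunction(lambda y, x: np.cos(x / 10), (h, w))
S2 = np.fromfunction(lambda y, x: np.sin(y / 10), (h, w))

dolp = np.clip(np.hypot(S1, S2) / S0, 0, 1)     # degree of linear polarization
aop = 0.5 * np.arctan2(S2, S1)                  # angle of polarization, in [-pi/2, pi/2]

hsv = np.stack([(aop + np.pi / 2) / np.pi,      # hue: orientation, mapped to [0, 1]
                dolp,                           # saturation: partial polarization
                S0 / S0.max()], axis=-1)        # value: intensity
rgb = hsv_to_rgb(hsv)                           # (h, w, 3) false-color image
```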

  6. Satellite-based interference analyzer

    NASA Technical Reports Server (NTRS)

    Varice, H.; Johannsen, K.; Sabaroff, S.

    1977-01-01

    System identifies terrestrial sources of radiofrequency interference and measures their frequency spectra and amplitudes. Designed to protect satellite communication networks, system measures entire noise spectrum over selected frequency band and can raster-scan geographical region to locate noise sources. Once interference is analyzed, realistic interference protection ratios are determined and mathematical models for predicting radio-frequency noise spectra are established. This enhances signal detection and locates optimum geographical positions and frequency bands for communication equipment.

  7. DEEP WATER ISOTOPIC CURRENT ANALYZER

    DOEpatents

    Johnston, W.H.

    1964-04-21

    A deepwater isotopic current analyzer, which employs radioactive isotopes for measurement of ocean currents at various levels beneath the sea, is described. The apparatus, which can determine the direction and velocity of liquid currents, comprises a shaft having a plurality of radiation detectors extending equidistant radially therefrom, means for releasing radioactive isotopes from the shaft, and means for determining the time required for the isotope to reach a particular detector. (AEC)

  8. Analyzing ion distributions around DNA.

    PubMed

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882
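
    In the simplest case, converting per-frame ion counts in a cylindrical shell to molarity needs only the shell volume and Avogadro's number. A hedged toy sketch, assuming random coordinates around an idealized straight axis (the paper's method instead uses curvilinear helicoidal coordinates that follow the actual DNA axis):

```python
import numpy as np

# Radial ion density around the z-axis, reported in mol/L.
N_A = 6.022e23
n_frames, n_ions = 1000, 50
rng = np.random.default_rng(3)
xyz = rng.uniform(-30, 30, size=(n_frames, n_ions, 3))   # fake K+ coords [Angstrom]

r = np.hypot(xyz[..., 0], xyz[..., 1]).ravel()           # distance from the axis
edges = np.arange(0, 31, 1.0)                            # 1 Angstrom radial bins
counts, _ = np.histogram(r, bins=edges)

L = 60.0                                                 # analyzed axial length [Angstrom]
shell_vol = np.pi * (edges[1:]**2 - edges[:-1]**2) * L   # Angstrom^3 per shell
# 1 Angstrom^3 = 1e-27 L; molarity = (mean ions per frame) / (N_A * volume in liters)
molarity = counts / n_frames / (shell_vol * 1e-27) / N_A
print(molarity.round(3))
```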

  9. Analyzing ion distributions around DNA

    PubMed Central

    Lavery, Richard; Maddocks, John H.; Pasi, Marco; Zakrzewska, Krystyna

    2014-01-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882

  10. Remote Laser Diffraction PSD Analyzer

    SciTech Connect

    T. A. Batcheller; G. M. Huestis; S. M. Bolton

    2000-06-01

    Particle size distribution (PSD) analysis of radioactive slurry samples was performed using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  11. Remote Laser Diffraction PSD Analyzer

    SciTech Connect

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analysis of radioactive slurry samples was performed using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  12. The complex structured singular value

    NASA Technical Reports Server (NTRS)

    Packard, A.; Doyle, J.

    1993-01-01

    A tutorial introduction to the complex structured singular value (mu) is presented, with an emphasis on the mathematical aspects of mu. The mu-based methods discussed here have been useful for analyzing the performance and robustness properties of linear feedback systems. Several tests for robust stability and performance with computable bounds for transfer functions and their state space realizations are compared, and a simple synthesis problem is studied. Uncertain systems are represented using linear fractional transformations which naturally unify the frequency-domain and state space methods.
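
    For reference, the standard definition used in this literature (the usual one, not a quotation from this particular tutorial): for a complex matrix M and a prescribed block structure of uncertainties,

```latex
\mu_{\boldsymbol{\Delta}}(M) \;=\;
\left( \min \left\{ \bar{\sigma}(\Delta) \;:\; \Delta \in \boldsymbol{\Delta},\;
\det(I - M\Delta) = 0 \right\} \right)^{-1},
\qquad
\mu_{\boldsymbol{\Delta}}(M) := 0 \ \text{if no such } \Delta \text{ exists.}
```

    Robust stability and performance tests then reduce to bounding this quantity over frequency.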

  13. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    PubMed

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists, and probably the most crucial one, is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
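
    A hedged sketch of the split/analyze/meta-analyze idea (the paper demonstrates it in R; this toy uses Python, a correlation as the per-chunk statistic, and a fixed-effect inverse-variance pool on Fisher's z):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(size=n)                # true r is about 0.29

# 1. Split the "big" dataset into manageable chunks.
chunks = np.array_split(np.column_stack([x, y]), 100)

# 2. Analyze each chunk independently (embarrassingly parallel).
zs, ws = [], []
for c in chunks:
    r = np.corrcoef(c[:, 0], c[:, 1])[0, 1]
    zs.append(np.arctanh(r))                    # Fisher z transform
    ws.append(len(c) - 3)                       # inverse-variance weight: var(z) = 1/(n-3)

# 3. Meta-analyze: inverse-variance weighted mean, back-transformed.
z_pooled = np.average(zs, weights=ws)
print("pooled r:", np.tanh(z_pooled))
```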

  14. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    PubMed

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists, and probably the most crucial one, is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  15. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    PubMed Central

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  16. Analyzing delay causes in Egyptian construction projects.

    PubMed

    Marzouk, Mohamed M; El-Rasas, Tarek I

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and the top ten delay causes of construction projects in Egypt are determined from the highest values. A case study is analyzed and compared against the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, although there are significant differences between them for some delay causes; finally, a roadmap for prioritizing delay-cause groups is presented.
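
    One common formulation of these indices (hedged: the rating scale and responses below are invented, and the paper's exact formulas may differ) is FI = 100·Σa/(A·N) over frequency ratings a on a 1..A scale from N respondents, SI likewise over severity ratings, and II = FI·SI/100:

```python
import numpy as np

# Made-up survey responses for a single delay cause from 8 respondents.
A = 4                                               # top of the rating scale
freq_ratings = np.array([4, 3, 3, 2, 4, 3, 1, 2])   # "how often does this cause delay?"
sev_ratings  = np.array([3, 4, 2, 3, 4, 4, 2, 3])   # "how severe is the delay?"

FI = 100 * freq_ratings.sum() / (A * len(freq_ratings))   # Frequency Index
SI = 100 * sev_ratings.sum() / (A * len(sev_ratings))     # Severity Index
II = FI * SI / 100                                        # Importance Index

print(f"FI={FI:.1f}  SI={SI:.1f}  II={II:.1f}")     # rank causes by II
```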

  17. [Current methods for preparing samples on working with hematology analyzers].

    PubMed

    Tsyganova, A V; Pogorelov, V M; Naumova, I N; Kozinets, G I; Antonov, V S

    2011-03-01

    The paper raises a problem of preparing samples in hematology. It considers whether the preanalytical stage is of importance in hematological studies. The use of disposable vacuum blood collection systems is shown to solve the problem of standardizing the blood sampling procedure. The benefits of the use of closed-tube hematology analyzers are also considered. PMID:21584966

  18. Light-Emitting Diodes: Solving Complex Problems

    ERIC Educational Resources Information Center

    Planinšic, Gorazd; Etkina, Eugenia

    2015-01-01

    This is the fourth paper in our Light-Emitting Diodes series. The series aims to create a systematic library of LED-based materials and to provide readers with the description of experiments and the pedagogical treatment that would help their students construct, test, and apply physics concepts and mathematical relations. The first paper provided…

  19. Euler's Three-Body Problem.

    ERIC Educational Resources Information Center

    Wild, Walter J.

    1980-01-01

    Discusses the simplest three-body problem, known as Euler's problem. The article, intended for students in the undergraduate mathematics and physics curricula, shows how the complex equations for a specific three-body problem can be solved on a small calculator. (HM)
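
    Euler's problem, a test mass moving in the field of two fixed gravitating centers, is also easy to explore numerically today. A hedged sketch with arbitrary demo parameters (dimensionless units, G = 1), using scipy's initial-value solver rather than the article's calculator approach:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two fixed centers on the x-axis; the test mass moves in their combined field.
mu1, mu2 = 1.0, 0.5                              # G*m for each fixed center
c1, c2 = np.array([-1.0, 0.0]), np.array([1.0, 0.0])

def rhs(t, s):
    pos, vel = s[:2], s[2:]
    d1, d2 = pos - c1, pos - c2
    acc = -mu1 * d1 / np.linalg.norm(d1)**3 - mu2 * d2 / np.linalg.norm(d2)**3
    return np.concatenate([vel, acc])

# State: [x, y, vx, vy]; demo initial conditions only.
sol = solve_ivp(rhs, (0, 50), [0.0, 1.5, 0.8, 0.0],
                rtol=1e-9, atol=1e-9, dense_output=True)
print("final position:", sol.y[:2, -1])
```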

  20. ITK and ANALYZE: a synergistic integration

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Robb, Richard A.

    2004-05-01

    The Insight Toolkit (ITK) is a C++ open-source software toolkit developed under sponsorship of the National Library of Medicine. It provides advanced algorithms for performing image registration and segmentation, but does not provide support for visualization and analysis, nor does it offer any graphical user interface (GUI). The purpose of this integration project is to make ITK readily accessible to end-users with little or no programming skills, and provide interactive processing, visualization and measurement capabilities. This is achieved through the integration of ITK with ANALYZE, a multi-dimension image visualization/analysis application installed in over 300 institutions around the world, with a user-base in excess of 4000. This integration is carried out at both the software foundation and GUI levels. The foundation technology upon which ANALYZE is built is a comprehensive C-function library called AVW. A new set of AVW-ITK functions have been developed and integrated into the AVW library, and four new ITK modules have been added to the ANALYZE interface. Since ITK is a software developer's toolkit, the only way to access its intrinsic power is to write programs that incorporate it. Integrating ITK with ANALYZE opens the ITK algorithms to end-users who otherwise might never be able to take advantage of the toolkit's advanced functionality. In addition, this integration provides end-to-end interactive problem solving capabilities which allow all users, including programmers, an integrated system to readily display and quantitatively evaluate the results from the segmentation and registration routines in ITK, regardless of the type or format of input images, which are comprehensively supported in ANALYZE.
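
    As a flavor of what the integrated ITK routines do for an end-user, here is a hedged sketch of a simple intensity-based registration, written against the SimpleITK Python wrapping rather than the AVW-ITK C bindings the paper describes; the file names are placeholders:

```python
import SimpleITK as sitk

# Load two images and register the moving one onto the fixed one.
fixed = sitk.ReadImage("fixed.nii", sitk.sitkFloat32)
moving = sitk.ReadImage("moving.nii", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMeanSquares()                      # simple intensity-difference metric
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetInitialTransform(sitk.TranslationTransform(fixed.GetDimension()))
reg.SetInterpolator(sitk.sitkLinear)
transform = reg.Execute(fixed, moving)

# Resample the moving image into the fixed image's space and save it.
resampled = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(resampled, "registered.nii")
```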