Science.gov

Sample records for analyzing complex problems

  1. The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems

    ERIC Educational Resources Information Center

    Andrews, Paul W.; Thomson, J. Anderson, Jr.

    2009-01-01

Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination (which involves analysis) and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…

  2. The bright side of being blue: Depression as an adaptation for analyzing complex problems

    PubMed Central

    Andrews, Paul W.; Thomson, J. Anderson

    2009-01-01

    Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression. PMID:19618990

  3. A new approach for analyzing average time complexity of population-based evolutionary algorithms on unimodal problems.

    PubMed

    Chen, Tianshi; He, Jun; Sun, Guangzhong; Chen, Guoliang; Yao, Xin

    2009-10-01

In the past decades, many theoretical results related to the time complexity of evolutionary algorithms (EAs) on different problems have been obtained. However, there is no general and easy-to-apply approach designed particularly for population-based EAs on unimodal problems. In this paper, we first generalize the concept of the takeover time to EAs with mutation, then we utilize the generalized takeover time to obtain the mean first hitting time of EAs and, thus, propose a general approach for analyzing EAs on unimodal problems. As examples, we consider the so-called (N + N) EAs and show that, on two well-known unimodal problems, LeadingOnes and OneMax, the EAs with bitwise mutation and two commonly used selection schemes need O(n ln n + n^2/N) and O(n ln ln n + n ln n/N) generations, respectively, to find the global optimum. Beyond these new results, our approach can also be applied directly to obtain results for some population-based EAs on some other unimodal problems. Moreover, we discuss when the general approach is valid to provide tight bounds on the mean first hitting times and when it should be combined with problem-specific knowledge to get tight bounds. This is the first time a general idea for analyzing population-based EAs on unimodal problems has been discussed theoretically.
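    For orientation, here is a minimal (N + N) EA on OneMax in Python; truncation selection stands in for the paper's selection schemes, and all names and parameter values are illustrative, not taken from the paper.

    ```python
    import random

    def onemax(x):
        """OneMax fitness: the number of 1-bits; optimum is the all-ones string."""
        return sum(x)

    def nn_ea(n=50, N=10, max_gens=100_000, seed=0):
        """Minimal (N + N) EA sketch: N parents produce N offspring by bitwise
        mutation (each bit flips with probability 1/n); the best N of the
        combined 2N individuals survive (truncation selection)."""
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(N)]
        for gen in range(max_gens):
            if any(onemax(x) == n for x in pop):
                return gen                    # generations until the optimum
            offspring = [[b ^ (rng.random() < 1.0 / n) for b in parent]
                         for parent in pop]
            pop = sorted(pop + offspring, key=onemax, reverse=True)[:N]
        return max_gens

    print(nn_ea())
    ```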

  4. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

According to Complexity Theory, there is complexity in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory on the basis of Axiomatic Design. However, they focus on reducing complexity, and none addresses a method for analyzing the complexity in the system. Therefore, this paper puts forth a method of analyzing complexity that seeks to make up for this deficiency. To discuss the method of analyzing complexity based on additional effect, the paper introduces two concepts: ideal effect and additional effect. The method combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ) and helps designers analyze complexity by means of additional effects. A case study shows the application of the process.

  5. On a Procedure for Analyzing Certain Problems of Diffusion Theory.

    DTIC Science & Technology

PARTIAL DIFFERENTIAL EQUATIONS, DIFFUSION, BOUNDARY VALUE PROBLEMS, INTEGRAL TRANSFORMS, COMPLEX VARIABLES, CONDUCTION(HEAT TRANSFER), ELECTRICAL CONDUCTIVITY, FLUID FLOW, BESSEL FUNCTIONS

  6. Analyzing Adversaries as Complex Adaptive Systems

    DTIC Science & Technology

    2006-10-01

reflecting the general population’s sympathy (support) for the terrorist’s cause, is depressed as the terrorist attack magnitude increases, as shown in… Cowan, George A., Pines, David, Meltzer, David, eds., 1994, Complexity: Metaphors, Models, and Reality, Reading, Massachusetts: Addison-Wesley

  7. Analyzing and Detecting Problems in Systems of Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Ackermann, Christopher; Stratton, William C.; Sibol, Deane E.; Godfrey, Sally

    2008-01-01

Many software systems are evolving into complex systems of systems (SoS) for which inter-system communication is mission-critical. Evidence indicates that transmission failures and performance issues are not uncommon occurrences. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach addressing such problems. In this paper, we present an approach for analyzing inter-system communications with the goal of uncovering both transmission errors and performance problems. Our approach consists of a visualization component and an evaluation component. While the visualization of the observed communication aims to facilitate understanding, the evaluation component automatically checks the conformance of an observed communication (actual) to a desired one (planned). The actual and the planned communications are represented as sequence diagrams, and the evaluation algorithm checks the conformance of the actual diagram to the planned one. We have applied our approach to the communication of aerospace systems and were successful in detecting and resolving even subtle and long-standing transmission problems.
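    The conformance idea can be illustrated in a few lines of Python: represent each sequence diagram as an ordered list of messages and report where the observed trace diverges from the planned one. This is a simplified stand-in for the paper's evaluation algorithm; the message triples and names are invented.

    ```python
    def check_conformance(planned, actual):
        """Compare an observed (actual) message trace against a planned one.
        Each message is a (sender, receiver, label) triple; returns the list
        of deviations found."""
        deviations = []
        for i, (plan, act) in enumerate(zip(planned, actual)):
            if plan != act:
                deviations.append((i, plan, act))
        if len(actual) < len(planned):
            deviations.append(("missing", planned[len(actual):], None))
        elif len(actual) > len(planned):
            deviations.append(("unexpected", None, actual[len(planned):]))
        return deviations

    planned = [("GroundStation", "Satellite", "CMD_UPLINK"),
               ("Satellite", "GroundStation", "ACK")]
    actual  = [("GroundStation", "Satellite", "CMD_UPLINK"),
               ("Satellite", "GroundStation", "NAK")]
    print(check_conformance(planned, actual))  # flags ACK vs NAK at index 1
    ```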

  8. Software Analyzes Complex Systems in Real Time

    NASA Technical Reports Server (NTRS)

    2008-01-01

Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts, related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real-time and limited-hardware environments, and to be utilized by non-expert-system applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Neptune Voyager Encounter. It is also finding applications outside of the Space Agency.

  9. Quantum Computing: Solving Complex Problems

    ScienceCinema

DiVincenzo, David [IBM Watson Research Center]

    2016-07-12

    One of the motivating ideas of quantum computation was that there could be a new kind of machine that would solve hard problems in quantum mechanics. There has been significant progress towards the experimental realization of these machines (which I will review), but there are still many questions about how such a machine could solve computational problems of interest in quantum physics. New categorizations of the complexity of computational problems have now been invented to describe quantum simulation. The bad news is that some of these problems are believed to be intractable even on a quantum computer, falling into a quantum analog of the NP class. The good news is that there are many other new classifications of tractability that may apply to several situations of physical interest.

  10. Analyzing the Origins of Childhood Externalizing Behavioral Problems

    ERIC Educational Resources Information Center

    Barnes, J. C.; Boutwell, Brian B.; Beaver, Kevin M.; Gibson, Chris L.

    2013-01-01

    Drawing on a sample of twin children from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B; Snow et al., 2009), the current study analyzed 2 of the most prominent predictors of externalizing behavioral problems (EBP) in children: (a) parental use of spankings and (b) childhood self-regulation. A variety of statistical techniques were…

  11. Quantifying and analyzing the network basis of genetic complexity.

    PubMed

    Thompson, Ethan G; Galitski, Timothy

    2012-01-01

Genotype-to-phenotype maps exhibit complexity. This genetic complexity is mentioned frequently in the literature, but a consistent and quantitative definition is lacking. Here, we derive such a definition and investigate its consequences for model genetic systems. The definition equates genetic complexity with a surplus of genotypic diversity over phenotypic diversity. Applying this definition to ensembles of Boolean network models, we found that the in-degree distribution and the number of periodic attractors produced determine the relative complexity of different topology classes. We found evidence that networks that are difficult to control, or that exhibit a hierarchical structure, are genetically complex. We analyzed the complexity of the cell cycle network of Saccharomyces cerevisiae and pinpointed genes and interactions that are most important for its high genetic complexity. The rigorous definition of genetic complexity is a tool for unraveling the structure and properties of genotype-to-phenotype maps by enabling the quantitative comparison of the relative complexities of different genetic systems. The definition also allows the identification of specific network elements and subnetworks that have the greatest effects on genetic complexity. Moreover, it suggests ways to engineer biological systems with desired genetic properties.
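    One toy way to read the definition (our illustration, not the paper's ensembles or its exact measure): enumerate a tiny genotype space of two-node Boolean networks, take the phenotype to be the set of attractors, and report the surplus of genotypic over phenotypic diversity in bits.

    ```python
    from itertools import product
    from math import log2

    def attractors(update, n):
        """Set of periodic attractors of a synchronous Boolean network with
        n nodes, found by iterating the update map from every initial state."""
        found = set()
        for state in product((0, 1), repeat=n):
            seen, s = {}, state
            while s not in seen:
                seen[s] = len(seen)
                s = update(s)
            cycle_start = seen[s]   # first repeated state opens the cycle
            found.add(frozenset(t for t, i in seen.items() if i >= cycle_start))
        return frozenset(found)

    # Toy genotype space: all pairs of 1-input Boolean rules for a two-node
    # network in which each node reads the other node.
    RULES = {0: lambda x: 0, 1: lambda x: 1, 2: lambda x: x, 3: lambda x: 1 - x}
    genotypes = list(product(RULES, repeat=2))
    phenotypes = [attractors(lambda s, g=g: (RULES[g[0]](s[1]),
                                             RULES[g[1]](s[0])), 2)
                  for g in genotypes]

    # Surplus of genotypic over phenotypic diversity, in bits.
    surplus = log2(len(genotypes)) - log2(len(set(phenotypes)))
    print(f"{len(genotypes)} genotypes collapse onto "
          f"{len(set(phenotypes))} phenotypes; surplus = {surplus:.2f} bits")
    ```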

  12. Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map

    ERIC Educational Resources Information Center

    Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng

    2004-01-01

This paper proposes a methodology to calculate both the difficulty of basic problems and the difficulty of solving a problem. The method for calculating problem difficulty is based on the process of constructing a problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…

  13. Complex Problem Solving in a Workplace Setting.

    ERIC Educational Resources Information Center

    Middleton, Howard

    2002-01-01

    Studied complex problem solving in the hospitality industry through interviews with six office staff members and managers. Findings show it is possible to construct a taxonomy of problem types and that the most common approach can be termed "trial and error." (SLD)

  14. The Process of Solving Complex Problems

    ERIC Educational Resources Information Center

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  15. Analyzing Quadratic Unconstrained Binary Optimization Problems Via Multicommodity Flows.

    PubMed

    Wang, Di; Kleinberg, Robert D

    2009-11-28

Quadratic Unconstrained Binary Optimization (QUBO) problems concern the minimization of quadratic polynomials in n {0, 1}-valued variables. These problems are NP-complete, but prior work has identified a sequence of polynomial-time computable lower bounds on the minimum value, denoted by C_2, C_3, C_4, …. It is known that C_2 can be computed by solving a maximum-flow problem, whereas the only previously known algorithms for computing C_k (k > 2) require solving a linear program. In this paper we prove that C_3 can be computed by solving a maximum multicommodity flow problem in a graph constructed from the quadratic function. In addition to providing a lower bound on the minimum value of the quadratic function on {0, 1}^n, this multicommodity flow problem also provides some information about the coordinates of the point where this minimum is achieved. By looking at the edges that are never saturated in any maximum multicommodity flow, we can identify relational persistencies: pairs of variables that must have the same or different values in any minimizing assignment. We furthermore show that all of these persistencies can be detected by solving single-commodity flow problems in the same network.
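    For concreteness, a brute-force sketch of the QUBO problem itself (exponential time, tiny instances only; it states the problem but does not implement the paper's flow-based bounds C_k).

    ```python
    import itertools
    import numpy as np

    def qubo_min(Q):
        """Exhaustively minimize x^T Q x over x in {0, 1}^n."""
        n = Q.shape[0]
        best_val, best_x = np.inf, None
        for bits in itertools.product((0, 1), repeat=n):
            x = np.array(bits)
            val = float(x @ Q @ x)
            if val < best_val:
                best_val, best_x = val, x
        return best_val, best_x

    Q = np.array([[ 1, -2,  0],
                  [ 0,  1, -2],
                  [ 0,  0,  3]])
    print(qubo_min(Q))   # minimum value and one minimizing assignment
    ```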

  16. Program for Analyzing Flows in a Complex Network

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
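    The numerical strategy mentioned at the end, Newton-Raphson combined with successive substitution, can be illustrated on an invented two-variable toy system (this is not GFSSP's equation set, just the pattern of nesting the two methods).

    ```python
    def solve_hybrid(tol=1e-10, max_iter=100):
        """Invented toy: an inner Newton-Raphson solve for p with t frozen,
        wrapped in an outer successive-substitution update of t, iterated
        until both settle. Illustrates the combination only."""
        p, t = 1.0, 1.0
        for _ in range(max_iter):
            for _ in range(20):              # Newton-Raphson on f(p) = p^2 - t
                f, df = p * p - t, 2.0 * p
                p -= f / df
            t_new = 1.0 + 0.1 * p            # successive substitution t <- g(p)
            if abs(t_new - t) < tol:
                return p, t_new
            t = t_new
        raise RuntimeError("did not converge")

    print(solve_hybrid())   # fixed point with p = sqrt(t) and t = 1 + 0.1 * p
    ```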

  17. Complex Problem Solving--More than Reasoning?

    ERIC Educational Resources Information Center

    Wustenberg, Sascha; Greiff, Samuel; Funke, Joachim

    2012-01-01

This study investigates the internal structure and construct validity of Complex Problem Solving (CPS), which is measured by a "Multiple-Item-Approach." It is tested whether (a) three facets of CPS--"rule identification" (adequateness of strategies), "rule knowledge" (generated knowledge) and "rule application"…

  18. Refined scale-dependent permutation entropy to analyze systems complexity

    NASA Astrophysics Data System (ADS)

    Wu, Shuen-De; Wu, Chiu-Wen; Humeau-Heurtier, Anne

    2016-05-01

Multiscale entropy (MSE) has become a prevailing method to quantify the complexity of systems. Unfortunately, MSE has a time complexity of O(N^2), which is unrealistic for long time series. Moreover, MSE relies on the sample entropy computation, which is length-dependent and leads to large variance and possibly undefined entropy values for short time series. Here, we propose and introduce a new multiscale complexity measure, the refined scale-dependent permutation entropy (RSDPE). Through the processing of different kinds of synthetic data and real signals, we show that RSDPE behaves similarly to MSE. Furthermore, RSDPE has a time complexity of O(N). Finally, RSDPE has the advantage of being much less length-dependent than MSE. From all this, we conclude that RSDPE outperforms MSE in terms of computational cost and computational accuracy.
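    For reference, plain permutation entropy, the building block that RSDPE refines, fits in a few lines of Python (this is the standard ordinal-pattern entropy, not the refined scale-dependent variant).

    ```python
    import math
    import random

    def permutation_entropy(series, order=3, delay=1):
        """Permutation entropy: the Shannon entropy of the distribution of
        ordinal patterns of length `order`, normalized to [0, 1] by
        log(order!)."""
        counts = {}
        n_patterns = len(series) - (order - 1) * delay
        for i in range(n_patterns):
            window = series[i : i + order * delay : delay]
            pattern = tuple(sorted(range(order), key=window.__getitem__))
            counts[pattern] = counts.get(pattern, 0) + 1
        h = -sum(c / n_patterns * math.log(c / n_patterns)
                 for c in counts.values())
        return h / math.log(math.factorial(order))

    random.seed(1)
    print(permutation_entropy([random.random() for _ in range(5000)]))  # ~1
    print(permutation_entropy(list(range(5000))))   # 0.0 for a monotone ramp
    ```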

  19. New Approach to Analyzing Physics Problems: A Taxonomy of Introductory Physics Problems

    ERIC Educational Resources Information Center

    Teodorescu, Raluca E.; Bennhold, Cornelius; Feldman, Gerald; Medsker, Larry

    2013-01-01

    This paper describes research on a classification of physics problems in the context of introductory physics courses. This classification, called the Taxonomy of Introductory Physics Problems (TIPP), relates physics problems to the cognitive processes required to solve them. TIPP was created in order to design educational objectives, to develop…

  20. Fractal applications to complex crustal problems

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1989-01-01

Complex scale-invariant problems obey fractal statistics. The basic definition of a fractal distribution is that the number of objects N with a characteristic linear dimension greater than r satisfies the relation N ~ r^(-D), where D is the fractal dimension. Fragmentation often satisfies this relation, as does the distribution of earthquakes. The classic relationship between the length of a rocky coastline and the step length can be derived from this relation. Power-law relations for spectra can also be related to fractal dimensions; topography and gravity are examples. Spectral techniques can be used to obtain maps of fractal dimension and roughness amplitude, which provide a quantitative measure of texture analysis. It is argued that the distribution of stress and strength in a complex crustal region, such as the Alps, is fractal. Based on this assumption, the observed frequency-magnitude relation for the seismicity in the region can be derived.
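    The scaling relation can be estimated directly from data by counting exceedances and fitting in log-log space; a generic Python sketch follows (synthetic Pareto-distributed fragment sizes, not data from the paper).

    ```python
    import numpy as np

    def fractal_dimension(sizes):
        """Estimate D in N(>r) ~ r^(-D): count, for each observed size r,
        how many objects exceed r, then fit a line in log-log space."""
        s = np.sort(np.asarray(sizes))
        r = s[:-1]
        N = len(s) - 1 - np.arange(len(s) - 1)   # number of objects > r
        slope, _ = np.polyfit(np.log(r), np.log(N), 1)
        return -slope

    rng = np.random.default_rng(0)
    fragments = rng.pareto(2.5, 20_000) + 1.0    # true exponent D = 2.5
    print(fractal_dimension(fragments))          # roughly 2.5
    ```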

  1. New approach to analyzing physics problems: A Taxonomy of Introductory Physics Problems

    NASA Astrophysics Data System (ADS)

    Teodorescu, Raluca E.; Bennhold, Cornelius; Feldman, Gerald; Medsker, Larry

    2013-06-01

    This paper describes research on a classification of physics problems in the context of introductory physics courses. This classification, called the Taxonomy of Introductory Physics Problems (TIPP), relates physics problems to the cognitive processes required to solve them. TIPP was created in order to design educational objectives, to develop assessments that can evaluate individual component processes of the physics problem-solving process, and to guide curriculum design in introductory physics courses, specifically within the context of a “thinking-skills” curriculum. Moreover, TIPP enables future physics education researchers to investigate to what extent the cognitive processes presented in various taxonomies of educational objectives are exercised during physics problem solving and what relationship might exist between such processes. We describe the taxonomy, give examples of classifications of physics problems, and discuss the validity and reliability of this tool.

  2. System and method for modeling and analyzing complex scenarios

    DOEpatents

    Shevitz, Daniel Wolf

    2013-04-09

An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of said nodes and tracks the state of at least one variable through a branch. When a variable of said branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.
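    A minimal sketch of the pruning idea in Python; the node layout, the tracked variable, and the tolerance test are all invented for illustration, not taken from the patent.

    ```python
    def solve(node, state, tolerance, solutions, path=()):
        """Depth-first solve: execute each node's programming, track a
        variable through the branch, and disable the rest of a branch once
        the variable leaves tolerance."""
        state = node["update"](state)       # execute the node's programming
        path = path + (node["name"],)
        lo, hi = tolerance
        if not (lo <= state <= hi):
            return                          # branch invalid: prune children
        if not node.get("children"):
            solutions.append((path, state)) # leaf reached within tolerance
            return
        for child in node["children"]:
            solve(child, state, tolerance, solutions, path)

    tree = {"name": "root", "update": lambda v: v,
            "children": [
                {"name": "A", "update": lambda v: v + 5,
                 "children": [{"name": "A1", "update": lambda v: v * 3}]},
                {"name": "B", "update": lambda v: v - 2,
                 "children": [{"name": "B1", "update": lambda v: v + 1}]}]}

    valid = []
    solve(tree, 0, (-10, 10), valid)
    print(valid)   # aggregated valid tree solutions
    ```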

  3. MatOFF: A Tool For Analyzing Behaviorally-Complex Neurophysiological Experiments

    PubMed Central

    Genovesio, Aldo; Mitz, Andrew R.

    2007-01-01

The simple operant conditioning originally used in behavioral neurophysiology 30 years ago has given way to complex and sophisticated behavioral paradigms, so much so that early general-purpose programs for analyzing neurophysiological data are ill-suited for complex experiments. The trend has been to develop custom software for each class of experiment, but custom software can have serious drawbacks. We describe here a general-purpose software tool for behavioral and electrophysiological studies, called MatOFF, that is especially suited for processing neurophysiological data gathered during the execution of complex behaviors. Written in the MATLAB programming language, MatOFF solves the problem of handling complex analysis requirements in a unique and powerful way. While other neurophysiological programs are either a loose collection of tools or append MATLAB as a post-processing step, MatOFF is an integrated environment that supports MATLAB scripting within the event search engine, safely isolated in a programming sandbox. The results from scripting are stored separately, but in parallel with the raw data, and are thus available to all subsequent MatOFF analysis and display processing. An example from a recently published experiment shows how all the features of MatOFF work together to analyze complex experiments and mine neurophysiological data in efficient ways. PMID:17604115

  4. Network-Thinking: Graphs to Analyze Microbial Complexity and Evolution

    PubMed Central

    Corel, Eduardo; Lopez, Philippe; Méheust, Raphaël; Bapteste, Eric

    2016-01-01

The tree model and tree-based methods have played a major, fruitful role in evolutionary studies. However, with the increasing realization of the quantitative and qualitative importance of reticulate evolutionary processes, affecting all levels of biological organization, complementary network-based models and methods are now flourishing, inviting evolutionary biology to experience a network-thinking era. We show how relative newcomers in this field of study, that is, sequence-similarity networks, genome networks, and gene families–genomes bipartite graphs, already allow for a significantly enhanced usage of molecular datasets in comparative studies. Analyses of these networks provide tools for tackling a multitude of complex phenomena, including the evolution of gene transfer, composite genes and genomes, evolutionary transitions, and holobionts. PMID:26774999

  5. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should be always conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  6. Analyzing complex gaze behavior in the natural world

    NASA Astrophysics Data System (ADS)

    Pelz, Jeff B.; Kinsman, Thomas B.; Evans, Karen M.

    2011-03-01

    The history of eye-movement research extends back at least to 1794, when Erasmus Darwin (Charles' grandfather) published Zoonomia, including descriptions of eye movements due to self-motion. But research on eye movements was restricted to the laboratory for 200 years, until Michael Land built the first wearable eyetracker at the University of Sussex and published the seminal paper "Where we look when we steer" [1]. In the intervening centuries, we learned a tremendous amount about the mechanics of the oculomotor system and how it responds to isolated stimuli, but virtually nothing about how we actually use our eyes to explore, gather information, navigate, and communicate in the real world. Inspired by Land's work, we have been working to extend knowledge in these areas by developing hardware, algorithms, and software that have allowed researchers to ask questions about how we actually use vision in the real world. Central to that effort are new methods for analyzing the volumes of data that come from the experiments made possible by the new systems. We describe a number of recent experiments and SemantiCode, a new program that supports assisted coding of eye-movement data collected in unrestricted environments.

  7. Hybrid techniques for complex aerospace electromagnetics problems

    NASA Technical Reports Server (NTRS)

    Aberle, Jim

    1993-01-01

    Important aerospace electromagnetics problems include the evaluation of antenna performance on aircraft and the prediction and control of the aircraft's electromagnetic signature. Due to the ever increasing complexity and expense of aircraft design, aerospace engineers have become increasingly dependent on computer solutions. Traditionally, computational electromagnetics (CEM) has relied primarily on four disparate techniques: the method of moments (MoM), the finite-difference time-domain (FDTD) technique, the finite element method (FEM), and high frequency asymptotic techniques (HFAT) such as ray tracing. Each of these techniques has distinct advantages and disadvantages, and no single technique is capable of accurately solving all problems of interest on computers that are available now or will be available in the foreseeable future. As a result, new approaches that overcome the deficiencies of traditional techniques are beginning to attract a great deal of interest in the CEM community. Among these new approaches are hybrid methods which combine two or more of these techniques into a coherent model. During the ASEE Summer Faculty Fellowship Program a hybrid FEM/MoM computer code was developed and applied to a geometry containing features found on many modern aircraft.

  8. The Guarding Problem - Complexity and Approximation

    NASA Astrophysics Data System (ADS)

    Reddy, T. V. Thirumala; Krishna, D. Sai; Rangan, C. Pandu

Let G = (V, E) be the given graph and G_R = (V_R, E_R) and G_C = (V_C, E_C) be subgraphs of G such that V_R ∩ V_C = ∅ and V_R ∪ V_C = V. G_C is referred to as the cops' region and G_R as the robber's region. Initially a robber is placed at some vertex of V_R and the cops are placed at some vertices of V_C. The robber and cops may move from their current vertices to one of their neighbours. While a cop can move only within the cops' region, the robber may move to any neighbour. The robber and cops move alternately. A vertex v ∈ V_C is said to be attacked if the current turn is the robber's, the robber is at vertex u where u ∈ V_R, (u, v) ∈ E, and no cop is present at v. The guarding problem is to find the minimum number of cops required to guard the graph G_C from the robber's attack. We first prove that the decision version of this problem when G_R is an arbitrary undirected graph is PSPACE-hard. We also prove that the decision version of the guarding problem when G_R is a wheel graph is NP-hard. We then present approximation algorithms for the cases where G_R is a star graph, a clique, and a wheel graph, with approximation ratios H(n_1), 2H(n_1), and H(n_1) + 3/2 respectively, where H(n_1) = 1 + 1/2 + … + 1/n_1 and n_1 = |V_R|.

  9. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    NASA Technical Reports Server (NTRS)

    Jackson, C. E., Jr.

    1977-01-01

A sample problem library containing 20 problems covering most facets of NASTRAN Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  10. Decomposing a complex design problem using CLIPS

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1990-01-01

    Many engineering systems are large and multidisciplinary. Before the design of such complex systems can begin, much time and money are invested in determining the possible couplings among the participating subsystems and their parts. For designs based on existing concepts, like commercial aircraft design, the subsystems and their couplings are usually well-established. However, for designs based on novel concepts, like large space platforms, the determination of the subsystems, couplings, and participating disciplines is an important task. Moreover, this task must be repeated as new information becomes available or as the design specifications change. Determining the subsystems is not an easy, straightforward process and often important couplings are overlooked. The design manager must know how to divide the design work among the design teams so that changes in one subsystem will have predictable effects on other subsystems. The resulting subsystems must be ordered into a hierarchical structure before the planning documents and milestones of the design project are set. The success of a design project often depends on the wise choice of design variables, constraints, objective functions, and the partitioning of these among the design teams. Very few tools are available to aid the design manager in determining the hierarchical structure of a design problem and assist in making these decisions.

  11. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground-state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  12. Team-Based Complex Problem Solving: A Collective Cognition Perspective

    ERIC Educational Resources Information Center

    Hung, Woei

    2013-01-01

    Today, much problem solving is performed by teams, rather than individuals. The complexity of these problems has exceeded the cognitive capacity of any individual and requires a team of members to solve them. The success of solving these complex problems not only relies on individual team members who possess different but complementary expertise,…

  13. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    ERIC Educational Resources Information Center

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  14. Solving Nonlinear Optimization Problems of Real Functions in Complex Variables by Complex-Valued Iterative Methods.

    PubMed

    Zhang, Songchuan; Xia, Youshen

    2016-12-28

Much research has been devoted to complex-variable optimization problems due to their engineering applications. However, complex-valued optimization methods for solving complex-variable optimization problems remain an active research area. This paper proposes two efficient complex-valued optimization methods for solving constrained nonlinear optimization problems of real functions in complex variables. One solves the complex-valued nonlinear programming problem with linear equality constraints; the other solves the complex-valued nonlinear programming problem with both linear equality constraints and an ℓ₁-norm constraint. Theoretically, we prove the global convergence of the two proposed complex-valued optimization algorithms under mild conditions. The proposed algorithms can solve the complex-valued optimization problem completely in the complex domain and significantly extend existing complex-valued optimization algorithms. Numerical results further show that the proposed algorithms are faster than several conventional real-valued optimization algorithms.
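    The core idea of optimizing a real function of complex variables entirely in the complex domain can be sketched with Wirtinger-gradient descent on f(z) = ||Az - b||²; this shows only the unconstrained mechanics, not the paper's constrained algorithms, and the matrices are random illustrative data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(8, 4)) + 1j * rng.normal(size=(8, 4))
    b = rng.normal(size=8) + 1j * rng.normal(size=8)

    # Minimize the real function f(z) = ||Az - b||^2 of a complex vector z
    # in the complex domain, via the Wirtinger gradient df/d(conj z) = A^H (Az - b).
    z = np.zeros(4, dtype=complex)
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # safe step size
    for _ in range(2000):
        z -= step * (A.conj().T @ (A @ z - b))

    print(np.linalg.norm(A @ z - b))                               # GD residual
    print(np.linalg.norm(A @ np.linalg.lstsq(A, b, rcond=None)[0] - b))  # exact
    ```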

  15. Managing Complex Problems in Rangeland Ecosystems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Management of rangelands, and natural resources in general, has become increasingly complex. There is an atmosphere of increasing expectations for conservation efforts associated with a variety of issues from water quality to endangered species. We argue that many current issues are complex by their...

  16. Complex partial status epilepticus: a recurrent problem.

    PubMed Central

    Cockerell, O C; Walker, M C; Sander, J W; Shorvon, S D

    1994-01-01

    Twenty patients with complex partial status epilepticus were identified retrospectively from a specialist neurology hospital. Seventeen patients experienced recurrent episodes of complex partial status epilepticus, often occurring at regular intervals, usually over many years, and while being treated with effective anti-epileptic drugs. No unifying cause for the recurrences, and no common epilepsy aetiologies, were identified. In spite of the frequency of recurrence and length of history, none of the patients showed any marked evidence of cognitive or neurological deterioration. Complex partial status epilepticus is more common than is generally recognised, should be differentiated from other forms of non-convulsive status, and is often difficult to treat. PMID:8021671

  17. Solving the Inverse-Square Problem with Complex Variables

    ERIC Educational Resources Information Center

    Gauthier, N.

    2005-01-01

    The equation of motion for a mass that moves under the influence of a central, inverse-square force is formulated and solved as a problem in complex variables. To find the solution, the constancy of angular momentum is first established using complex variables. Next, the complex position coordinate and complex velocity of the particle are assumed…
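    As a sketch of the complex-variable setup the abstract outlines (our reconstruction, assuming the position is written as z = r e^{iθ}): the force law becomes m z̈ = -kz/|z|³, and the constancy of angular momentum follows in one line before the orbit is solved.

    ```latex
    % Our reconstruction, with z = r e^{i\theta} the particle's position:
    \begin{align}
      m\ddot z &= -\frac{k\,z}{|z|^{3}}
         \qquad\text{(central inverse-square force)}\\
      \frac{d}{dt}\,\mathrm{Im}(\bar z\,\dot z)
         &= \mathrm{Im}\big(|\dot z|^{2} + \bar z\,\ddot z\big)
          = \mathrm{Im}\Big({-\frac{k\,|z|^{2}}{m\,|z|^{3}}}\Big) = 0,\\
      \mathrm{Im}(\bar z\,\dot z) &= r^{2}\dot\theta = \text{const.}
    \end{align}
    ```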

  18. Organizational Structure and Complex Problem Solving

    ERIC Educational Resources Information Center

    Becker, Selwyn W.; Baloff, Nicholas

    1969-01-01

    The problem-solving efficiency of different organization structures is discussed in relation to task requirements and the appropriate organizational behavior, to group adaptation to a task over time, and to various group characteristics. (LN)

  19. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    NASA Astrophysics Data System (ADS)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influences the risk of the failure of a single institution triggering a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposure) in the derivatives market between 202 financial intermediaries based in the USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital), but it is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.
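    A stylized default-cascade simulation of the kind used in this literature (a Furfine-style sketch of ours, not the paper's empirical model; the toy exposure matrix and capital levels are invented).

    ```python
    import numpy as np

    def cascade(exposure, capital, shock):
        """exposure[i, j] is what bank j owes bank i; a defaulted bank
        repays nothing, and any bank whose credit losses exceed its
        capital defaults in the next round."""
        n = len(capital)
        defaulted = np.zeros(n, dtype=bool)
        defaulted[shock] = True
        while True:
            losses = exposure[:, defaulted].sum(axis=1)
            newly = (~defaulted) & (losses > capital)
            if not newly.any():
                return defaulted
            defaulted |= newly

    # 4-bank toy network: bank 0 is a core node everyone lends to.
    exposure = np.array([[0, 2, 2, 2],
                         [8, 0, 0, 0],
                         [8, 0, 0, 0],
                         [1, 0, 0, 0]], dtype=float)
    capital = np.array([5.0, 5.0, 5.0, 5.0])
    print(cascade(exposure, capital, shock=0))  # core default topples banks 1, 2
    ```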

  20. Analyzing the Responses of 7-8 Year Olds When Solving Partitioning Problems

    ERIC Educational Resources Information Center

    Badillo, Edelmira; Font, Vicenç; Edo, Mequè

    2015-01-01

    We analyze the mathematical solutions of 7- to 8-year-old pupils while individually solving an arithmetic problem. The analysis was based on the "configuration of objects," an instrument derived from the onto-semiotic approach to mathematical knowledge. Results are illustrated through a number of cases. From the analysis of mathematical…

  1. Analyzing the Solution of Word Problems in Mathematics: An Exploratory Study.

    ERIC Educational Resources Information Center

    Kilpatrick, Jeremy

This study attempted to develop a system for analyzing the processes students use in solving word problems and to investigate the relationships of these processes to other behavioral measures. The subjects in this study were 56 students of both sexes who had above average mental ability and who had just completed the eighth grade from two junior…

  2. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…
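    A sketch of the modeling step with synthetic data; the log-file features named here (hints requested, incorrect entries, time on task) are hypothetical stand-ins, not the Andes feature set.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # Hypothetical per-problem features extracted from tutor log files.
    X = np.column_stack([rng.poisson(3, 200),       # hints requested
                         rng.poisson(5, 200),       # incorrect entries
                         rng.exponential(10, 200)]) # minutes on task
    # Synthetic label: success gets less likely with more hints and errors.
    p = 1 / (1 + np.exp(-(2.0 - 0.4 * X[:, 0] - 0.2 * X[:, 1])))
    y = rng.random(200) < p

    model = LogisticRegression().fit(X, y)
    print(model.coef_, model.intercept_)
    print(model.score(X, y))   # training accuracy of the success predictor
    ```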

  3. Multigrid Methods for Aerodynamic Problems in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Caughey, David A.

    1995-01-01

    Work has been directed at the development of efficient multigrid methods for the solution of aerodynamic problems involving complex geometries, including the development of computational methods for the solution of both inviscid and viscous transonic flow problems. The emphasis is on problems of complex, three-dimensional geometry. The methods developed are based upon finite-volume approximations to both the Euler and the Reynolds-Averaged Navier-Stokes equations. The methods are developed for use on multi-block grids using diagonalized implicit multigrid methods to achieve computational efficiency. The work is focused upon aerodynamic problems involving complex geometries, including advanced engine inlets.
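    For a concrete reference point, here is a textbook V-cycle for the 1-D Poisson model problem; the report's multi-block, diagonalized implicit multigrid for the Euler and Reynolds-Averaged Navier-Stokes equations is far more involved, and nothing below is the report's code.

    ```python
    import numpy as np

    def v_cycle(u, f, h, n_smooth=3):
        """One V-cycle of geometric multigrid for -u'' = f on [0, 1] with
        zero Dirichlet boundaries: weighted-Jacobi smoothing, full-weighting
        restriction, linear-interpolation prolongation."""
        def smooth(u, f, h, sweeps, omega=2.0 / 3.0):
            for _ in range(sweeps):
                v = u.copy()
                v[1:-1] = ((1 - omega) * u[1:-1]
                           + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]))
                u = v
            return u

        if len(u) == 3:                      # coarsest grid: solve exactly
            u[1] = 0.5 * h * h * f[1]
            return u
        u = smooth(u, f, h, n_smooth)
        r = np.zeros_like(u)                 # residual r = f - A u
        r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
        rc = np.zeros((len(u) - 1) // 2 + 1)
        rc[1:-1] = 0.25 * (r[1:-3:2] + 2 * r[2:-1:2] + r[3::2])  # restrict
        ec = v_cycle(np.zeros_like(rc), rc, 2 * h, n_smooth)     # coarse error
        e = np.zeros_like(u)
        e[::2] = ec                          # prolongate the correction
        e[1::2] = 0.5 * (ec[:-1] + ec[1:])
        return smooth(u + e, f, h, n_smooth)

    n = 2**7 + 1                             # grid points including boundaries
    x = np.linspace(0.0, 1.0, n)
    f = np.pi**2 * np.sin(np.pi * x)         # exact solution u = sin(pi x)
    u = np.zeros(n)
    for _ in range(10):
        u = v_cycle(u, f, 1.0 / (n - 1))
    print(np.abs(u - np.sin(np.pi * x)).max())   # ~ O(h^2) discretization error
    ```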

  4. Analyzing HIV/AIDS and Alcohol and Other Drug Use as a Social Problem

    PubMed Central

    PATTERSON, DAVID A.; Wolf (Adelv unegv Waya), Silver

    2012-01-01

Most prevention and intervention activities directed toward HIV/AIDS and alcohol and other drug use separately, as well as toward the combination of the two (e.g., those who have HIV/AIDS and use alcohol and other drugs), come in the form of specific, individualized therapies without consideration of social influences that may have a greater impact on this population. Approaching this social problem from the narrowed view of individualized, micro solutions disregards the larger social conditions that affect or perhaps even are at the root of the problem. This paper analyzes the social problem of HIV/AIDS and alcohol and other drug abuse using three sociological perspectives—social construction theory, ethnomethodology, and conflict theory—informing the reader of the broader influences accompanying this problem. PMID:23264724

  5. Complex Mathematical Problem Solving by Individuals and Dyads.

    ERIC Educational Resources Information Center

    Vye, Nancy J.; Goldman, Susan R.; Voss, James F.; Hmelo, Cindy; Williams, Susan; Cognition and Technology Group at Vanderbilt University

    1997-01-01

    Describes two studies of mathematical problem solving using an episode from "The Adventures of Jasper Woodbury," a set of curriculum materials that afford complex problem-solving opportunities. Discussion focuses on characteristics of problems that make solutions difficult, kinds of reasoning that dyadic interactions support, and…

  6. Preparing for Complexity and Wicked Problems through Transformational Learning Approaches

    ERIC Educational Resources Information Center

    Yukawa, Joyce

    2015-01-01

    As the information environment becomes increasingly complex and challenging, Library and Information Studies (LIS) education is called upon to nurture innovative leaders capable of managing complex situations and "wicked problems." While disciplinary expertise remains essential, higher levels of mental complexity and adaptive…

  7. A New Approach to Analyzing the Cognitive Load in Physics Problems

    NASA Astrophysics Data System (ADS)

    Teodorescu, Raluca

    2010-02-01

I will present a Taxonomy of Introductory Physics Problems (TIPP), which relates physics problems to the cognitive processes and the knowledge required to solve them. TIPP was created for designing and clarifying educational objectives, for developing assessments to evaluate components of the problem-solving process, and for guiding curriculum design in introductory physics courses. To construct TIPP, I considered processes that have been identified either by cognitive science and expert-novice research or by direct observation of students' behavior while solving physics problems. Based on Marzano and Kendall's taxonomy [1], I developed a procedure to classify physics problems according to the cognitive processes that they involve and the knowledge to which they refer. The procedure is applicable to any physics problem, and its validity and reliability have been confirmed. This algorithm was then used to build TIPP, which is a database that contains text-based and research-based physics problems and explains their relationship to cognitive processes and knowledge. TIPP has been used in the years 2006-2009 to reform the first semester of the introductory algebra-based physics course at The George Washington University. The reform targeted students' cognitive development and attitudes improvement. The methodology employed in the course involves exposing students to certain types of problems in a variety of contexts with increasing complexity. To assess the effectiveness of our approach, rubrics were created to evaluate students' problem-solving abilities, and the Colorado Learning Attitudes about Science Survey (CLASS) was administered pre- and post-instruction to determine students' shift in dispositions towards learning physics. Our results show definitive gains in the areas targeted by our curricular reform. [1] R.J. Marzano and J.S. Kendall, The New Taxonomy of Educational Objectives, 2nd Ed. (Corwin Press, Thousand Oaks, 2007).

  8. Completed Beltrami-Michell formulation for analyzing mixed boundary value problems in elasticity

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Kaljevic, Igor; Hopkins, Dale A.; Saigal, Sunil

    1995-01-01

In elasticity, the method of forces, wherein stress parameters are considered as the primary unknowns, is known as the Beltrami-Michell formulation (BMF). The existing BMF can only solve stress boundary value problems; it cannot handle the more prevalent displacement and mixed boundary value problems of elasticity. Therefore, this formulation, which has restricted application, could not become a true alternative to Navier's displacement method, which can solve all three types of boundary value problems. The restrictions in the BMF have been alleviated by augmenting the classical formulation with a novel set of conditions identified as the boundary compatibility conditions. This new method, which completes the classical force formulation, has been termed the completed Beltrami-Michell formulation (CBMF). The CBMF can solve general elasticity problems with stress, displacement, and mixed boundary conditions in terms of stresses as the primary unknowns. The CBMF is derived from the stationary condition of the variational functional of the integrated force method. In the CBMF, stresses for kinematically stable structures can be obtained without any reference to the displacements either in the field or on the boundary. This paper presents the CBMF and its derivation from the variational functional of the integrated force method. Several examples are presented to demonstrate the applicability of the completed formulation for analyzing mixed boundary value problems under thermomechanical loads. Selected example problems include a cylindrical shell wherein membrane and bending responses are coupled, and a composite circular plate.

  9. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 1. [thermal analyzer manual

    NASA Technical Reports Server (NTRS)

    Lee, H. P.

    1977-01-01

    The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.

  10. The complex problem of sensitive skin.

    PubMed

    Marriott, Marie; Holmes, Jo; Peters, Lisa; Cooper, Karen; Rowson, Matthew; Basketter, David A

    2005-08-01

There exist within the population subsets of individuals who display heightened skin reactivity to materials the majority find tolerable. In a series of investigations, we have examined interrelationships between many of the endpoints associated with the term 'sensitive skin'. In the most recent work, 58 volunteers were treated with 10% lactic acid, 50% ethanol, 0.5% menthol and 1.0% capsaicin on the nasolabial fold, unoccluded, with sensory reactions recorded at 2.5 min, 5 min and 8 min after application. Urticant susceptibility was evaluated with 1 M benzoic acid and 125 mM trans-cinnamic acid applied to the volar forearm for 20 min. A 2 x 23-h patch test was also conducted using 0.1% and 0.3% sodium dodecyl sulfate, 0.3% and 0.6% cocamidopropyl betaine and 0.1% and 0.2% benzalkonium chloride to determine irritant susceptibility. As found in previous studies, increased susceptibility to one endpoint was not predictive of sensitivity to another. In our experience, nasolabial stinging was a poor predictor of general skin sensitivity. Nevertheless, it may be possible to identify in the normal population individuals who, coincidentally, are more generally sensitive to a range of non-immunologic adverse skin reactions. Whether such individuals are those who experience problems with skin care products remains to be addressed.

  11. Modeling Complex Chemical Systems: Problems and Solutions

    NASA Astrophysics Data System (ADS)

    van Dijk, Jan

    2016-09-01

Non-equilibrium plasmas in complex gas mixtures are at the heart of numerous contemporary technologies. They typically contain dozens to hundreds of species, involved in hundreds to thousands of reactions. Chemists and physicists have always been interested in what are now called chemical reduction techniques (CRT's). The idea of such CRT's is that they reduce the number of species that need to be considered explicitly without compromising the validity of the model. This is usually achieved on the basis of an analysis of the reaction time scales of the system under study, which identifies species that are in partial equilibrium after a given time span. The first such CRT that has been widely used in plasma physics was developed in the 1960's and resulted in the concept of effective ionization and recombination rates. It was later generalized to systems in which multiple levels are affected by transport. In recent years there has been a renewed interest in tools for chemical reduction and reaction pathway analysis. An example of the latter is the PumpKin tool. Another trend is that techniques that have previously been developed in other fields of science are adapted so as to be able to handle the plasma state of matter. Examples are the Intrinsic Low-Dimensional Manifold (ILDM) method and its derivatives, which originate from combustion engineering, and the general-purpose Principal Component Analysis (PCA) technique. In this contribution we will provide an overview of the most common reduction techniques, then critically assess the pros and cons of the methods that have gained most popularity in recent years. Examples will be provided for plasmas in argon and carbon dioxide.

  12. Operational Reconnaissance: Identifying the Right Problems in a Complex World

    DTIC Science & Technology

    2015-05-23

Operational Reconnaissance: Identifying the Right Problems in a Complex World. A Monograph by MAJ Donald Erickson, United States… proposes a model for the development of an operational reconnaissance force and explores its development and conceptual usage in World War II and the

  13. Analyzing Chromatin Remodeling Complexes Using Shotgun Proteomics and Normalized Spectral Abundance Factors

    PubMed Central

    Florens, Laurence; Carozza, Michael J.; Swanson, Selene K; Fournier, Marjorie; Coleman, Michael K.; Workman, Jerry L.; Washburn, Michael P.

    2006-01-01

Mass spectrometry-based approaches are commonly used to identify proteins from multiprotein complexes, typically with the goal of identifying new complex members or identifying post-translational modifications. However, with the recent demonstration that spectral counting is a powerful quantitative proteomic approach, the analysis of multiprotein complexes by mass spectrometry can be reconsidered in certain cases. Using the chromatography-based approach named multidimensional protein identification technology, multiprotein complexes may be analyzed quantitatively using the normalized spectral abundance factor, which allows comparison of multiple independent analyses of samples. This study describes an approach to visualize multiprotein complex datasets that provides structure-function information superior to tabular lists of data. In this method review, we describe a reanalysis of the Rpd3/Sin3 small and large histone deacetylase complexes, previously described in tabular form, to demonstrate the normalized spectral abundance factor approach. PMID:17101441
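    The normalized spectral abundance factor itself is a simple published formula: a protein's spectral count divided by its length, normalized so all values in a run sum to 1, i.e. NSAF_i = (SpC_i/L_i) / Σ_j (SpC_j/L_j). A minimal Python sketch (the example spectral counts are made up):

    ```python
    def nsaf(spectral_counts, lengths):
        """Normalized spectral abundance factor: length-normalized spectral
        counts, rescaled to sum to 1 across all proteins in the run."""
        saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
        total = sum(saf.values())
        return {p: v / total for p, v in saf.items()}

    counts = {"Rpd3": 120, "Sin3": 300, "Ume1": 45}      # hypothetical counts
    lengths = {"Rpd3": 433, "Sin3": 1536, "Ume1": 460}   # residues per protein
    print(nsaf(counts, lengths))
    ```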

  14. A Formal Approach to Analyzing Interference Problems in Aspect-Oriented Designs

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Ye, Nan; Ding, Wenxu

Interference problems in aspect-oriented designs refer to undesired interference between aspects and base programs that can lead to the emergence of unexpected behaviors, which harm the correctness of the entire system. We present a rigorous approach to analyzing interference problems in aspect-oriented designs. Formal representations of classes and aspects are defined in terms of designs in UTP, while the weaving techniques in AOP are interpreted as compositions of the corresponding formal models. Conflicts between an aspect and base programs, as well as between two aspects, can be detected by calculating the weakest preconditions. Furthermore, the calculation also provides informative guidelines on how to solve the conflicts it finds. Detecting and removing conflicts early in aspect-oriented design models can improve their quality and save considerable cost.

  15. Does problem complexity matter for environmental policy delivery? How public authorities address problems of water governance.

    PubMed

    Kirschke, Sabrina; Newig, Jens; Völker, Jeanette; Borchardt, Dietrich

    2017-03-08

    Problem complexity is often assumed to hamper effective environmental policy delivery. However, this claim is hardly substantiated, given the dominance of qualitative small-n designs in environmental governance research. We studied 37 types of contemporary problems defined by German water governance to assess the impact of problem complexity on policy delivery through public authorities. The analysis is based on a unique data set related to these problems, encompassing both in-depth interview-based data on complexities and independent official data on policy delivery. Our findings show that complexity in fact tends to delay implementation at the stage of planning. However, different dimensions of complexity (goals, variables, dynamics, interconnections, and uncertainty) impact on the different stages of policy delivery (goal formulation, stages and degrees of implementation) in various ways.

  16. Dyspareunia: a complex problem requiring a selective approach.

    PubMed

    Walid, Mohammad Sami; Heaton, Richard L

    2009-09-01

    Dyspareunia frequently has a multifactorial aetiology. The term itself is not specific enough to allow proper discussion of pain with sexual intercourse, a problem that can be very disruptive to a couple's relationship. We present two cases of patients who had multiple potential anatomic reasons for dyspareunia. The clinical picture, treatment strategy, and complex nature of deep penetration pain are discussed. We also propose a new way of defining dyspareunia to allow more adequate study and discussion of the problem.

  17. From problem solving to problem definition: scrutinizing the complex nature of clinical practice.

    PubMed

    Cristancho, Sayra; Lingard, Lorelei; Regehr, Glenn

    2017-02-01

    In medical education, we have tended to present problems as being singular, stable, and solvable. Problem solving has, therefore, drawn much of medical education researchers' attention. This focus has been important but it is limited in terms of preparing clinicians to deal with the complexity of the 21st century healthcare system in which they will provide team-based care for patients with complex medical illness. In this paper, we use the Soft Systems Engineering principles to introduce the idea that in complex, team-based situations, problems usually involve divergent views and evolve with multiple solution iterations. As such we need to shift the conversation from (1) problem solving to problem definition, and (2) from a problem definition derived exclusively at the level of the individual to a definition derived at the level of the situation in which the problem is manifested. Embracing such a focus on problem definition will enable us to advocate for novel educational practices that will equip trainees to effectively manage the problems they will encounter in complex, team-based healthcare.

  18. Semantic Annotation of Complex Text Structures in Problem Reports

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Throop, David R.; Fleming, Land D.

    2011-01-01

    Text analysis is important for effective information retrieval from databases where the critical information is embedded in text fields. Aerospace safety depends on effective retrieval of relevant and related problem reports for the purpose of trend analysis. The complex text syntax in problem descriptions has limited statistical text mining of problem reports. The presentation describes an intelligent tagging approach that applies syntactic and then semantic analysis to overcome this problem. The tags identify types of problems and equipment that are embedded in the text descriptions. The power of these tags is illustrated in a faceted searching and browsing interface for problem report trending that combines automatically generated tags with database code fields and temporal information.
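
    As a toy illustration of the tagging idea, the sketch below uses keyword patterns to stand in for the syntactic-then-semantic pipeline; the tag vocabulary and patterns are invented for illustration and do not reflect the actual NASA tagging system:

    ```python
    import re

    # Invented tag vocabulary: each tag fires when its pattern matches.
    TAGS = {
        "PROBLEM/LEAK":     re.compile(r"\b(leak(age|ing)?|seep(age|ing)?)\b", re.I),
        "PROBLEM/CRACK":    re.compile(r"\b(crack(ed|ing)?|fracture[sd]?)\b", re.I),
        "EQUIPMENT/VALVE":  re.compile(r"\b(valve|regulator)s?\b", re.I),
        "EQUIPMENT/SENSOR": re.compile(r"\b(sensor|transducer)s?\b", re.I),
    }

    def tag_report(text):
        """Return the sorted list of tags whose pattern matches the text."""
        return sorted(tag for tag, pat in TAGS.items() if pat.search(text))

    print(tag_report("Cracked housing found near the fuel valve; no leakage observed."))
    # -> ['EQUIPMENT/VALVE', 'PROBLEM/CRACK', 'PROBLEM/LEAK']
    # Note the false positive on "no leakage observed": purely lexical matching
    # ignores negation, which is exactly why deeper semantic analysis is needed.
    ```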

  19. Particle swarm optimization for complex nonlinear optimization problems

    NASA Astrophysics Data System (ADS)

    Alexandridis, Alex; Famelis, Ioannis Th.; Tsitouras, Charalambos

    2016-06-01

    This work presents the application of a technique belonging to evolutionary computation, namely particle swarm optimization (PSO), to complex nonlinear optimization problems. To be more specific, a PSO optimizer is set up and applied to the derivation of Runge-Kutta pairs for the numerical solution of initial value problems. The effect of critical PSO operational parameters on the performance of the proposed scheme is thoroughly investigated.
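
    A minimal PSO sketch with the standard inertia-weight velocity update (not the paper's tuned setup); the Rosenbrock function stands in for the Runge-Kutta-pair objective, and all parameter values are illustrative defaults:

    ```python
    import numpy as np

    def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bound=5.0):
        """Minimize f over a box using a basic particle swarm."""
        rng = np.random.default_rng(0)
        x = rng.uniform(-bound, bound, (n_particles, dim))   # positions
        v = np.zeros_like(x)                                 # velocities
        pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
        g = pbest[np.argmin(pbest_val)]                      # global best
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            # inertia + cognitive pull toward pbest + social pull toward gbest
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, -bound, bound)
            val = np.apply_along_axis(f, 1, x)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], val[improved]
            g = pbest[np.argmin(pbest_val)]
        return g, f(g)

    rosen = lambda z: float(sum(100 * (z[1:] - z[:-1]**2)**2 + (1 - z[:-1])**2))
    print(pso(rosen, dim=2))  # should approach the minimum at (1, 1)
    ```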

  20. Extended variational theory of complex rays in heterogeneous Helmholtz problem

    NASA Astrophysics Data System (ADS)

    Li, Hao; Ladeveze, Pierre; Riou, Hervé

    2017-02-01

    In recent years, a numerical method called the Variational Theory of Complex Rays (VTCR) has been developed for vibration problems in the medium-frequency range. It is a Trefftz discontinuous Galerkin method that uses plane wave functions as shape functions. However, the method has so far been well developed only for the homogeneous case. In this paper, the VTCR is extended to the heterogeneous Helmholtz problem by creating a new basis of shape functions. Numerical examples illustrate the performance of this extension of the VTCR.

  1. Inverse Spectral Problems for Tridiagonal N by N Complex Hamiltonians

    NASA Astrophysics Data System (ADS)

    Guseinov, Gusein Sh.

    2009-02-01

    In this paper, the concept of a generalized spectral function is introduced for finite-order tridiagonal symmetric matrices (Jacobi matrices) with complex entries. The structure of the generalized spectral function is described in terms of spectral data consisting of the eigenvalues and normalizing numbers of the matrix. The inverse problems of recovering the matrix from the generalized spectral function, as well as from the spectral data, are investigated. In this way, a procedure for constructing complex tridiagonal matrices having real eigenvalues is obtained.
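
    A numerical sketch of the forward side of the setting only: assembling a tridiagonal symmetric (not Hermitian) matrix with complex entries and computing its spectrum. The entries are invented, and nothing here reproduces the paper's inverse procedure:

    ```python
    import numpy as np

    def tridiag(diag, off):
        """Symmetric tridiagonal matrix with the same off-diagonal above and below."""
        return np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

    a = np.array([1 + 1j, 2, 1 - 1j])   # main diagonal (invented)
    b = np.array([1j, -1j])             # off-diagonal (invented)
    J = tridiag(a, b)

    # Complex symmetric, not Hermitian, so eigenvalues are complex in general;
    # use eig rather than eigh. The paper's inverse procedure is what lets one
    # construct such matrices whose eigenvalues come out real.
    eigvals = np.linalg.eigvals(J)
    print(np.round(eigvals, 6))
    ```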

  2. Theory of periodically specified problems: Complexity and approximability

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Rosenkrantz, D.J.

    1997-12-05

    We study the complexity and the efficient approximability of graph and satisfiability problems when specified using various kinds of periodic specifications. The general results obtained include the following: (1) We characterize the complexities of several basic generalized CNF satisfiability problems SAT(S) [Sc78] when instances are specified using various kinds of 1- and 2-dimensional periodic specifications. We outline how this characterization can be used to prove a number of new hardness results for the complexity classes DSPACE(n), NSPACE(n), DEXPTIME, NEXPTIME, EXPSPACE, etc. These results can be used to prove, in a unified way, the hardness of a number of combinatorial problems when instances are specified succinctly using the various succinct specifications considered in the literature. As one corollary, we show that a number of basic NP-hard problems become EXPSPACE-hard when inputs are represented using 1-dimensional infinite periodic wide specifications. This answers a long-standing open question posed by Orlin. (2) We outline a simple yet general technique to devise approximation algorithms with provable worst-case performance guarantees for a number of combinatorial problems specified periodically. Our efficient approximation algorithms and schemes are based on extensions of these ideas and represent the first non-trivial characterization of a class of periodically specified NEXPTIME-hard problems admitting an epsilon-approximation (or PTAS). Two properties of our results are: (i) for the first time, efficient approximation algorithms and schemes have been developed for natural NEXPTIME-complete problems; (ii) our results are the first polynomial-time approximation algorithms with good performance guarantees for hard problems specified using the various kinds of periodic specifications considered in this paper.

  3. EEG activity during the performance of complex mental problems.

    PubMed

    Jausovec, N; Jausovec, K

    2000-04-01

    This study investigated differences in cognitive processes related to problem complexity. It was assumed that these differences would be reflected in respondents' EEG activity--spectral power and coherence. A second aim of the study was to compare differences between the lower (alpha(1) = 7.9-10.0 Hz) and upper (alpha(2) = 10.1-12.9 Hz) alpha bands. In the first experiment, two well-defined problems with two levels of complexity were used. Only minor differences in EEG power and coherence measures related to problem complexity were observed. In the second experiment, divergent production problems resembling tasks on creativity tests were compared with dialectic problems calling for creative solutions. Differences in EEG power measures were mainly related to the form of problem presentation (figural/verbal). In contrast, coherence was related to the level of creativity needed to solve a problem. Noticeably increased intra- and interhemispheric cooperation, mainly between far-distant brain regions, was observed in the EEG activity of respondents while solving the dialectic problems. These results are explained by the more intense involvement of the long cortico-cortical fiber system in creative thinking. Differences between the lower and upper alpha bands were significant for the power and coherence measures. In Experiment 2, fewer differences were observed in power measures in the upper alpha band than in the lower alpha band. A reverse pattern was observed for the coherence measures. These results hint at a functional independence of the two alpha bands; however, they do not allow firm conclusions to be drawn about their functional meanings. The study showed that it is unlikely that individuals solve well- and ill-defined problems by employing similar cognitive strategies.

  4. Investigating the Effect of Complexity Factors in Gas Law Problems

    ERIC Educational Resources Information Center

    Schuttlefield, Jennifer D.; Kirk, John; Pienta, Norbert J.; Tang, Hui

    2012-01-01

    Undergraduate students were asked to complete gas law questions using a Web-based tool as a first step in our understanding of the role of cognitive load in chemistry word questions and in helping us assess student problem-solving. Each question contained five different complexity factors, which were randomly assigned by the tool so that a…

  5. What Do Employers Pay for Employees' Complex Problem Solving Skills?

    ERIC Educational Resources Information Center

    Ederer, Peer; Nedelkoska, Ljubica; Patt, Alexander; Castellazzi, Silvia

    2015-01-01

    We estimate the market value that employers assign to the complex problem solving (CPS) skills of their employees, using individual-level Mincer-style wage regressions. For the purpose of the study, we collected new and unique data using psychometric measures of CPS and an extensive background questionnaire on employees' personal and work history.…
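
    For orientation, a Mincer-style wage regression augmented with a CPS term typically takes a form like the following (a generic textbook specification, not necessarily the paper's exact model), with S_i years of schooling and X_i years of labor-market experience:

    ```latex
    \ln w_i = \beta_0 + \beta_1\,\mathrm{CPS}_i + \beta_2\,S_i
            + \beta_3\,X_i + \beta_4\,X_i^2 + \varepsilon_i
    ```

    The coefficient of interest is then beta_1, the proportional wage premium associated with a one-unit increase in measured CPS skill, conditional on the human-capital controls.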

  6. Olae: A Bayesian Performance Assessment for Complex Problem Solving.

    ERIC Educational Resources Information Center

    VanLehn, Kurt

    Olae is a computer system for assessing student knowledge of physics, and Newtonian mechanics in particular, using performance data collected while students solve complex problems. Although originally designed as a stand-alone system, it has also been used as part of the Andes intelligent tutoring system. Like many other performance assessment…

  7. Application of NASA management approach to solve complex problems on earth

    NASA Technical Reports Server (NTRS)

    Potate, J. S.

    1972-01-01

    The application of the NASA management approach to solving complex problems on earth is discussed. The management of the Apollo program is presented as an example of effective management techniques. Four key elements of effective management are analyzed. Photographs of the Cape Kennedy launch sites and supporting equipment are included to support the discussion.

  8. The Complex Route to Success: Complex Problem-Solving Skills in the Prediction of University Success

    ERIC Educational Resources Information Center

    Stadler, Matthias J.; Becker, Nicolas; Greiff, Samuel; Spinath, Frank M.

    2016-01-01

    Successful completion of a university degree is a complex matter. Based on considerations regarding the demands of acquiring a university degree, the aim of this paper was to investigate the utility of complex problem-solving (CPS) skills in the prediction of objective and subjective university success (SUS). The key finding of this study was that…

  9. Complexity and efficient approximability of two dimensional periodically specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.

    1996-09-01

    The authors consider two-dimensional periodic specifications: a method for succinctly specifying objects with highly regular, repetitive structure. These specifications arise naturally when processing engineering designs, including VLSI designs. They can specify objects whose sizes are exponentially larger than the sizes of the specifications themselves. Consequently, solving a periodically specified problem by explicitly expanding the instance is prohibitively expensive in terms of computational resources. This leads one to investigate the complexity and efficient approximability of solving graph-theoretic and combinatorial problems when instances are specified using two-dimensional periodic specifications. They prove the following results: (1) several classical NP-hard optimization problems become NEXPTIME-hard when instances are specified using two-dimensional periodic specifications; (2) in contrast, several of these NEXPTIME-hard problems have polynomial-time approximation algorithms with guaranteed worst-case performance.

  10. A generalized topological entropy for analyzing the complexity of DNA sequences.

    PubMed

    Jin, Shuilin; Tan, Renjie; Jiang, Qinghua; Xu, Li; Peng, Jiajie; Wang, Yong; Wang, Yadong

    2014-01-01

    Topological entropy is one of the most difficult entropies to apply to DNA sequences, owing to finite-sample and high-dimensionality problems. In order to overcome these problems, a generalized topological entropy is introduced. The relationship between the topological entropy and the generalized topological entropy is examined, showing that the former is a special case of the latter. As an application, the generalized topological entropy was computed for introns, exons, and promoter regions. The results indicate that, for each chromosome, the entropy of introns is higher than that of exons, and the entropy of exons is higher than that of promoter regions, which suggests that the DNA sequence of promoter regions is more regular than that of exons and introns.
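
    A simplified sketch of a subword-complexity ("topological") entropy for a DNA string, H_k = log4(#distinct k-mers)/k; this is a toy version for illustration, and the paper's generalized estimator differs in how k and the sampling window are chosen:

    ```python
    import math

    def topological_entropy(seq, k):
        """log base 4 of the number of distinct k-mers, normalized by k."""
        kmers = {seq[i:i + k] for i in range(len(seq) - k + 1)}
        return math.log(len(kmers), 4) / k

    seq = "ACGTACGTTGCAACGTGGATCC"   # hypothetical fragment
    print(round(topological_entropy(seq, 3), 3))
    # Values near 1 indicate maximal subword diversity; lower values
    # indicate more regular (repetitive) sequence.
    ```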

  11. Analyzing Pre-Service Primary Teachers' Fraction Knowledge Structures through Problem Posing

    ERIC Educational Resources Information Center

    Kilic, Cigdem

    2015-01-01

    In this study it was aimed to determine pre-service primary teachers' knowledge structures of fraction through problem posing activities. A total of 90 pre-service primary teachers participated in this study. A problem posing test consisting of two questions was used and the participants were asked to generate as many as problems based on the…

  12. How Humans Solve Complex Problems: The Case of the Knapsack Problem

    PubMed Central

    Murawski, Carsten; Bossaerts, Peter

    2016-01-01

    Life presents us with problems of varying complexity. Yet, complexity is not accounted for in theories of human decision-making. Here we study instances of the knapsack problem, a discrete optimisation problem commonly encountered at all levels of cognition, from attention gating to intellectual discovery. The complexity of this problem is well understood from the perspective of a mechanical device like a computer. We show experimentally that human performance, too, decreased with complexity as defined in computer science. Defying traditional economic principles, participants spent effort well beyond the point where marginal gain was positive, and economic performance increased with instance difficulty. Human attempts at solving the instances exhibited commonalities with algorithms developed for computers, although biological resource constraints (limited working and episodic memories) had a noticeable impact. Consistent with the very nature of the knapsack problem, only a minority of participants found the solution, often quickly, but the ones who did appeared not to realise it. Substantial heterogeneity emerged, suggesting why prizes and patents, schemes that incentivise intellectual discovery but discourage information sharing, have been found to be less effective than mechanisms that reveal private information, such as markets. PMID:27713516
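
    For reference, the standard dynamic-programming solver for the 0/1 knapsack instances the paper studies; the values, weights, and capacity below are hypothetical:

    ```python
    def knapsack(values, weights, capacity):
        """Maximum total value achievable within the weight capacity."""
        n = len(values)
        best = [[0] * (capacity + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            for c in range(capacity + 1):
                best[i][c] = best[i - 1][c]          # option 1: skip item i-1
                if weights[i - 1] <= c:              # option 2: take item i-1
                    best[i][c] = max(best[i][c],
                                     best[i - 1][c - weights[i - 1]] + values[i - 1])
        return best[n][capacity]

    print(knapsack(values=[10, 7, 12, 4], weights=[5, 3, 6, 2], capacity=9))  # -> 19
    ```

    This runs in O(n * capacity) time, which is pseudo-polynomial; the decision version of the problem is NP-complete, the sense in which "complexity" is meant above.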

  13. Analyzing the Heterogeneity and Complexity of Electronic Health Record Oriented Phenotyping Algorithms

    PubMed Central

    Conway, Mike; Berg, Richard L.; Carrell, David; Denny, Joshua C.; Kho, Abel N.; Kullo, Iftikhar J.; Linneman, James G.; Pacheco, Jennifer A.; Peissig, Peggy; Rasmussen, Luke; Weston, Noah; Chute, Christopher G.; Pathak, Jyotishman

    2011-01-01

    The need for formal representations of eligibility criteria for clinical trials – and for phenotyping more generally – has been recognized for some time. Indeed, the availability of a formal computable representation that adequately reflects the types of data and logic evidenced in trial designs is a prerequisite for the automatic identification of study-eligible patients from Electronic Health Records. As part of the wider process of representation development, this paper reports on an analysis of fourteen Electronic Health Record oriented phenotyping algorithms (developed as part of the eMERGE project) in terms of their constituent data elements, types of logic used and temporal characteristics. We discovered that the majority of eMERGE algorithms analyzed include complex, nested boolean logic and negation, with several dependent on cardinality constraints and complex temporal logic. Insights gained from the study will be used to augment the CDISC Protocol Representation Model. PMID:22195079
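
    A hypothetical illustration of the kind of nested boolean, cardinality, and temporal logic described above; the field names, codes, and thresholds are invented for the sketch and are not taken from any real eMERGE algorithm:

    ```python
    from datetime import date

    def case_type2_diabetes(patient):
        """Toy phenotype: nested boolean logic with cardinality and temporal constraints."""
        has_dx = any(c.startswith("250.") for c in patient["icd9_codes"])
        on_med = bool(set(patient["medications"]) & {"metformin", "insulin"})
        # cardinality constraint: at least 2 abnormal lab values ...
        high_a1c = sum(1 for v in patient["hba1c"] if v >= 6.5) >= 2
        # ... and a temporal constraint: diagnosis must precede first medication
        dx_first = patient["dx_date"] < min(patient["med_dates"], default=date.max)
        return (has_dx and on_med and dx_first) or (high_a1c and on_med)

    patient = {
        "icd9_codes": ["250.00"], "medications": ["metformin"],
        "hba1c": [6.9, 7.2], "dx_date": date(2009, 3, 1),
        "med_dates": [date(2009, 5, 20)],
    }
    print(case_type2_diabetes(patient))  # -> True
    ```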

  14. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.
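
    As a small illustration of the graph-theoretic end of this spectrum, the sketch below computes a few classic topology measures on an invented road-like network (the gamma connectivity index is a standard transport-geography measure, edges divided by the planar maximum 3(n-2); the network itself is hypothetical):

    ```python
    import networkx as nx

    # Invented 4-node network with 5 links
    G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("B", "D")])

    print("degree:", dict(G.degree()))
    print("clustering:", nx.average_clustering(G))
    print("avg shortest path:", nx.average_shortest_path_length(G))

    n, m = G.number_of_nodes(), G.number_of_edges()
    print("gamma index:", m / (3 * (n - 2)))   # 1.0 = maximally connected planar graph
    ```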

  15. Analyzing networks of phenotypes in complex diseases: methodology and applications in COPD

    PubMed Central

    2014-01-01

    Background The investigation of complex disease heterogeneity has been challenging. Here, we introduce a network-based approach, using partial correlations, that analyzes the relationships among multiple disease-related phenotypes. Results We applied this method to two large, well-characterized studies of chronic obstructive pulmonary disease (COPD). We also examined the associations between these COPD phenotypic networks and other factors, including case-control status, disease severity, and genetic variants. Using these phenotypic networks, we have detected novel relationships between phenotypes that would not have been observed using traditional epidemiological approaches. Conclusion Phenotypic network analysis of complex diseases could provide novel insights into disease susceptibility, disease severity, and genetic mechanisms. PMID:24964944
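
    A sketch of the core computation, partial correlations obtained from the precision (inverse covariance) matrix via rho_ij = -P_ij / sqrt(P_ii * P_jj); the phenotype data are simulated, not COPD measurements:

    ```python
    import numpy as np

    def partial_correlations(X):
        """Partial-correlation matrix of the columns of X (subjects x phenotypes)."""
        precision = np.linalg.inv(np.cov(X, rowvar=False))
        d = np.sqrt(np.diag(precision))
        rho = -precision / np.outer(d, d)
        np.fill_diagonal(rho, 1.0)
        return rho

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 4))    # 200 subjects x 4 simulated phenotypes
    X[:, 1] += 0.8 * X[:, 0]         # induce one direct association
    print(np.round(partial_correlations(X), 2))
    # A network is then drawn with an edge wherever |rho_ij| passes a
    # chosen significance threshold.
    ```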

  16. Binocular adaptive optics vision analyzer with full control over the complex pupil functions.

    PubMed

    Schwarz, Christina; Prieto, Pedro M; Fernández, Enrique J; Artal, Pablo

    2011-12-15

    We present a binocular adaptive optics vision analyzer fully capable of controlling both the amplitude and phase of the two complex pupil functions, one for each eye of the subject. A special feature of the instrument is its comparatively simple setup. A single reflective liquid-crystal-on-silicon spatial light modulator working in pure phase modulation generates the phase profiles for both pupils simultaneously. In addition, another liquid crystal spatial light modulator working in transmission operates in pure intensity modulation to produce a large variety of pupil masks for each eye. Subjects perform visual tasks through any predefined variations of the complex pupil function for both eyes. As an example of the system's efficiency, we recorded images of the stimuli through the system as they were projected onto the subject's retina. This instrument proves to be extremely versatile for designing and testing novel ophthalmic elements and simulating visual outcomes, as well as for further research on binocular vision.

  17. [Problems of protecting complex inventions in the field of microbiology].

    PubMed

    Korovkin, V I

    1978-01-01

    Some problems connected with the protection of complex inventions in the field of microbiology are discussed. The rights of the author are determined when a method and a product are to be protected at the same time. Additional juridical protection of a microbial strain is not necessary. Complex protection of a microbial strain together with the method of its utilization is recommended in certain cases, since it might prevent the conflicts that arise from the parallel juridical protection of a strain and the method of its utilization.

  18. Data Mining and Complex Problems: Case Study in Composite Materials

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  19. Complexity of hierarchically and 1-dimensional periodically specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Radhakrishnan, V.

    1995-08-23

    We study the complexity of various combinatorial and satisfiability problems when instances are specified using one of the following specifications: (1) the 1-dimensional finite periodic narrow specifications of Wanke and Ford et al.; (2) the 1-dimensional finite periodic narrow specifications with explicit boundary conditions of Gale; (3) the 2-way infinite 1-dimensional narrow periodic specifications of Orlin et al.; and (4) the hierarchical specifications of Lengauer et al. We obtain three general types of results. First, we prove that there is a polynomial-time algorithm that, given a 1-FPN- or 1-FPN(BC)-specification of a graph (or a CNF formula), constructs a level-restricted L-specification of an isomorphic graph (or formula). This theorem, along with the hardness results proved here, provides alternative and unified proofs of many hardness results proved in the past either by Lengauer and Wagner or by Orlin. Second, we study the complexity of the generalized CNF satisfiability problems of Schaefer. Assuming P ≠ PSPACE, we completely characterize the polynomial-time solvability of these problems when instances are specified as in (1), (2), (3), or (4). As applications of our first two types of results, we obtain a number of new PSPACE-hardness results and polynomial-time algorithms for problems specified as in (1), (2), (3), or (4). Many of our results also hold for O(log N) bandwidth-bounded planar instances.

  20. Mechanical stability of bivalent transition metal complexes analyzed by single-molecule force spectroscopy

    PubMed Central

    Gensler, Manuel; Eidamshaus, Christian; Taszarek, Maurice; Reissig, Hans-Ulrich

    2015-01-01

    Summary Multivalent biomolecular interactions allow for a balanced interplay of mechanical stability and malleability, and nature makes wide use of this. For instance, systems of similar thermal stability may have very different rupture forces. Thus it is of paramount interest to study and understand the mechanical properties of multivalent systems through well-characterized model systems. We analyzed the rupture behavior of three different bivalent pyridine coordination complexes with Cu2+ in an aqueous environment by single-molecule force spectroscopy. These complexes share the same supramolecular interaction, leading to similar thermal off-rates in the range of 0.09 to 0.36 s−1, compared to 1.7 s−1 for the monovalent complex. On the other hand, the backbones exhibit different flexibility, and we determined a broad range of rupture lengths between 0.3 and 1.1 nm, with higher most-probable rupture forces for the stiffer backbones. Interestingly, the connection of medium flexibility has the highest rupture forces, whereas the ligands with the highest and lowest rigidity seem to be prone to consecutive bond rupture. The presented approach allows bond and backbone effects in multivalent model systems to be separated. PMID:26124883

  1. Unpacking complexity in the analysis of environmental and geologic problems

    SciTech Connect

    Pinet, P.R. . Geology Dept.)

    1992-01-01

    In order to understand or to make policy decisions about environmental issues, it is imperative that the complexity of the problem be unpacked, that is, that its causes and effects be separated into a natural hierarchical scheme. Unpacking complexity separates the elements that affect natural systems into primary, secondary, and tertiary factors. Primary factors are universal in the sense that they operate in a fundamental way anywhere on the globe where the system is present. Secondary factors interact with the primary elements to infuse regional characteristics into the system. Local (site-specific) factors impose a tertiary level of complexity that operates on small spatial scales. The utility of this technique is demonstrated by several examples: the origin of an Atlantic-type continental margin, a beach-erosion study, and a groundwater investigation. The appraisal of environmental problems on a truly global scale involves the evaluation of the primary elements of the system and a de-emphasis of the secondary and tertiary factors, which are inappropriate to the scale of the study. On the other hand, policy decisions regarding a regional coastal-erosion problem or the management of a large watershed require that primary and secondary elements be addressed and that tertiary factors be put aside. Moreover, assessing the nature of erosion at a specific beach or managing a local tract of woodland must include a consideration of all causes and effects that occur at the primary, secondary, and tertiary levels. This hierarchical analysis applies to temporal scales as well. For example, solutions to beach-erosion or deforestation problems are very different when considering causes and effects over years, decades, centuries, or millennia.

  2. TOPAZ - the transient one-dimensional pipe flow analyzer: code validation and sample problems

    SciTech Connect

    Winters, W.S.

    1985-10-01

    TOPAZ is a "user friendly" computer code for modeling the one-dimensional, transient physics of multi-species gas transfer in arbitrary arrangements of pipes, valves, vessels, and flow branches. This document presents a series of sample problems designed to aid potential users in creating TOPAZ input files. To the extent possible, sample problems were selected for which analytical solutions currently exist. TOPAZ comparisons with such solutions are intended to provide a measure of code validation.

  3. Employing the Hilbert-Huang Transform to analyze observed natural complex signals: Calm wind meandering cases

    NASA Astrophysics Data System (ADS)

    Martins, Luis Gustavo Nogueira; Stefanello, Michel Baptistella; Degrazia, Gervásio Annes; Acevedo, Otávio Costa; Puhales, Franciano Scremin; Demarco, Giuliano; Mortarini, Luca; Anfossi, Domenico; Roberti, Débora Regina; Denardin, Felipe Costa; Maldaner, Silvana

    2016-11-01

    In this study we analyze natural complex signals employing Hilbert-Huang spectral analysis. Specifically, low-wind meandering meteorological data are decomposed into turbulent and non-turbulent components. These non-turbulent movements, responsible for the absence of a preferential direction of the horizontal wind, produce negative lobes in the meandering autocorrelation functions. The meandering characteristic time scales (meandering periods) are determined from the spectral peak provided by the Hilbert-Huang marginal spectrum. The magnitudes of the temperature and horizontal wind meandering periods obtained agree with the results found from the best fit of the heuristic meandering autocorrelation functions. The method therefore represents a new procedure for evaluating meandering periods that does not employ mathematical expressions to represent observed meandering autocorrelation functions.
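
    A sketch of the Hilbert step of Hilbert-Huang analysis: the instantaneous frequency of one oscillatory component. In practice the measured signal would first be decomposed into intrinsic mode functions by empirical mode decomposition; here a synthetic 600-second "meandering" mode and the sampling rate are invented for illustration:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 1.0                                  # samples per second (hypothetical)
    t = np.arange(0, 3600, 1 / fs)            # one hour of data
    x = np.sin(2 * np.pi * t / 600.0)         # synthetic mode with a 600-s period

    analytic = hilbert(x)                     # analytic signal via Hilbert transform
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) / (2 * np.pi) * fs
    print("mean period [s]:", 1.0 / inst_freq.mean())   # ~600
    ```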

  4. Complex Genotype Mixtures Analyzed by Deep Sequencing in Two Different Regions of Hepatitis B Virus

    PubMed Central

    Homs, Maria; Tabernero, David; Gonzalez, Carolina; Quer, Josep; Blasi, Maria; Casillas, Rosario; Nieto, Leonardo; Riveiro-Barciela, Mar; Esteban, Rafael; Buti, Maria; Rodriguez-Frias, Francisco

    2015-01-01

    This study assesses the presence and outcome of genotype mixtures in the polymerase/surface and X/preCore regions of the HBV genome in patients with chronic hepatitis B virus (HBV) infection. Thirty samples from ten chronic hepatitis B patients were included. The polymerase/surface and X/preCore regions were analyzed by deep sequencing (UDPS) in the first available sample at diagnosis, a pre-treatment sample, and a sample while under treatment. HBV genotype was determined by phylogenesis. Quasispecies complexity was evaluated by mutation frequency and nucleotide diversity. The polymerase/surface and X/preCore regions were validated for genotyping from 113 GenBank reference sequences. UDPS yielded a median of 10,960 sequences per sample (IQR 16,645) in the polymerase/surface region and 11,595 sequences per sample (IQR 14,682) in X/preCore. Genotype mixtures were more common in X/preCore (90%) than in polymerase/surface (30%) (p<0.001). On X/preCore genotyping, all samples were genotype A, whereas polymerase/surface yielded genotypes A (80%), D (16.7%), and F (3.3%) (p = 0.036). Genotype changes in polymerase/surface were observed in four patients during natural quasispecies dynamics and in two patients during treatment. There were no genotype changes in X/preCore. Quasispecies complexity was higher in X/preCore than in polymerase/surface (p = 0.004). The results provide evidence of genotype mixtures and differential genotype proportions in the polymerase/surface and X/preCore regions. The genotype dynamics in HBV infection and the different patterns of quasispecies complexity in the HBV genome suggest a new paradigm for HBV genotype classification. PMID:26714168

  5. Complex Genotype Mixtures Analyzed by Deep Sequencing in Two Different Regions of Hepatitis B Virus.

    PubMed

    Caballero, Andrea; Gregori, Josep; Homs, Maria; Tabernero, David; Gonzalez, Carolina; Quer, Josep; Blasi, Maria; Casillas, Rosario; Nieto, Leonardo; Riveiro-Barciela, Mar; Esteban, Rafael; Buti, Maria; Rodriguez-Frias, Francisco

    2015-01-01

    This study assesses the presence and outcome of genotype mixtures in the polymerase/surface and X/preCore regions of the HBV genome in patients with chronic hepatitis B virus (HBV) infection. Thirty samples from ten chronic hepatitis B patients were included. The polymerase/surface and X/preCore regions were analyzed by deep sequencing (UDPS) in the first available sample at diagnosis, a pre-treatment sample, and a sample while under treatment. HBV genotype was determined by phylogenesis. Quasispecies complexity was evaluated by mutation frequency and nucleotide diversity. The polymerase/surface and X/preCore regions were validated for genotyping from 113 GenBank reference sequences. UDPS yielded a median of 10,960 sequences per sample (IQR 16,645) in the polymerase/surface region and 11,595 sequences per sample (IQR 14,682) in X/preCore. Genotype mixtures were more common in X/preCore (90%) than in polymerase/surface (30%) (p<0.001). On X/preCore genotyping, all samples were genotype A, whereas polymerase/surface yielded genotypes A (80%), D (16.7%), and F (3.3%) (p = 0.036). Genotype changes in polymerase/surface were observed in four patients during natural quasispecies dynamics and in two patients during treatment. There were no genotype changes in X/preCore. Quasispecies complexity was higher in X/preCore than in polymerase/surface (p = 0.004). The results provide evidence of genotype mixtures and differential genotype proportions in the polymerase/surface and X/preCore regions. The genotype dynamics in HBV infection and the different patterns of quasispecies complexity in the HBV genome suggest a new paradigm for HBV genotype classification.

  6. Complexity and approximability of certain bicriteria location problems

    SciTech Connect

    Krumke, S.O.; Noltemeier, H.; Ravi, S.S.; Marathe, M.V.

    1995-10-01

    We investigate the complexity and approximability of some location problems when two distance values are specified for each pair of potential sites. These problems involve the selection of a specified number of facilities (i.e. a placement of a specified size) to minimize a function of one distance metric subject to a budget constraint on the other distance metric. Such problems arise in several application areas including statistical clustering, pattern recognition and load-balancing in distributed systems. We show that, in general, obtaining placements that are near-optimal with respect to the first distance metric is NP-hard even when we allow the budget constraint on the second distance metric to be violated by a constant factor. However, when both the distance metrics satisfy the triangle inequality, we present approximation algorithms that produce placements which are near-optimal with respect to the first distance metric while violating the budget constraint only by a small constant factor. We also present polynomial algorithms for these problems when the underlying graph is a tree.

  7. Intelligent Tutoring for Diagnostic Problem Solving in Complex Dynamic Systems

    DTIC Science & Technology

    1991-09-01

    Funding number: N00014-87-K-0482.

  8. Coordinating complex problem-solving among distributed intelligent agents

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1992-01-01

    A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high-level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.

  9. Radio interferometric gain calibration as a complex optimization problem

    NASA Astrophysics Data System (ADS)

    Smirnov, O. M.; Tasse, C.

    2015-05-01

    Recent developments in optimization theory have extended some traditional algorithms for least-squares optimization of real-valued functions (Gauss-Newton, Levenberg-Marquardt, etc.) into the domain of complex functions of a complex variable. This employs a formalism called the Wirtinger derivative, and derives a full-complex Jacobian counterpart to the conventional real Jacobian. We apply these developments to the problem of radio interferometric gain calibration, and show how the general complex Jacobian formalism, when combined with conventional optimization approaches, yields a whole new family of calibration algorithms, including those for the polarized and direction-dependent gain regime. We further extend the Wirtinger calculus to an operator-based matrix calculus for describing the polarized calibration regime. Using approximate matrix inversion results in computationally efficient implementations; we show that some recently proposed calibration algorithms such as STEFCAL and peeling can be understood as special cases of this, and place them in the context of the general formalism. Finally, we present an implementation and some applied results of COHJONES, another specialized direction-dependent calibration algorithm derived from the formalism.
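
    For reference, the Wirtinger derivatives underlying the formalism are, in their standard textbook form (this states the definition only, not the paper's full complex Jacobian construction):

    ```latex
    \frac{\partial}{\partial z} = \frac{1}{2}\left(\frac{\partial}{\partial x} - i\,\frac{\partial}{\partial y}\right),
    \qquad
    \frac{\partial}{\partial \bar{z}} = \frac{1}{2}\left(\frac{\partial}{\partial x} + i\,\frac{\partial}{\partial y}\right),
    \qquad z = x + iy.
    ```

    Treating z and its conjugate as formally independent variables is what allows a real-valued least-squares cost f(z, z̄) to be minimized by setting the conjugate derivative to zero, mirroring the real Gauss-Newton machinery.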

  10. Case Studies in Critical Ecoliteracy: A Curriculum for Analyzing the Social Foundations of Environmental Problems

    ERIC Educational Resources Information Center

    Turner, Rita; Donnelly, Ryan

    2013-01-01

    This article outlines the features and application of a set of model curriculum materials that utilize eco-democratic principles and humanities-based content to cultivate critical analysis of the cultural foundations of socio-environmental problems. We first describe the goals and components of the materials, then discuss results of their use in…

  11. Using a Database to Analyze Core Basic Science Content in a Problem-Based Curriculum.

    ERIC Educational Resources Information Center

    Rosen, Robert L.; And Others

    1992-01-01

    A study used computer analysis to examine distribution of basic science content in the 53 cases in the problem-based medical curriculum of Rush Medical College (Illinois) and compared it to application of that content by students and faculty. The method of analysis is recommended for reviewing curricula for omissions and redundancy. (Author/MSE)

  12. Analyzing and Attempting to Overcome Prospective Teachers' Difficulties during Problem-Solving Instruction

    ERIC Educational Resources Information Center

    Karp, Alexander

    2010-01-01

    This article analyzes the experiences of prospective secondary mathematics teachers during a teaching methods course, offered prior to their student teaching, but involving actual teaching and reflexive analysis of this teaching. The study focuses on the pedagogical difficulties that arose during their teaching, in which prospective teachers…

  13. Improving Analyzing Skills of Primary Students Using a Problem Solving Strategy

    ERIC Educational Resources Information Center

    Cabanilla-Pedro, Lily Ann; Acob-Navales, Margelyn; Josue, Fe T.

    2004-01-01

    This study makes use of an action research paradigm to improve primary students' analyzing skills. It was conducted at the San Esteban Elementary School, Region I, Philippines, during the 6-week off campus practice teaching of one of the researchers. Sources of data include a thinking skills checklist, a set of Curriculum Support Materials (CSM),…

  14. Language and rigour in qualitative research: problems and principles in analyzing data collected in Mandarin.

    PubMed

    Smith, Helen J; Chen, Jing; Liu, Xiaoyun

    2008-07-10

    In collaborative qualitative research in Asia, data are usually collected in the national language, and this poses challenges for analysis. Translation of transcripts to a language common to the whole research team is time consuming and expensive; meaning can easily be lost in translation; and validity of the data may be compromised in this process. We draw on several published examples from public health research conducted in mainland China, to highlight how language can influence rigour in the qualitative research process; for each problem we suggest potential solutions based on the methods used in one of our research projects in China. Problems we have encountered include obtaining sufficient depth and detail in qualitative data; deciding on language for data collection; managing data collected in Mandarin; and the influence of language on interpreting meaning. We have suggested methods for overcoming problems associated with collecting, analysing, and interpreting qualitative data in a local language, that we think help maintain analytical openness in collaborative qualitative research. We developed these methods specifically in research conducted in Mandarin in mainland China; but they need further testing in other countries with data collected in other languages. Examples from other researchers are needed.

  15. A Real Space Cellular Automaton Laboratory (ReSCAL) to analyze complex geophysical systems

    NASA Astrophysics Data System (ADS)

    Rozier, O.; Narteau, C.

    2012-04-01

    The Real Space Cellular Automaton Laboratory (ReSCAL) is a generator of 3D multiphysics, Markovian, stochastic cellular automata with continuous time. The objective of this new software, released under a GNU licence, is to develop interdisciplinary research collaboration to investigate the dynamics of complex geophysical systems. In the vast majority of cases, a numerical model is a set of physical variables (temperature, pressure, velocity, etc.) that are recalculated over time according to predetermined rules or equations, so that any point in space is entirely characterized by a local set of parameters. This is not the case in ReSCAL, where the only local variable is a state parameter that represents the different phases involved in the problem. An elementary cell represents a given volume of real space. Pairs of nearest-neighbour cells are called doublets. For each individual physical process that we take into account, there is a set of doublet transitions. Using this approach, we can model a wide range of physical-chemical or anthropological processes. Here, we present the different ingredients of ReSCAL using published applications in geosciences (Narteau et al. 2001 and 2009). We also show how ReSCAL can be developed and used across many disciplines in geophysics and physical geography. Supplementary information: Source files of ReSCAL can be downloaded at http://www.ipgp.fr/~rozier/ReSCAL/rescal-en.html
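
    A toy doublet-transition automaton in the spirit of this description (not ReSCAL's actual rule set): the only per-cell variable is a state, and the dynamics are stochastic transitions on nearest-neighbour pairs. The states, rule, and rate are invented, and the continuous-time scheduling is reduced to simple random sampling:

    ```python
    import random

    GRAIN, AIR = "g", "a"
    # (left, right) -> ((left', right'), rate): a grain hops to the right
    RULES = {(GRAIN, AIR): ((AIR, GRAIN), 0.5)}

    def step(cells, rules=RULES):
        """Pick a random doublet and apply its transition with the given rate."""
        i = random.randrange(len(cells) - 1)
        doublet = (cells[i], cells[i + 1])
        if doublet in rules:
            new, rate = rules[doublet]
            if random.random() < rate:
                cells[i], cells[i + 1] = new

    state = [GRAIN] * 3 + [AIR] * 7
    for _ in range(200):
        step(state)
    print("".join(state))   # grains drift toward the right boundary
    ```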

  16. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of the ATS's complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework), and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident, since it exemplifies authority problems when conflicting advice is received from human and automated systems.

  17. Leveraging Cultural Resources through Teacher Pedagogical Reasoning: Elementary Grade Teachers Analyze Second Language Learners' Science Problem Solving

    ERIC Educational Resources Information Center

    Buxton, Cory A.; Salinas, Alejandra; Mahotiere, Margarette; Lee, Okhee; Secada, Walter G.

    2013-01-01

    Grounded in teacher professional development addressing the intersection of student diversity and content area instruction, this study examined school teachers' pedagogical reasoning complexity as they reflected on their second language learners' science problem solving abilities using both home and school contexts. Teachers responded to interview…

  18. Understanding the determinants of problem-solving behavior in a complex environment

    NASA Technical Reports Server (NTRS)

    Casner, Stephen A.

    1994-01-01

    It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates on the probable effects of each environmental feature.

  19. Human opinion dynamics: An inspiration to solve complex optimization problems

    PubMed Central

    Kaur, Rishemjit; Kumar, Ritesh; Bhondekar, Amol P.; Kapur, Pawan

    2013-01-01

    Human interactions give rise to the formation of different kinds of opinions in a society. The study of the formation and dynamics of opinions has been one of the most important areas in social physics. Opinion dynamics and the associated social structure lead to decision making, or so-called opinion consensus. Opinion formation is a process of collective intelligence evolving from the integrative tendencies of social influence combined with the disintegrative effects of individualisation, and therefore could be exploited for developing search strategies. Here, we demonstrate that human opinion dynamics can be utilised to solve complex mathematical optimization problems. The results have been compared with a standard algorithm inspired by bird flocking behaviour, and the comparison proves the efficacy of the proposed approach in general. Our investigation may open new avenues towards understanding collective decision making. PMID:24141795
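
    A loose sketch of an opinion-dynamics-style optimizer, not the paper's exact update rule: each agent's opinion (candidate solution) drifts toward a fitness-weighted consensus (social influence) while retaining individual noise (individualisation); all parameters are invented:

    ```python
    import numpy as np

    def optimize(f, dim=2, n_agents=40, iters=300, pull=0.3, noise=0.1):
        """Minimize f by letting agent 'opinions' drift toward better performers."""
        rng = np.random.default_rng(2)
        x = rng.uniform(-5, 5, (n_agents, dim))          # initial opinions
        for _ in range(iters):
            fit = np.apply_along_axis(f, 1, x)
            w = np.exp(-fit); w /= w.sum()               # better agents weigh more
            consensus = w @ x                            # weighted mean opinion
            x += pull * (consensus - x) + noise * rng.normal(size=x.shape)
        return x[np.argmin(np.apply_along_axis(f, 1, x))]

    sphere = lambda z: float((z ** 2).sum())
    print(np.round(optimize(sphere), 2))                 # approaches the origin
    ```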

  20. Determining electron temperature for small spherical probes from network analyzer measurements of complex impedance

    NASA Astrophysics Data System (ADS)

    Walker, D. N.; Fernsler, R. F.; Blackwell, D. D.; Amatucci, W. E.

    2008-12-01

    In earlier work, using a network analyzer, it was shown that collisionless resistance (CR) exists in the sheath of a spherical probe when driven by a small rf signal. The CR is inversely proportional to the plasma density gradient at the location where the applied angular frequency equals the plasma frequency ωpe. Recently, efforts have concentrated on a study of the low-to-intermediate frequency response of the probe to the rf signal. At sufficiently low frequencies, the CR is beyond cutoff, i.e., below the plasma frequency at the surface of the probe. Since the electron density at the probe surface decreases as a function of applied (negative) bias, the CR will extend to lower frequencies as the magnitude of negative bias increases. Therefore to eliminate both CR and ion current contributions, the frequencies presently being considered are much greater than the ion plasma frequency, ωpi, but less than the plasma frequency, ωpe(r0), where r0 is the probe radius. It is shown that, in this frequency regime, the complex impedance measurements made with a network analyzer can be used to determine electron temperature. An overview of the theory is presented along with comparisons to data sets made using three stainless steel spherical probes of different sizes in different experimental environments and different plasma parameter regimes. The temperature measurements made by this method are compared to those made by conventional Langmuir probe sweeps; the method shown here requires no curve fitting as is the usual procedure with Langmuir probes when a Maxwell-Boltzmann electron distribution is assumed. The new method requires, however, a solution of the Poisson equation to determine the approximate sheath dimensions and integrals to determine approximate plasma and sheath inductances. The solution relies on the calculation of impedance for a spherical probe immersed in a collisionless plasma and is based on a simple circuit analogy for the plasma. Finally, the

  1. On the problem of analyzing the dynamic properties of a layered half-space

    NASA Astrophysics Data System (ADS)

    Belyankova, T. I.; Kalinchuk, V. V.

    2014-09-01

    An efficient method is developed for constructing the Green's matrix functions of a layered inhomogeneous half-space. Matrix formulas convenient for programming are proposed, which make it possible to study the properties of a multilayered half-space with high accuracy. As an example, for the problem of oscillations of a three-layered half-space, the transformation of the dispersion characteristics of the medium is shown as a function of the ratios of the mechanical and geometric parameters of its components. Study of the properties of the Green's function of a medium with a low-velocity layered inclusion showed that each mode of a surface wave exists in a limited frequency range: in addition to the critical frequency at which a mode appears, there is a frequency at which it disappears, above which the mode is suppressed because a zero of the Green's function is superposed on its pole. A similar study conducted for a medium with a high-velocity layered inclusion showed that, in addition to the cutoff frequency (the frequency at which a surface wave propagating in the low-frequency range disappears), there is a frequency of recurrent generation, the upper boundary of the "cutoff range" of the first mode. Beyond this range, the first mode propagates again, and other propagating modes can appear. The critical relation of the geometric parameters of the medium that determines the existence and boundaries of the cutoff range of a wave is established.

  2. Applied social and behavioral science to address complex health problems.

    PubMed

    Livingood, William C; Allegrante, John P; Airhihenbuwa, Collins O; Clark, Noreen M; Windsor, Richard C; Zimmerman, Marc A; Green, Lawrence W

    2011-11-01

    Complex and dynamic societal factors continue to challenge the capacity of the social and behavioral sciences in preventive medicine and public health to overcome the most seemingly intractable health problems. This paper proposes a fundamental shift from a research approach that presumes to identify (from highly controlled trials) universally applicable interventions expected to be implemented "with fidelity" by practitioners, to an applied social and behavioral science approach similar to that of engineering. Such a shift would build on and complement the recent recommendations of the NIH Office of Behavioral and Social Science Research and require reformulation of the research-practice dichotomy. It would also require disciplines now engaged in preventive medicine and public health practice to develop a better understanding of systems thinking and the science of application that is sensitive to the complexity, interactivity, and unique elements of community and practice settings. Also needed is a modification of health-related education to ensure that those entering the disciplines develop instincts and capacities as applied scientists.

  3. Sporothrix schenckii complex and sporotrichosis, an emerging health problem.

    PubMed

    López-Romero, Everardo; Reyes-Montes, María del Rocío; Pérez-Torres, Armando; Ruiz-Baca, Estela; Villagómez-Castro, Julio C; Mora-Montes, Héctor M; Flores-Carreón, Arturo; Toriello, Conchita

    2011-01-01

    Sporothrix schenckii, now named the S. schenckii species complex, has largely been known as the etiological agent of sporotrichosis, which is an acute or chronic subcutaneous mycosis of humans and other mammals. Gene sequencing has revealed the following species in the S. schenckii complex: Sporothrix albicans, Sporothrix brasiliensis, Sporothrix globosa, Sporothrix luriei, Sporothrix mexicana and S. schenckii. The increasing number of reports of Sporothrix infection in immunocompromised patients, mainly the HIV-infected population, suggests sporotrichosis as an emerging global health problem concomitant with the AIDS pandemic. Molecular studies have demonstrated a high level of intraspecific variability. Components of the S. schenckii cell wall that act as adhesins and immunogenic inducers, such as a 70-kDa glycoprotein, are apparently specific to this fungus. The main glycan peptidorhamnomannan cell wall component is the only O-linked glycan structure known in S. schenckii. It contains an α-mannobiose core followed by one α-glucuronic acid unit, which may be mono- or di-rhamnosylated. The oligomeric structure of glucosamine-6-P synthase has led to a significant advance in the development of antifungals targeted to the enzyme's catalytic domain in S. schenckii.

  4. Complex Problem Exercises in Developing Engineering Students' Conceptual and Procedural Knowledge of Electromagnetics

    ERIC Educational Resources Information Center

    Leppavirta, J.; Kettunen, H.; Sihvola, A.

    2011-01-01

    Complex multistep problem exercises are one way to enhance engineering students' learning of electromagnetics (EM). This study investigates whether exposure to complex problem exercises during an introductory EM course improves students' conceptual and procedural knowledge. The performance in complex problem exercises is compared to prior success…

  5. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    NASA Astrophysics Data System (ADS)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks, and other intricacies. However, existing network representations still lack crucial features needed to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., networks of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of

  6. Deep graphs-A general framework to represent and analyze heterogeneous complex systems across scales.

    PubMed

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool for describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks, and other intricacies. However, existing network representations still lack crucial features that would let them serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of
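
    The supernode/superedge idea at the heart of this framework can be illustrated in a few lines. The sketch below is not the authors' implementation; with toy node and edge tables, it groups nodes by a shared property into supernodes and aggregates the edges between groups into superedges:

```python
# Minimal sketch (not the authors' DeepGraph package): nodes and edges are
# property tables; grouping nodes by a property yields supernodes, and
# aggregating edges between groups yields superedges.
from collections import defaultdict

# Hypothetical node table: id -> properties (arbitrary objects are allowed).
nodes = {
    0: {"type": "station", "region": "north", "flow": 1.2},
    1: {"type": "station", "region": "north", "flow": 0.7},
    2: {"type": "station", "region": "south", "flow": 2.1},
    3: {"type": "sensor",  "region": "south", "flow": 0.4},
}
# Hypothetical edge table: (u, v) -> properties.
edges = {(0, 1): {"w": 0.5}, (0, 2): {"w": 1.0},
         (1, 3): {"w": 0.2}, (2, 3): {"w": 0.8}}

def partition(nodes, key):
    """Group node ids by a property value -> supernodes."""
    groups = defaultdict(list)
    for nid, props in nodes.items():
        groups[props[key]].append(nid)
    return dict(groups)

def superedges(edges, groups):
    """Aggregate edge weights between every pair of supernodes."""
    label = {nid: g for g, ids in groups.items() for nid in ids}
    agg = defaultdict(float)
    for (u, v), props in edges.items():
        agg[(label[u], label[v])] += props["w"]
    return dict(agg)

supernodes = partition(nodes, "region")
print(supernodes)                      # {'north': [0, 1], 'south': [2, 3]}
print(superedges(edges, supernodes))   # north-north 0.5, north-south 1.2, south-south 0.8
```

    Intersecting several such partitions (e.g., by region and by type) yields the finer levels of the partition lattice that the paper formalizes.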

  7. Effects of friction and heat conduction on sound propagation in ducts. [analyzing complex aerodynamic noise problems]

    NASA Technical Reports Server (NTRS)

    Huerre, P.; Karamcheti, K.

    1976-01-01

    The theory of sound propagation is examined in a viscous, heat-conducting fluid, initially at rest and in a uniform state, and contained in a rigid, impermeable duct with isothermal walls. Topics covered include: (1) theoretical formulation of the small amplitude fluctuating motions of a viscous, heat-conducting and compressible fluid; (2) sound propagation in a two-dimensional duct; and (3) a perturbation study of the in-plane modes.

  8. Sleep, cognition, and behavioral problems in school-age children: a century of research meta-analyzed.

    PubMed

    Astill, Rebecca G; Van der Heijden, Kristiaan B; Van Ijzendoorn, Marinus H; Van Someren, Eus J W

    2012-11-01

    Clear associations of sleep, cognitive performance, and behavioral problems have been demonstrated in meta-analyses of studies in adults. This meta-analysis is the first to systematically summarize all relevant studies reporting on sleep, cognition, and behavioral problems in healthy school-age children (5-12 years old) and incorporates 86 studies on 35,936 children. Sleep duration shows a significant positive relation with cognitive performance (r = .08, confidence interval [CI] [.06, .10]). Subsequent analyses on cognitive subdomains indicate specific associations of sleep duration with executive functioning (r = .07, CI [.02, .13]), with performance on tasks that address multiple cognitive domains (r = .10, CI [.05, .16]), and with school performance (r = .09, CI [.06, .12]), but not with intelligence. Quite unlike typical findings in adults, sleep duration was not associated with sustained attention and memory. Methodological issues and brain developmental immaturities are proposed to underlie the marked differences. Shorter sleep duration is associated with more behavioral problems (r = .09, CI [.07, .11]). Subsequent analyses on subdomains of behavioral problems showed that the relation holds for both internalizing (r = .09, CI [.06, .12]) and externalizing behavioral problems (r = .08, CI [.06, .11]). Ancillary moderator analyses identified practices recommended to increase the sensitivity of assessments and designs in future studies. In practical terms, the findings suggest that insufficient sleep in children is associated with deficits in higher-order and complex cognitive functions and an increase in behavioral problems. This is particularly relevant given society's tendency towards sleep curtailment.

  9. Inverse Problems in Complex Models and Applications to Earth Sciences

    NASA Astrophysics Data System (ADS)

    Bosch, M. E.

    2015-12-01

    The inference of the subsurface earth structure and properties requires the integration of different types of data, information and knowledge, by combined processes of analysis and synthesis. To support the process of integrating information, the regular concept of data inversion is evolving to expand its application to models with multiple inner components (properties, scales, structural parameters) that explain multiple data (geophysical survey data, well-logs, core data). Probabilistic inference methods provide the natural framework for the formulation of these problems, considering a posterior probability density function (PDF) that combines the information from a prior information PDF and the new sets of observations. To formulate the posterior PDF in the context of multiple datasets, the data likelihood functions are factorized assuming independence of uncertainties for data originating across different surveys. A realistic description of the earth medium requires modeling several properties and structural parameters, which relate to each other according to dependency and independency notions. Thus, conditional probabilities across model components also factorize. A common setting proceeds by structuring the model parameter space in hierarchical layers. A primary layer (e.g. lithology) conditions a secondary layer (e.g. physical medium properties), which conditions a third layer (e.g. geophysical data). In general, less structured relations within model components and data emerge from the analysis of other inverse problems. They can be described with flexibility via directed acyclic graphs, which map dependency relations between the model components. Examples of inverse problems in complex models can be shown at various scales. At local scale, for example, the distribution of gas saturation is inferred from pre-stack seismic data and a calibrated rock-physics model. At regional scale, joint inversion of gravity and magnetic data is applied
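
    The factorization described above can be written schematically. The notation below (a lithology layer m_lith conditioning a physical-property layer m_phys, and K surveys d_1, ..., d_K with independent uncertainties) is an illustrative placeholder, not the author's:

```latex
% Hierarchical posterior: chained conditionals (prior layers) times
% factorized likelihoods (independent survey uncertainties).
p(m_{\mathrm{lith}}, m_{\mathrm{phys}} \mid d_1, \dots, d_K)
  \;\propto\;
  \underbrace{p(m_{\mathrm{lith}})\, p(m_{\mathrm{phys}} \mid m_{\mathrm{lith}})}_{\text{hierarchical prior}}
  \;\underbrace{\prod_{k=1}^{K} p(d_k \mid m_{\mathrm{phys}})}_{\text{independent likelihoods}}
```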

  10. How Cognitive Style and Problem Complexity Affect Preservice Agricultural Education Teachers' Abilities to Solve Problems in Agricultural Mechanics

    ERIC Educational Resources Information Center

    Blackburn, J. Joey; Robinson, J. Shane; Lamm, Alexa J.

    2014-01-01

    The purpose of this experimental study was to determine the effects of cognitive style and problem complexity on Oklahoma State University preservice agriculture teachers' (N = 56) ability to solve problems in small gasoline engines. Time to solution was operationalized as problem solving ability. Kirton's Adaption-Innovation Inventory was…

  11. Can SNOMED CT fulfill the vision of a compositional terminology? Analyzing the use case for problem list.

    PubMed

    Campbell, James R; Xu, Junchuan; Fung, Kin Wah

    2011-01-01

    We analyzed 598 of 63,952 terms employed in problem list entries from seven major healthcare institutions that were not mapped with UMLS to SNOMED CT when preparing the NLM UMLS-CORE problem list subset. We intended to determine whether published or post-coordinated SNOMED concepts could accurately capture the problems as stated by the clinician and to characterize the workload for the local terminology manager. From the terms we analyzed, we estimate that 7.5% of the total terms represent ambiguous statements that require clarification. Of those terms which were unambiguous, we estimate that 38.1% could be encoded using the SNOMED CT January 2011 pre-coordinated (published core) content. 60.4% of unambiguous terms required post-coordination to capture the term meaning within the SNOMED model. Approximately 28.5% of post-coordinated content could not be fully defined and required primitive forms. This left 1.5% of unambiguous terms which were expressed with meaning which could not be represented in SNOMED CT. We estimate from our study that 98.5% of clinical terms unambiguously suggested for the problem list can be equated to published concepts or can be modeled with SNOMED CT but that roughly one in four SNOMED modeled expressions fail to represent the full meaning of the term. Implications for the business model of the local terminology manager and the development of SNOMED CT are discussed.

  12. Postoperative nausea and vomiting: A simple yet complex problem

    PubMed Central

    Shaikh, Safiya Imtiaz; Nagarekha, D.; Hegade, Ganapati; Marutheesh, M.

    2016-01-01

    Postoperative nausea and vomiting (PONV) is one of the complex and significant problems in anesthesia practice, with a growing trend toward ambulatory and day care surgeries. This review focuses on pathophysiology, pharmacological prophylaxis, and rescue therapy for PONV. We searched the Medline and PubMed databases for articles published in English from 1991 to 2014 while writing this review, using “postoperative nausea and vomiting, PONV, nausea-vomiting, PONV prophylaxis, and rescue” as keywords. PONV is influenced by multiple factors related to the patient, the surgery, and pre-, intra-, and post-operative anesthesia factors. The risk of PONV can be assessed using a scoring system such as the Apfel simplified scoring system, which is based on four independent risk predictors. PONV prophylaxis is administered to patients with medium and high risks based on this scoring system. Newer drugs such as the neurokinin-1 receptor antagonist (aprepitant) are used along with serotonin (5-hydroxytryptamine subtype 3) receptor antagonists, corticosteroids, anticholinergics, antihistaminics, and butyrophenones for PONV prophylaxis. Combinations of drugs from different classes with different mechanisms of action are administered for optimized efficacy in adults with moderate risk for PONV. A multimodal approach combining pharmacological and nonpharmacological prophylaxis with interventions that reduce baseline risk is employed in patients with high PONV risk. PMID:27746521
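
    The scoring logic mentioned above can be sketched in a few lines. The incidence figures in the sketch are approximate values commonly cited for the Apfel simplified score, not data from this review:

```python
# Illustrative sketch of the Apfel simplified risk score: one point each for
# four predictors. The incidence figures (~10/21/39/61/79% for 0-4 points)
# are approximate literature values, not results from this review.
def apfel_score(female, nonsmoker, history_ponv_or_motion_sickness, postop_opioids):
    return sum([female, nonsmoker, history_ponv_or_motion_sickness, postop_opioids])

APPROX_RISK = {0: 0.10, 1: 0.21, 2: 0.39, 3: 0.61, 4: 0.79}

score = apfel_score(female=True, nonsmoker=True,
                    history_ponv_or_motion_sickness=False, postop_opioids=True)
print(score, APPROX_RISK[score])   # 3 -> roughly 61% predicted PONV risk
# Policy per the abstract: medium- and high-scoring patients get prophylaxis.
```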

  13. Analyzing Heterogeneous Complexity in Complementary and Alternative Medicine Research: A Systems Biology Solution via Parsimony Phylogenetics

    PubMed Central

    Abu-Asab, Mones; Koithan, Mary; Shaver, Joan; Amri, Hakima

    2012-01-01

    Systems biology offers cutting-edge tools for the study of complementary and alternative medicine (CAM). The advent of ‘omics’ techniques and the resulting avalanche of scientific data have introduced an unprecedented level of complexity and heterogeneous data to biomedical research, leading to the development of novel research approaches. Statistical averaging has its limitations and is unsuitable for the analysis of heterogeneity, as it masks diversity by homogenizing otherwise heterogeneous populations. Unfortunately, most researchers are unaware of alternative methods of analysis capable of accounting for individual variability. This paper describes a systems biology solution to data complexity through the application of parsimony phylogenetic analysis. Maximum parsimony (MP) provides a data-based modeling paradigm that will permit a priori stratification of the study cohort(s), better assessment of early diagnosis, prognosis, and treatment efficacy within each stratum, and a method that could be used to explore, identify and describe complex human patterning. PMID:22327551
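
    As a concrete instance of the parsimony machinery invoked above, the classic Fitch algorithm scores a single character on a fixed binary tree with the minimum number of state changes. This is a generic illustration, not the authors' pipeline:

```python
# Minimal Fitch small-parsimony sketch: counts the minimum number of
# character-state changes on a fixed binary tree. Illustrative only.
def fitch(tree):
    """tree: a leaf state (str) or a pair of subtrees; returns (state_set, cost)."""
    if isinstance(tree, str):           # leaf: observed state
        return {tree}, 0
    left, right = tree
    sL, cL = fitch(left)
    sR, cR = fitch(right)
    inter = sL & sR
    if inter:
        return inter, cL + cR           # intersection: no extra change needed
    return sL | sR, cL + cR + 1         # union: one additional change

# Hypothetical binary character scored on four samples:
tree = (("A", "A"), ("B", ("A", "B")))
states, cost = fitch(tree)
print(states, cost)                     # minimum changes on this topology: 2
```

    Maximum parsimony then searches over tree topologies for the one that minimizes this cost summed over all characters, which is what permits the data-driven stratification described above.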

  14. Using New Models to Analyze Complex Regularities of the World: Commentary on Musso et al. (2013)

    ERIC Educational Resources Information Center

    Nokelainen, Petri; Silander, Tomi

    2014-01-01

    This commentary on the recent article by Musso et al. (2013) discusses issues related to model fitting, comparison of the classification accuracy of generative and discriminative models, and the two (or more) cultures of data modeling. We start by questioning the extremely high classification accuracy obtained with empirical data from a complex domain. There is…

  15. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
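
    The averaging-and-threshold idea behind the decision moment can be sketched on invented data; the paper's actual estimator may differ in detail:

```python
# Minimal sketch of the analysis described above (synthetic data): average
# the binary target-fixation indicator across trials at each time point,
# then take the decision moment as the first time the average tendency
# exceeds a threshold.
import numpy as np

rng = np.random.default_rng(4)
n_trials, n_samples = 40, 300          # e.g., 300 samples ~ 3 s at 100 Hz
onset = 120                            # hypothetical comprehension point

# Simulated fixations: chance level before onset, mostly on target after.
p = np.where(np.arange(n_samples) < onset, 0.5, 0.85)
fix_on_target = rng.random((n_trials, n_samples)) < p

tendency = fix_on_target.mean(axis=0)  # average tendency per time point
threshold = 0.7
decision_moment = int(np.argmax(tendency > threshold))
print(decision_moment)                 # index near the simulated onset
```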

  16. The Influence of Prior Experience and Process Utilization in Solving Complex Problems.

    ERIC Educational Resources Information Center

    Sterner, Paula; Wedman, John

    By using ill-structured problems and examining problem-solving processes, this study was conducted to explore the nature of solving complex, multistep problems, focusing on how prior knowledge, problem-solving process utilization, and analogical problem solving are related to success. Twenty-four college students qualified to participate by…

  17. Eye-Tracking Study of Complexity in Gas Law Problems

    ERIC Educational Resources Information Center

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  18. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  19. Complex Problem Solving in Educational Contexts--Something beyond "g": Concept, Assessment, Measurement Invariance, and Construct Validity

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Molnar, Gyongyver; Fischer, Andreas; Funke, Joachim; Csapo, Beno

    2013-01-01

    Innovative assessments of cross-curricular competencies such as complex problem solving (CPS) have currently received considerable attention in large-scale educational studies. This study investigated the nature of CPS by applying a state-of-the-art approach to assess CPS in high school. We analyzed whether two processes derived from cognitive…

  20. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…

  1. A note on the Dirichlet problem for model complex partial differential equations

    NASA Astrophysics Data System (ADS)

    Ashyralyev, Allaberen; Karaca, Bahriye

    2016-08-01

    Complex model partial differential equations of arbitrary order are considered. The uniqueness of solutions to the Dirichlet problem is studied. It is proved that the Dirichlet problem for higher-order complex partial differential equations in one complex variable has infinitely many solutions.

  2. Analyzing complex wake-terrain interactions and its implications on wind-farm performance.

    NASA Astrophysics Data System (ADS)

    Tabib, Mandar; Rasheed, Adil; Fuchs, Franz

    2016-09-01

    Rotating wind turbine blades generate complex wakes involving vortices (helical tip-vortex, root-vortex, etc.). These wakes are regions of high velocity deficit and high turbulence intensity, and they tend to degrade the performance of downstream turbines. Hence, a conservative inter-turbine distance of up to 10 turbine diameters (10D) is sometimes used in wind-farm layout (particularly in cases of flat terrain). This ensures that wake effects will not reduce the overall wind-farm performance, but it leads to a larger land footprint for establishing a wind farm. In the case of complex terrain, the nearby terrain can, within a short distance (say 10D), rise to an altitude high enough to influence the wake dynamics. This wake-terrain interaction can happen either (a) indirectly, through an interaction of the wake (both the near tip vortex and the far-wake large-scale vortex) with terrain-induced turbulence (especially the smaller eddies generated by small ridges within the terrain), or (b) directly, by obstructing the wake region partially or fully in its flow path. Hence, enhanced understanding of wake development due to wake-terrain interaction will help in wind-farm design. To this end, the current study involves: (1) understanding the numerics for successful simulation of vortices, (2) understanding the fundamental vortex-terrain interaction mechanism through studies devoted to the interaction of a single vortex with different terrains, and (3) relating the influence of vortex-terrain interactions to the performance of a wind farm by studying a multi-turbine wind-farm layout under different terrains. The results on the interaction of terrain and vortex have shown a much faster decay of the vortex for complex terrain compared to flatter terrain. The potential reasons identified for this observation are (a) the formation of secondary vortices in the flow and their interaction with the primary vortex, and (b) enhanced vorticity diffusion due to increased terrain-induced turbulence. The implications of

  3. On the computational complexity of the reticulate cophylogeny reconstruction problem.

    PubMed

    Libeskind-Hadas, Ran; Charleston, Michael A

    2009-01-01

    The cophylogeny reconstruction problem is that of finding minimal cost explanations of differences between evolutionary histories of ecologically linked groups of biological organisms. We present a proof that shows that the general problem of reconciling evolutionary histories is NP-complete and provide a sharp boundary where this intractability begins. We also show that a related problem, that of finding Pareto optimal solutions, is NP-hard. As a byproduct of our results, we give a framework by which meta-heuristics can be applied to find good solutions to this problem.

  4. Detrended Partial-Cross-Correlation Analysis: A New Method for Analyzing Correlations in Complex System

    PubMed Central

    Yuan, Naiming; Fu, Zuntao; Zhang, Huan; Piao, Lin; Xoplaki, Elena; Luterbacher, Juerg

    2015-01-01

    In this paper, a new method, detrended partial-cross-correlation analysis (DPCCA), is proposed. Building on detrended cross-correlation analysis (DCCA), the method incorporates a partial-correlation technique, so that it can quantify the relations between two non-stationary signals (with the influences of other signals removed) on different time scales. We illustrate the advantages of this method with two numerical tests. Test I shows the advantages of DPCCA in handling non-stationary signals, while Test II reveals the “intrinsic” relations between the two considered time series with the potential influences of other, unconsidered signals removed. To further show the utility of DPCCA in natural complex systems, we provide new evidence on how the winter-time Pacific Decadal Oscillation (PDO) and the winter-time Nino3 Sea Surface Temperature Anomaly (Nino3-SSTA) affect the Summer Rainfall over the middle-lower reaches of the Yangtze River (SRYR). By applying DPCCA, stronger significant correlations between SRYR and Nino3-SSTA on time scales of 6 ~ 8 years are found over the period 1951 ~ 2012, while significant correlations between SRYR and PDO arise on time scales of 35 years. With these physically explainable results, we are confident that DPCCA is a useful method for addressing complex systems. PMID:25634341
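
    The construction can be sketched compactly. The code below is an illustrative reimplementation from the description above, not the authors' code: detrended covariances of integrated profiles give scale-dependent DCCA coefficients, and a partial-correlation step removes a third signal. For three signals, the matrix-inversion form of DPCCA reduces to the familiar first-order partial-correlation formula used here:

```python
# Illustrative DPCCA sketch: (1) DCCA coefficients rho(s) from detrended
# covariances of integrated profiles; (2) a partial-correlation step that
# removes a third signal z. Toy reimplementation, not the authors' code.
import numpy as np

def detrended_cov(x, y, s):
    """Average covariance of residuals of linear fits in windows of size s."""
    X = np.cumsum(x - x.mean())
    Y = np.cumsum(y - y.mean())
    n = len(X) // s
    t = np.arange(s)
    covs = []
    for i in range(n):
        xs, ys = X[i*s:(i+1)*s], Y[i*s:(i+1)*s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        covs.append(np.mean(rx * ry))
    return np.mean(covs)

def dcca_rho(x, y, s):
    return detrended_cov(x, y, s) / np.sqrt(detrended_cov(x, x, s) * detrended_cov(y, y, s))

def dpcca_rho(x, y, z, s):
    """Partial DCCA coefficient of x and y at scale s, controlling for z."""
    rxy, rxz, ryz = dcca_rho(x, y, s), dcca_rho(x, z, s), dcca_rho(y, z, s)
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

rng = np.random.default_rng(0)
z = rng.standard_normal(4096)
x = z + rng.standard_normal(4096)   # x and y are related only through z
y = z + rng.standard_normal(4096)
print(dcca_rho(x, y, 64), dpcca_rho(x, y, z, 64))  # second value ~ 0
```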

  5. Mass spectrometric methods to analyze the structural organization of macromolecular complexes.

    PubMed

    Rajabi, Khadijeh; Ashcroft, Alison E; Radford, Sheena E

    2015-11-01

    With the development of soft ionization techniques such as electrospray ionization (ESI), mass spectrometry (MS) has found widespread application in structural biology. The ability to transfer large biomolecular complexes intact into the gas-phase, combined with the low sample consumption and high sensitivity of MS, has made ESI-MS a method of choice for the characterization of macromolecules. This paper describes the application of MS to study large non-covalent complexes. We categorize the available techniques in two groups. First, solution-based techniques in which the biomolecules are labeled in solution and subsequently characterized by MS. Three MS-based techniques are discussed, namely hydroxyl radical footprinting, cross-linking and hydrogen/deuterium exchange (HDX) MS. In the second group, MS-based techniques to probe intact biomolecules in the gas-phase, e.g. side-chain microsolvation, HDX and ion mobility spectrometry, are discussed. Together, the approaches place MS as a powerful methodology for an ever-growing plethora of structural applications.

  6. A complexity analysis of space-bounded learning algorithms for the constraint satisfaction problem

    SciTech Connect

    Bayardo, R.J. Jr.; Miranker, D.P.

    1996-12-31

    Learning during backtrack search is a space-intensive process that records information (such as additional constraints) in order to avoid redundant work. In this paper, we analyze the effects of polynomial-space-bounded learning on runtime complexity of backtrack search. One space-bounded learning scheme records only those constraints with limited size, and another records arbitrarily large constraints but deletes those that become irrelevant to the portion of the search space being explored. We find that relevance-bounded learning allows better runtime bounds than size-bounded learning on structurally restricted constraint satisfaction problems. Even when restricted to linear space, our relevance-bounded learning algorithm has runtime complexity near that of unrestricted (exponential space-consuming) learning schemes.

  7. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    DTIC Science & Technology

    2007-07-01


  8. Technologically Mediated Complex Problem-Solving on a Statistics Task

    ERIC Educational Resources Information Center

    Scanlon, Eileen; Blake, Canan; Joiner, Richard; O'Shea, Tim

    2005-01-01

    Simulations on computers can allow many experiments to be conducted quickly to help students develop an understanding of statistical topics. We used a simulation of a challenging problem in statistics as the focus of an exploration of situations where members of a problem-solving group are physically separated then reconnected via combinations of…

  9. THE ROLE OF PROBLEM SOLVING IN COMPLEX INTRAVERBAL REPERTOIRES

    PubMed Central

    Sautter, Rachael A; LeBlanc, Linda A; Jay, Allison A; Goldsmith, Tina R; Carr, James E

    2011-01-01

    We examined whether typically developing preschoolers could learn to use a problem-solving strategy that involved self-prompting with intraverbal chains to provide multiple responses to intraverbal categorization questions. Teaching the children to use the problem-solving strategy did not produce significant increases in target responses until problem solving was modeled and prompted. Following the model and prompts, all participants showed immediate significant increases in intraverbal categorization, and all prompts were quickly eliminated. Use of audible self-prompts was evident initially for all participants, but declined over time for 3 of the 4 children. Within-session response patterns remained consistent with use of the problem-solving strategy even when self-prompts were not audible. These findings suggest that teaching and prompting a problem-solving strategy can be an effective way to produce intraverbal categorization responses. PMID:21709781

  10. The Fallacy of Univariate Solutions to Complex Systems Problems

    PubMed Central

    Lessov-Schlaggar, Christina N.; Rubin, Joshua B.; Schlaggar, Bradley L.

    2016-01-01

    Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems—univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument. PMID:27375425

  11. Analyzing Complex Treatment Effects in Nonrandomized Observational Studies: The Case of Retention of Students in Grade.

    PubMed

    West, Stephen G

    2016-01-01

    Should low-achieving students be promoted to the next grade or be retained (held back) in the prior grade? This special section presents a discussion of the application of marginal structural models to the challenging problem of estimating the effect of promotion versus retention in grade on math scores in elementary school. Vandecandelaere, De Fraine, Van Damme, and Vansteelandt provide a didactic presentation of the marginal structural modeling approach, noting retention is a time-varying treatment because promoted low-achieving students may be retained in a subsequent grade. Steiner, Park, and Kim's commentary presents a detailed analysis of the treatment effects being estimated in same-age versus same-grade comparisons from the perspective of the potential outcomes model. Reshetnyak, Cham, and Kim's commentary clarifies the conditions under which same-age versus same-grade comparisons might be preferred; they also identify methods of further improving the estimation of retention effects. In their rejoinder, Vandecandelaere and Vansteelandt discuss tradeoffs in comparing the promoted and retained groups and highlight sensitivity analysis as a method of probing the robustness of treatment effect estimates. Our hope is that this combined didactic presentation and critical evaluation will encourage researchers to add marginal structural models to their methodological toolkits.

  12. Solar optical codes evaluation for modeling and analyzing complex solar receiver geometries

    NASA Astrophysics Data System (ADS)

    Yellowhair, Julius; Ortega, Jesus D.; Christian, Joshua M.; Ho, Clifford K.

    2014-09-01

    Solar optical modeling tools are valuable for modeling and predicting the performance of solar technology systems. Four optical modeling tools were evaluated using the National Solar Thermal Test Facility heliostat field combined with a flat-plate receiver geometry as a benchmark. The four optical modeling tools evaluated were DELSOL, HELIOS, SolTrace, and Tonatiuh. All are available for free from their respective developers. DELSOL and HELIOS both use a convolution of the sunshape and optical errors for rapid calculation of the incident irradiance profiles on the receiver surfaces. SolTrace and Tonatiuh use ray-tracing methods to intersect the reflected solar rays with the receiver surfaces and construct irradiance profiles. We found the ray-tracing tools, although slower in computation speed, to be more flexible for modeling complex receiver geometries, whereas DELSOL and HELIOS were limited to standard receiver geometries such as flat-plate, cylindrical, and cavity receivers. We also list the strengths and deficiencies of the tools to show tool preference depending on the modeling and design needs. We provide an example of using SolTrace for modeling nonconventional receiver geometries. The goal is to transfer the irradiance profiles on the receiver surfaces calculated in an optical code to a computational fluid dynamics (CFD) code such as ANSYS Fluent. This approach eliminates the need for using discrete ordinates or discrete transfer radiation models, which are computationally intensive, within the CFD code. The irradiance profiles on the receiver surfaces then allow for thermal and fluid analysis of the receiver.

  13. A Complex Network Model for Analyzing Railway Accidents Based on the Maximal Information Coefficient

    NASA Astrophysics Data System (ADS)

    Shao, Fu-Bo; Li, Ke-Ping

    2016-10-01

    It is an important issue to identify important influencing factors in railway accident analysis. In this paper, employing the maximal information coefficient (MIC), a good measure of dependence for two-variable relationships that can capture a wide range of associations, a complex network model for railway accident analysis is designed in which nodes denote factors of railway accidents and edges are generated between two factors whose MIC values are larger than or equal to a dependent criterion. The variation of the network structure is studied: as the dependent criterion increases, the network approaches an approximately scale-free structure. Moreover, employing the proposed network, important influencing factors are identified. We find that the annual track density-gross tonnage factor is an important factor, being a cut vertex when the dependent criterion equals 0.3. From the network, it is also found that railway development is unbalanced across different states, which is consistent with the facts. Supported by the Fundamental Research Funds for the Central Universities under Grant No. 2016YJS087, the National Natural Science Foundation of China under Grant No. U1434209, and the Research Foundation of State Key Laboratory of Railway Traffic Control and Safety, Beijing Jiaotong University under Grant No. RCS2016ZJ001
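
    The edge-generation rule is easy to prototype. In the sketch below the factor series are synthetic, the names are only reminiscent of the paper's factors, and a stand-in dependence measure (absolute Pearson correlation) substitutes for a real MIC estimator such as the one in the third-party minepy package:

```python
# Sketch of the network construction described above: factors become nodes,
# and an edge is added when the dependence between two factors meets the
# criterion. abs(Pearson r) stands in for MIC to keep the sketch
# self-contained; a real MIC estimator would replace it.
import itertools
import numpy as np
import networkx as nx

def dependence(x, y):
    # Stand-in for MIC: absolute Pearson correlation.
    return abs(np.corrcoef(x, y)[0, 1])

rng = np.random.default_rng(1)
# Hypothetical factor time series (names illustrative, not from the paper).
factors = {
    "track_density": rng.standard_normal(200),
    "gross_tonnage": rng.standard_normal(200),
    "accident_rate": rng.standard_normal(200),
}
factors["accident_rate"] += 0.8 * factors["track_density"]  # induce dependence

criterion = 0.3
G = nx.Graph()
G.add_nodes_from(factors)
for a, b in itertools.combinations(factors, 2):
    if dependence(factors[a], factors[b]) >= criterion:
        G.add_edge(a, b)

print(G.edges())
print(list(nx.articulation_points(G)))  # cut vertices, as used in the paper
```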

  14. Nonprotein Based Enrichment Method to Analyze Peptide Cross-Linking in Protein Complexes

    PubMed Central

    Yan, Funing; Che, Fa-Yun; Rykunov, Dmitry; Nieves, Edward; Fiser, Andras; Weiss, Louis M.; Angeletti, Ruth Hogue

    2009-01-01

    Cross-linking analysis of protein complexes and structures by tandem mass spectrometry (MS/MS) has advantages in speed, sensitivity, specificity, and the capability of handling complicated protein assemblies. However, detection and accurate assignment of the cross-linked peptides are often challenging due to their low abundance and complicated fragmentation behavior in collision-induced dissociation (CID). To simplify the MS analysis and improve the signal-to-noise ratio of the cross-linked peptides, we developed a novel peptide enrichment strategy that utilizes a cross-linker with a cryptic thiol group together with beads modified with a photocleavable cross-linker. The functional cross-linkers were designed to react with the primary amino groups in proteins. Human serum albumin was used as a model protein to detect intra- and intermolecular cross-linkages. Use of this protein-free selective retrieval method eliminates the contamination that can result from avidin–biotin based retrieval systems and simplifies data analysis. These features may make the method suitable for investigating protein–protein interactions in biological samples. PMID:19642656

  15. Asbestos quantification in track ballast, a complex analytical problem

    NASA Astrophysics Data System (ADS)

    Cavallo, Alessandro

    2016-04-01

    Track ballast forms the trackbed upon which railroad ties are laid. It is used to bear the load of the railroad ties, to facilitate water drainage, and to keep down vegetation. It is typically made of angular crushed stone, with a grain size between 30 and 60 mm, with good mechanical properties (high compressive strength, freeze-thaw resistance, resistance to fragmentation). The most common rock types are basalts, porphyries, orthogneisses, some carbonatic rocks and "green stones" (serpentinites, prasinites, amphibolites, metagabbros). "Green stones" in particular may contain traces, and sometimes appreciable amounts, of asbestiform minerals (chrysotile and/or fibrous amphiboles, generally tremolite - actinolite). In Italy, the chrysotile asbestos mine in Balangero (Turin) produced over 5 Mt of railroad ballast (crushed serpentinites), which was used for the railways in northern and central Italy from 1930 up to 1990. In addition to Balangero, several other serpentinite and prasinite quarries (e.g. in Emilia Romagna) provided railway ballast up to the year 2000. The legal threshold for asbestos content in track ballast is set at 1000 ppm: if the value is below this threshold, the material can be reused; otherwise it must be disposed of as hazardous waste, at very high cost. The quantitative determination of asbestos in rocks is a very complex analytical issue: although techniques like TEM-SAED and micro-Raman are very effective in the identification of asbestos minerals, a quantitative determination on bulk materials is almost impossible, or very expensive and time-consuming. Another problem is the discrimination of asbestiform minerals (e.g. chrysotile, asbestiform amphiboles) from the common acicular, pseudo-fibrous varieties (lamellar serpentine minerals, prismatic/acicular amphiboles). In this work, more than 200 samples from the main Italian rail yards were characterized by a combined use of XRD and a special SEM

  16. Analyzing complex FTMS simulations: a case study in high-level visualization of ion motions.

    PubMed

    Burakiewicz, Wojciech; van Liere, Robert

    2006-01-01

    Current practice in particle visualization renders particle position data directly onto the screen as points or glyphs. Using a camera placed at a fixed position, particle motions can be visualized by rendering trajectories or by animations. Applying such direct techniques to large, time-dependent particle data sets often results in cluttered images in which the dynamic properties of the underlying system are difficult to interpret. In this case study we take an alternative approach to the visualization of ion motions. Instead of rendering ion position data directly, we first extract meaningful motion information from the ion position data and then map this information onto geometric primitives. Our goal is to produce high-level visualizations that reflect the physicists' way of thinking about ion dynamics. Parameterized geometric icons are defined to encode motion information of clusters of related ions. In addition, a parameterized camera control mechanism is used to analyze relative instead of only absolute ion motions. We apply the techniques to simulations of Fourier transform mass spectrometry (FTMS) experiments. The data produced by such simulations can amount to 5×10^4 ions and 10^5 timesteps. This paper discusses the requirements, design and informal evaluation of the implemented system.

  17. Generalized Householder transformations for the complex symmetric eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Noble, J. H.; Lubasch, M.; Jentschura, U. D.

    2013-08-01

    We present an intuitive and scalable algorithm for the diagonalization of complex symmetric matrices, which arise from the projection of pseudo-Hermitian and complex scaled Hamiltonians onto a suitable basis set of "trial" states. The algorithm diagonalizes complex and symmetric (non-Hermitian) matrices and is easily implemented in modern computer languages. It is based on generalized Householder transformations and relies on iterative similarity transformations T → T' = Q^T T Q, where Q is a complex and orthogonal, but not unitary, matrix, i.e. Q^T = Q^{-1} but Q^† ≠ Q^{-1}. We present numerical reference data to support the scalability of the algorithm. We construct the generalized Householder transformations from the notion that the conserved scalar product of eigenstates Ψ_n and Ψ_m of a pseudo-Hermitian quantum mechanical Hamiltonian can be reformulated in terms of the generalized indefinite inner product ∫ dx Ψ_n(x, t) Ψ_m(x, t), where the integrand is locally defined and complex conjugation is avoided. A few example calculations are described which illustrate the physical origin of the ideas used in the construction of the algorithm.
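
    The construction lends itself to a compact numerical sketch. The code below is a minimal illustration assuming the generalized-Householder recipe described in the abstract: it reduces a complex symmetric matrix to tridiagonal form (the eigenvalue extraction step is omitted) and is not the authors' implementation:

```python
# Sketch of a generalized (complex-orthogonal) Householder reduction of a
# complex symmetric matrix to tridiagonal form, using the bilinear form
# x^T x (no conjugation). Illustrative; the quasi-null breakdown case
# v^T v ~ 0 is only flagged, not handled.
import numpy as np

def tridiagonalize_complex_symmetric(A):
    A = np.array(A, dtype=complex)
    n = A.shape[0]
    for k in range(n - 2):
        x = A[k+1:, k].copy()
        alpha = np.sqrt(np.sum(x * x))          # complex sqrt of x^T x
        if alpha != 0 and abs(x[0] - alpha) > abs(x[0] + alpha):
            alpha = -alpha                      # sign choice for stability
        v = x.copy()
        v[0] += alpha
        vtv = np.sum(v * v)                     # bilinear, not Hermitian, norm
        if abs(vtv) < 1e-14:
            raise ValueError("breakdown: quasi-null vector, v^T v ~ 0")
        H = np.eye(n, dtype=complex)
        H[k+1:, k+1:] -= 2.0 * np.outer(v, v) / vtv
        A = H @ A @ H          # H is symmetric and involutory: H^T = H^{-1} = H
    return A

# Random complex symmetric (not Hermitian) test matrix:
rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
S = B + B.T
T = tridiagonalize_complex_symmetric(S)
print(np.round(T, 3))                           # tridiagonal up to rounding
```

    All inner products here are bilinear, so each H is complex orthogonal (H^T H = I) rather than unitary, exactly the property the abstract ascribes to Q.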

  18. Problem analysis of geotechnical well drilling in complex environment

    NASA Astrophysics Data System (ADS)

    Kasenov, A. K.; Biletskiy, M. T.; Ratov, B. T.; Korotchenko, T. V.

    2015-02-01

    The article examines the primary causes of problems occurring during the drilling of geotechnical wells (injection, production and monitoring wells) for in-situ leaching to extract uranium in South Kazakhstan. Drilling problems such as hole caving, which is caused by various chemical and physical factors (hydraulic, mechanical, etc.), have been thoroughly investigated. The analysis of packing causes has revealed that this problem usually occurs because of an insufficient amount of drilling mud, associated with a small cross-section of the downward flow and a relatively large cross-section of the upward flow. This is explained by the fact that when spear bores are used to drill clay rocks, the cuttings are usually rather large and there is a risk that clay particles will coagulate.

  19. Complexity of the Generalized Mover’s Problem.

    DTIC Science & Technology

    1985-01-01


  20. Making Sense of Complex Problems: A Resource for Teams

    DTIC Science & Technology

    2015-12-01

    of key challenges that military design teams encounter, and describes lessons, strategies, and approaches used by military leaders to optimize the...operational settings. ...address these unfamiliar problems. The resource offers practical tips, strategies, and examples designed to support planning teams and their leaders

  1. Exploiting Explicit and Implicit Structure in Complex Optimization Problems

    DTIC Science & Technology

    2014-09-24

    for minimizing implicitly defined functions resulting from applying decomposition, relaxation and/or dualization techniques to complex real-world... Decomposition techniques are often the best choice for solving large... The decomposition type of approach involves minimizing a nonsmooth objective function with special

  2. Complex emotions, complex problems: understanding the experiences of perinatal depression among new mothers in urban Indonesia.

    PubMed

    Andajani-Sutjahjo, Sari; Manderson, Lenore; Astbury, Jill

    2007-03-01

    In this article, we explore how Javanese women identify and speak of symptoms of depression in late pregnancy and early postpartum and describe their subjective accounts of mood disorders. The study, conducted in the East Java region of Indonesia in 2000, involved in-depth interviews with a subgroup of women (N = 41) who scored above the cutoff score of 12/13 on the Edinburgh Postnatal Depression Scale (EPDS) during pregnancy, at six weeks postpartum, or on both occasions. This sample was taken from a larger cohort study (N cohort = 488) researching the sociocultural factors that contribute to women's emotional well-being in early motherhood. The women used a variety of Indonesian and Javanese terms to explain their emotional states during pregnancy and in early postpartum, some of which coincided with the feelings described on the EPDS and others of which did not. Women attributed their mood variations to multiple causes including: premarital pregnancy, chronic illness in the family, marital problems, lack of support from partners or family networks, their husband's unemployment, and insufficient family income due to giving up their own paid work. We argue for the importance of understanding the context of childbearing in order to interpret the meaning of depression within complex social, cultural, and economic contexts.

  3. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems

    PubMed Central

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-01-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  4. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems.

    PubMed

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-03-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article.

  5. On the Complexity of the Asymmetric VPN Problem

    NASA Astrophysics Data System (ADS)

    Rothvoß, Thomas; Sanità, Laura

    We give the first constant-factor approximation algorithm for the asymmetric Virtual Private Network (VPN) problem with arbitrary concave costs. We even show the stronger result that there is always a tree solution of cost at most 2·OPT and that a tree solution of (expected) cost at most 49.84·OPT can be determined in polynomial time.

  6. Navigating complex decision spaces: Problems and paradigms in sequential choice

    PubMed Central

    Walsh, Matthew M.; Anderson, John R.

    2015-01-01

    To behave adaptively, we must learn from the consequences of our actions. Doing so is difficult when the consequences of an action follow a delay. This introduces the problem of temporal credit assignment. When feedback follows a sequence of decisions, how should the individual assign credit to the intermediate actions that comprise the sequence? Research in reinforcement learning provides two general solutions to this problem: model-free reinforcement learning and model-based reinforcement learning. In this review, we examine connections between stimulus-response and cognitive learning theories, habitual and goal-directed control, and model-free and model-based reinforcement learning. We then consider a range of problems related to temporal credit assignment. These include second-order conditioning and secondary reinforcers, latent learning and detour behavior, partially observable Markov decision processes, actions with distributed outcomes, and hierarchical learning. We ask whether humans and animals, when faced with these problems, behave in a manner consistent with reinforcement learning techniques. Throughout, we seek to identify neural substrates of model-free and model-based reinforcement learning. The former class of techniques is understood in terms of the neurotransmitter dopamine and its effects in the basal ganglia. The latter is understood in terms of a distributed network of regions including the prefrontal cortex, medial temporal lobes, cerebellum, and basal ganglia. Not only do reinforcement learning techniques have a natural interpretation in terms of human and animal behavior, but they also provide a useful framework for understanding neural reward valuation and action selection. PMID:23834192
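
    As a concrete illustration of the model-free family discussed above, the following toy tabular Q-learning sketch (an invented task, not from the review) shows how a delayed reward is propagated backward to earlier actions by repeated temporal-difference updates:

```python
# Model-free (Q-learning) sketch of temporal credit assignment on a toy
# chain task: reward arrives only at the final state, yet repeated TD
# updates propagate credit back to earlier actions. Illustrative only.
import numpy as np

n_states, n_actions = 5, 2          # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.3
rng = np.random.default_rng(3)

for episode in range(500):
    s = 0
    while s != n_states - 1:
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(Q[s]))
        s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == n_states - 1 else 0.0   # delayed reward
        # TD update: credit flows backward through the bootstrapped target.
        Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
        s = s_next

print(np.round(Q, 2))   # 'right' dominates in every non-terminal state
```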

  7. Problem-oriented stereo vision quality evaluation complex

    NASA Astrophysics Data System (ADS)

    Sidorchuk, D.; Gusamutdinova, N.; Konovalenko, I.; Ershov, E.

    2015-12-01

    We describe an original low-cost hardware setup for efficient testing of stereo vision algorithms. The method uses a combination of a special hardware setup and a mathematical model, is easy to construct, and is precise in the applications of interest. For a known scene we derive its analytical representation, called the virtual scene. Using a four-point correspondence between the scene and the virtual one, we compute the extrinsic camera parameters and project the virtual scene onto the image plane, which provides the ground truth for the depth map. Another result presented in this paper is a new depth-map quality metric. Its main purpose is to tune stereo algorithms for a particular problem, e.g. obstacle avoidance.

  8. Nonlinear problems of complex natural systems: Sun and climate dynamics.

    PubMed

    Bershadskii, A

    2013-01-13

    The universal role of the nonlinear one-third subharmonic resonance mechanism in generation of strong fluctuations in complex natural dynamical systems related to global climate is discussed using wavelet regression detrended data. The role of the oceanic Rossby waves in the year-scale global temperature fluctuations and the nonlinear resonance contribution to the El Niño phenomenon have been discussed in detail. The large fluctuations in the reconstructed temperature on millennial time scales (Antarctic ice core data for the past 400,000 years) are also shown to be dominated by the one-third subharmonic resonance, presumably related to the Earth's precession effect on the energy that the intertropical regions receive from the Sun. The effects of galactic turbulence on the temperature fluctuations are also discussed.

  9. Towards Cost-Effective Operational Monitoring Systems for Complex Waters: Analyzing Small-Scale Coastal Processes with Optical Transmissometry

    PubMed Central

    Gonçalves-Araujo, Rafael; Wiegmann, Sonja; Torrecilla, Elena; Bardaji, Raul; Röttgers, Rüdiger; Bracher, Astrid; Piera, Jaume

    2017-01-01

    The detection and prediction of changes in coastal ecosystems require a better understanding of the complex physical, chemical and biological interactions involved, which means that observations should be performed continuously. For this reason, there is an increasing demand for small, simple and cost-effective in situ sensors to analyze complex coastal waters at a broad range of scales. In this context, this study explores the potential of beam attenuation spectra, c(λ), measured in situ with an advanced-technology optical transmissometer, for assessing temporal and spatial patterns in the complex estuarine waters of Alfacs Bay (NW Mediterranean) as a test site. In particular, the information contained in the spectral beam attenuation coefficient was assessed and linked with different biogeochemical variables. The attenuation at λ = 710 nm was used as a proxy for particle concentration, TSM, whereas a novel parameter was adopted as an optical indicator for chlorophyll a (Chl-a) concentration, based on the local maximum of c(λ) observed at the long-wavelength side of the red-band Chl-a absorption peak. In addition, since coloured dissolved organic matter (CDOM) has an important influence on the beam attenuation spectral shape, and complementary measurements of particle size distribution were available, the beam attenuation spectral slope was used to analyze the CDOM content. Results were successfully compared with optical and biogeochemical variables from laboratory analysis of collocated water samples, and statistically significant correlations were found between the attenuation proxies and the biogeochemical variables TSM, Chl-a and CDOM. This outcome demonstrates the potential of high-frequency beam attenuation measurements as a simple, continuous and cost-effective approach for rapid detection of changes and patterns in biogeochemical properties in complex coastal environments. PMID:28107539
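
    As a generic illustration of a spectral-slope computation (the paper's exact slope definition may differ; CDOM absorption is often modeled with an exponential rather than a power-law slope), a log-log regression recovers the exponent of a synthetic attenuation spectrum:

```python
# Illustrative power-law slope fit for an attenuation spectrum c(lambda),
# on synthetic data; shows only the generic log-log regression idea.
import numpy as np

wavelengths = np.linspace(400, 700, 31)           # nm
gamma_true = 0.9
c = 2.0 * (wavelengths / 532.0) ** (-gamma_true)  # synthetic c(lambda)
c += np.random.default_rng(5).normal(0, 0.01, c.shape)

slope, intercept = np.polyfit(np.log(wavelengths), np.log(c), 1)
print(-slope)   # recovered spectral slope, ~0.9
```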

  10. The solution of the optimization problem of small energy complexes using linear programming methods

    NASA Astrophysics Data System (ADS)

    Ivanin, O. A.; Director, L. B.

    2016-11-01

    Linear programming methods were used to solve the optimization problem of the schemes and operation modes of distributed-generation energy complexes. Applicability conditions of the simplex method, as applied to energy complexes that include renewable energy installations (solar, wind), diesel generators, and energy storage, are considered. An analysis of decomposition algorithms for various schemes of energy complexes was made. The results of optimization calculations for energy complexes operated autonomously and as part of a distribution grid are presented.
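
    For illustration, a toy dispatch problem of the kind such an energy complex poses can be written as a linear program. The numbers and variable layout below are invented, the sketch assumes SciPy's linprog, and it is far simpler than the schemes analyzed in the paper:

```python
# Toy linear program in the spirit described above: choose hourly diesel
# output and solar use to meet demand at minimum fuel cost, with solar
# capped by availability. All numbers are hypothetical.
import numpy as np
from scipy.optimize import linprog

demand = np.array([3.0, 5.0, 4.0])        # kW, three periods
solar_avail = np.array([1.0, 4.0, 2.0])   # kW available from PV
c_diesel, c_solar = 0.30, 0.0             # cost per kWh

# Variables: [diesel_1..3, solar_1..3]; minimize total cost.
c = np.concatenate([np.full(3, c_diesel), np.full(3, c_solar)])

# Equality constraints: diesel_t + solar_t = demand_t.
A_eq = np.hstack([np.eye(3), np.eye(3)])
b_eq = demand

bounds = [(0, None)] * 3 + [(0, s) for s in solar_avail]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x.round(2), res.fun.round(3))   # solar used first, diesel fills gaps
```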

  11. Nuclear processing - a simple cost equation or a complex problem?

    SciTech Connect

    Banfield, Z.; Banford, A.W.; Hanson, B.C.; Scully, P.J.

    2007-07-01

    BNFL has extensive experience of nuclear processing plant from concept through to decommissioning, at all stages of the fuel cycle. Nexia Solutions (formerly BNFL's R and D Division) has always supported BNFL in the development of concept plant, including the development of costed plant designs for the purpose of economic evaluation and technology selection. Having undertaken such studies over a number of years, Nexia Solutions has developed a portfolio of costed plant designs for a broad range of nuclear processes, throughputs and technologies. This work has led to an extensive understanding of the cost of nuclear processing plant and of how it is affected by the scale of the process and the selection of design philosophy. The relationship has been seen to be non-linear, so simplistic equations do not apply; it is complex due to the variety of contributory factors. This is particularly evident when considering the scale of a process: for example, how step changes in design occur with increasing scale, and how the applicability of technology options can vary with scale. This paper will explore the contribution of scale to nuclear processing plant costs. (authors)

  12. Beyond pure parasystole: promises and problems in modeling complex arrhythmias.

    PubMed

    Courtemanche, M; Glass, L; Rosengarten, M D; Goldberger, A L

    1989-08-01

    The dynamics of pure parasystole, a cardiac arrhythmia in which two competing pacemakers fire independently, have recently been fully characterized. This model is now extended in an attempt to account for the more complex dynamics occurring with modulated parasystole, in which there exists nonlinear interaction between the sinus node and the ectopic ventricular focus. Theoretical analysis of modulated parasystole reveals three types of dynamics: entrainment, quasiperiodicity, and chaos. Rhythms associated with quasiperiodicity obey a set of rules derived from pure parasystole. This model is applied to the interpretation of continuous electrocardiographic data sets from three patients with complicated patterns of ventricular ectopic activity. We describe several new statistical properties of these records, related to the number of intervening sinus beats between ectopic events, that are essential in characterizing the dynamics and testing mathematical models. Detailed comparison between data and theory in these cases show substantial areas of agreement as well as potentially important discrepancies. These findings have implications for understanding the dynamics of the heartbeat in normal and pathological conditions.
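
    The pure-parasystole model that this work extends can be conveyed with a toy simulation. Parameters are invented, and the nonlinear phase resetting that defines modulated parasystole is deliberately omitted:

```python
# Toy simulation of *pure* parasystole: sinus and ectopic pacemakers fire
# strictly independently; an ectopic beat is expressed only outside the
# refractory period that follows each sinus beat. The classic signature is
# that the number of intervening sinus beats (NIB) between expressed
# ectopic beats takes only a few distinct values. Parameters illustrative.
import numpy as np

T_sinus, T_ectopic, theta = 1.0, 1.55, 0.4   # periods and refractory time
n_sinus = 2000
ectopic_times = np.arange(0.3, n_sinus * T_sinus, T_ectopic)

expressed = []
for t in ectopic_times:
    last_sinus = np.floor(t / T_sinus) * T_sinus
    if t - last_sinus >= theta:              # outside refractory window
        expressed.append(t)

# NIB: sinus beats falling between consecutive expressed ectopic beats.
nib = [int(np.floor(b / T_sinus)) - int(np.floor(a / T_sinus))
       for a, b in zip(expressed, expressed[1:])]
print(sorted(set(nib)))   # typically only a small set of values
```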

  13. Dusty (complex) plasmas: recent developments, advances, and unsolved problems

    NASA Astrophysics Data System (ADS)

    Popel, Sergey

    The area of dusty (complex) plasma research is a vibrant subfield of plasma physics that belongs to frontier research in the physical sciences. This area is intrinsically interdisciplinary and encompasses astrophysics, planetary science, atmospheric science, magnetic fusion energy science, and various applied technologies. Research in dusty plasmas started after two major discoveries in very different areas: (1) the discovery by the Voyager 2 spacecraft in 1980 of the radial spokes in Saturn's B ring, and (2) the discovery in the early 1980s of the growth of contaminating dust particles in plasma processing. Dusty plasmas are ubiquitous in the universe; examples are proto-planetary and solar nebulae, molecular clouds, supernova explosions, the interplanetary medium, circumsolar rings, and asteroids. Within the solar system, we have planetary rings (e.g., Saturn and Jupiter), the Martian atmosphere, cometary tails and comae, dust clouds on the Moon, etc. Close to the Earth, there are noctilucent clouds and polar mesospheric summer echoes, which are clouds of tiny (charged) ice particles that form in the summer polar mesosphere at altitudes of about 82-95 km. Dust and dusty plasmas are also found in the vicinity of artificial satellites and space stations. Dust also turns out to be common in laboratory plasmas, such as in the processing of semiconductors and in tokamaks. In processing plasmas, dust particles are actually grown in the discharge from the reactive gases used to form the plasmas. An example of the relevance of industrial dusty plasmas is the growth of silicon microcrystals for improved solar cells in the future. In fact, nanostructured polymorphous silicon films provide solar cells with high and time-stable efficiency. These nano-materials can also be used for the fabrication of ultra-large-scale integration circuits, display devices, single-electron devices, light emitting diodes, laser diodes, and others. In microelectronic industries, dust has to be

  14. Sleep, Cognition, and Behavioral Problems in School-Age Children: A Century of Research Meta-Analyzed

    ERIC Educational Resources Information Center

    Astill, Rebecca G.; Van der Heijden, Kristiaan B.; Van IJzendoorn, Marinus H.; Van Someren, Eus J. W.

    2012-01-01

    Clear associations of sleep, cognitive performance, and behavioral problems have been demonstrated in meta-analyses of studies in adults. This meta-analysis is the first to systematically summarize all relevant studies reporting on sleep, cognition, and behavioral problems in healthy school-age children (5-12 years old) and incorporates 86 studies…

  15. Students' Problem-Solving in a Complex Technology-Based Learning Environment.

    ERIC Educational Resources Information Center

    Suomala, Jyrki; Alamaki, Ari; Alajaaski, Jarkko

    The goals of this study were to investigate problem-solving in a context that requires a rich interaction among social, motivational, and cognitive processes and to compare the effects of the mediated and discovery models of learning on students' problem-solving processes in the complex technology-based learning environment. Subjects were 88…

  16. Percentages: The Effect of Problem Structure, Number Complexity and Calculation Format

    ERIC Educational Resources Information Center

    Baratta, Wendy; Price, Beth; Stacey, Kaye; Steinle, Vicki; Gvozdenko, Eugene

    2010-01-01

    This study reports how the difficulty of simple worded percentage problems is affected by the problem structure and the complexity of the numbers involved. We also investigate which methods students know. Results from 677 Year 8 and 9 students are reported. Overall the results indicate that more attention needs to be given to this important topic.…

  17. A Real-Life Case Study of Audit Interactions--Resolving Messy, Complex Problems

    ERIC Educational Resources Information Center

    Beattie, Vivien; Fearnley, Stella; Hines, Tony

    2012-01-01

    Real-life accounting and auditing problems are often complex and messy, requiring the synthesis of technical knowledge in addition to the application of generic skills. To help students acquire the necessary skills to deal with these problems effectively, educators have called for the use of case-based methods. Cases based on real situations (such…

  18. Individual Differences in Students' Complex Problem Solving Skills: How They Evolve and What They Imply

    ERIC Educational Resources Information Center

    Wüstenberg, Sascha; Greiff, Samuel; Vainikainen, Mari-Pauliina; Murphy, Kevin

    2016-01-01

    Changes in the demands posed by increasingly complex workplaces in the 21st century have raised the importance of nonroutine skills such as complex problem solving (CPS). However, little is known about the antecedents and outcomes of CPS, especially with regard to malleable external factors such as classroom climate. To investigate the relations…

  19. The Ethnology of Traditional and Complex Societies. Test Edition. AAAS Study Guides on Contemporary Problems.

    ERIC Educational Resources Information Center

    Simic, Andrei

    This is one of several study guides on contemporary problems produced by the American Association for the Advancement of Science with support of the National Science Foundation. This guide focuses on the ethnology of traditional and complex societies. Part I, Simple and Complex Societies, includes three sections: (1) Introduction: Anthropologists…

  20. Quantum computational complexity of the N-representability problem: QMA complete.

    PubMed

    Liu, Yi-Kai; Christandl, Matthias; Verstraete, F

    2007-03-16

    We study the computational complexity of the N-representability problem in quantum chemistry. We show that this problem is quantum Merlin-Arthur complete, which is the quantum generalization of nondeterministic polynomial time complete. Our proof uses a simple mapping from spin systems to fermionic systems, as well as a convex optimization technique that reduces the problem of finding ground states to N representability.

  1. One Problem, Many Solutions: Simple Statistical Approaches Help Unravel the Complexity of the Immune System in an Ecological Context

    PubMed Central

    Matson, Kevin D.; Tieleman, B. Irene

    2011-01-01

    The immune system is a complex collection of interrelated and overlapping solutions to the problem of disease. To deal with this complexity, researchers have devised multiple ways to measure immune function and to analyze the resulting data. In this way both organisms and researchers employ many tactics to solve a complex problem. One challenge facing ecological immunologists is the question of how these many dimensions of immune function can be synthesized to facilitate meaningful interpretations and conclusions. We tackle this challenge by employing and comparing several statistical methods, which we used to test assumptions about how multiple aspects of immune function are related at different organizational levels. We analyzed three distinct datasets that characterized 1) species, 2) subspecies, and 3) among- and within-individual level differences in the relationships among multiple immune indices. Specifically, we used common principal components analysis (CPCA) and two simpler approaches, pair-wise correlations and correlation circles. We also provide a simple example of how these techniques could be used to analyze data from multiple studies. Our findings lead to several general conclusions. First, relationships among indices of immune function may be consistent among some organizational groups (e.g. months over the annual cycle) but not others (e.g. species); therefore any assumption of consistency requires testing before further analyses. Second, simple statistical techniques used in conjunction with more complex multivariate methods give a clearer and more robust picture of immune function than using complex statistics alone. Moreover, these simpler approaches have potential for analyzing comparable data from multiple studies, especially as the field of ecological immunology moves towards greater methodological standardization. PMID:21526186
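
    The simpler end of this toolkit is easy to make concrete. The Python sketch below, in which the immune indices and grouping variable are invented placeholders, computes pair-wise correlations among indices both pooled and per group; comparing the per-group matrices is the kind of consistency check the authors recommend before pooling data or applying CPCA.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        # Hypothetical immune indices measured on 40 individuals in 2 groups.
        df = pd.DataFrame({
            "haptoglobin": rng.normal(1.0, 0.2, 40),
            "lysis_titre": rng.normal(5.0, 1.0, 40),
            "agglutination_titre": rng.normal(6.0, 1.5, 40),
            "group": rng.integers(0, 2, 40),
        })
        indices = ["haptoglobin", "lysis_titre", "agglutination_titre"]
        # Pooled pair-wise correlation matrix among the indices.
        print(df[indices].corr().round(2))
        # Per-group matrices: marked differences would undermine the assumption
        # of a consistent correlation structure across organizational groups.
        print(df.groupby("group")[indices].corr().round(2))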

  2. Computer Technology and Complex Problem Solving: Issues in the Study of Complex Cognitive Activity.

    ERIC Educational Resources Information Center

    Goldman, Susan R.; Zech, Linda K.; Biswas, Gautam; Noser, Tom; Bateman, Helen; Bransford, John; Crews, Thaddeus; Moore, Allison; Nathan, Mitchell; Owens, Stephen

    1999-01-01

    Examines mathematics problem solving in a computer software environment using graphical representations of the results of simulations with adolescent students. Discusses the strengths and limitations of inferring goals and plans, the use of verbal protocols, and ways for computer-based learning environments to scaffold acquisition of domain…

  3. Eigenfunction Expansions for Coupled Nonlinear Convection-Diffusion Problems in Complex Physical Domains

    NASA Astrophysics Data System (ADS)

    Cotta, R. M.; Naveira-Cotta, C. P.; Knupp, D. C.; Zotin, J. L. Z.; Pontes, P. C.

    2016-09-01

    This lecture offers an updated review on the Generalized Integral Transform Technique (GITT), with focus on handling complex geometries, coupled problems, and nonlinear convection-diffusion, so as to illustrate some new application paradigms. Special emphasis is given to demonstrating novel developments, such as a single domain reformulation strategy that simplifies the treatment of complex geometries, an integral balance scheme in handling multiscale problems, the adoption of convective eigenvalue problems in dealing with strongly convective formulations, and the direct integral transformation of nonlinear convection-diffusion problems based on nonlinear eigenvalue problems. Representative application examples are then provided that employ recent extensions on the Generalized Integral Transform Technique (GITT), and a few numerical results are reported to illustrate the convergence characteristics of the proposed eigenfunction expansions.
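
    For orientation, the transform-inverse pair at the core of the GITT can be written generically as below; this is a schematic form assuming a self-adjoint auxiliary eigenvalue problem with eigenfunctions ψ_i, weight w, and norms N_i, and the extensions reviewed here mainly change how that eigenvalue problem is chosen.

        \bar{T}_i(t) = \int_V w(\mathbf{x})\,\psi_i(\mathbf{x})\,T(\mathbf{x},t)\,dV
        \quad\text{(transform)},
        \qquad
        T(\mathbf{x},t) = \sum_{i=1}^{\infty}\frac{\psi_i(\mathbf{x})}{N_i}\,\bar{T}_i(t)
        \quad\text{(inversion)},
        \qquad
        N_i = \int_V w(\mathbf{x})\,\psi_i^{2}(\mathbf{x})\,dV.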

  4. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use

    PubMed Central

    Ratliff, Eric A.; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K.K.; McCurdy, Sheryl A.

    2016-01-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors’ ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian socio-political environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system. PMID:26790689

  5. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    PubMed

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system.

  6. Conceptual and procedural knowledge community college students use when solving a complex science problem

    NASA Astrophysics Data System (ADS)

    Steen-Eibensteiner, Janice Lee

    2006-07-01

    A strong science knowledge base and problem solving skills have always been highly valued for employment in the science industry. Skills currently needed for employment include being able to problem solve (Overtoom, 2000). Academia also recognizes the need for effectively teaching students to apply problem solving skills in clinical settings. This thesis investigates how students solve complex science problems in an academic setting in order to inform the development of problem solving skills for the workplace. Students' use of problem solving skills in the form of learned concepts and procedural knowledge was studied as students completed a problem that might come up in real life. Students were taking a community college sophomore biology course, Human Anatomy & Physiology II. The problem topic was negative feedback inhibition of the thyroid and parathyroid glands. The research questions answered were (1) How well do community college students use a complex of conceptual knowledge when solving a complex science problem? (2) What conceptual knowledge are community college students using correctly, incorrectly, or not using when solving a complex science problem? (3) What problem solving procedural knowledge are community college students using successfully, unsuccessfully, or not using when solving a complex science problem? Across the whole class, the high academic level participants scored a mean of 72% correct on the chapter test questions, a low-average to fair grade of C-. The middle and low academic level participants both failed (F) the test questions (37% and 30%, respectively); 29% (9/31) of the students showed only a fair performance, while 71% (22/31) failed. In the subset sample of two students from each of the high, middle, and low academic levels selected from the whole class, 35% (8/23) of the concepts were used effectively, 22% (5/23) marginally, and 43% (10/23) poorly. Only one concept was used incorrectly by 3/6 of the students and identified as

  7. Thinking Problems of the Present Collision Warning Work by Analyzing the Intersection Between Cosmos 2251 and Iridium 33

    NASA Astrophysics Data System (ADS)

    Wang, R. L.; Liu, W.; Yan, R. D.; Gong, J. C.

    2013-08-01

    After the Cosmos 2251 and Iridium 33 collision breakup event, institutions at home and abroad undertook collision warning analyses of the event. This paper compares the results from the different research units, discusses problems in current collision warning work, and gives suggestions for further study.

  8. Analyzing Multiple Informant Data on Child and Adolescent Behavior Problems: Predictive Validity and Comparison of Aggregation Procedures

    ERIC Educational Resources Information Center

    van Dulmen, Manfred H. M.; Egeland, Byron

    2011-01-01

    We compared the predictive validity of five aggregation methods for multiple informant data on child and adolescent behavior problems. In addition, we compared the predictive validity of these aggregation methods with single informant scores. Data were derived from the Minnesota Longitudinal Study of Parents and Children (N = 175). Maternal and…

  9. Analyzing the Effects of a Mathematics Problem-Solving Program, Exemplars, on Mathematics Problem-Solving Scores with Deaf and Hard-of-Hearing Students

    ERIC Educational Resources Information Center

    Chilvers, Amanda Leigh

    2013-01-01

    Researchers have noted that mathematics achievement for deaf and hard-of-hearing (d/hh) students has been a concern for many years, including the ability to problem solve. This quasi-experimental study investigates the use of the Exemplars mathematics program with students in grades 2-8 in a school for the deaf that utilizes American Sign Language…

  10. Progress in the development of parabolized Navier-Stokes (PNS) methodology for analyzing propulsive jet mixing problems

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Wolf, D. E.; Sinha, N.; Lee, S. H.

    1986-01-01

    A brief review of 2D PNS methodology is first presented, describing the specialized features of supersonic shock-capturing and subsonic pressure-split models required for the analysis of aircraft, rocket and scramjet jet mixing problems. These features include techniques for dealing with various types of embedded and interfacing subsonic regions, the inclusion of finite-rate chemistry, and direct coupling with potential flow solutions. Preliminary 3D extensions of this PNS methodology geared to supersonic and subsonic rectangular free jet mixing problems are also reviewed. New 3D PNS work is then described, which includes the development of a hybrid supersonic/subsonic free jet mixing model and a supersonic model geared to the analysis of turbulent mixing and combustion processes occurring in scramjet combustor/nozzle flowfields.

  11. Performance of isotope ratio infrared spectroscopy (IRIS) for analyzing waters containing organic contaminants: Problems and solutions (Invited)

    NASA Astrophysics Data System (ADS)

    West, A. G.; Goldsmith, G. R.; Dawson, T. E.

    2010-12-01

    The development of isotope ratio infrared spectroscopy (IRIS) for simultaneous δ²H and δ¹⁸O analysis of liquid water samples shows much potential for affordable, simple and potentially portable isotopic analyses. IRIS has been shown to be comparable in precision and accuracy to isotope ratio mass spectrometry (IRMS) when analyzing pure water samples. However, recent studies have shown that organic contaminants in analyzed water samples may interfere with the spectroscopy, leading to errors of considerable magnitude in the reported stable isotope data. Many environmental, biological and forensic studies require analyses of water containing organic contaminants in some form, yet our current methods of removing organic contaminants prior to analysis appear inadequate for IRIS. Treated plant water extracts analyzed by IRIS showed deviations as large as 35‰ (δ²H) and 11.8‰ (δ¹⁸O) from the IRMS value, indicating that trace amounts of contaminants were sufficient to disrupt IRIS analyses. However, not all organic contaminants negatively influence IRIS. For such samples, IRIS presents a labour-saving method relative to IRMS. Prior to widespread use in the environmental, biological and forensic sciences, a means of obtaining reliable data from IRIS needs to be demonstrated. One approach is to use instrument-based software to flag potentially problematic spectra and output a corrected isotope value based on analysis of the spectra. We evaluate this approach on two IRIS systems and discuss the way forward for ensuring accurate stable isotope data using IRIS.

  12. Medicines counterfeiting is a complex problem: a review of key challenges across the supply chain.

    PubMed

    Tremblay, Michael

    2013-02-01

    The paper begins by asking why there is a market for counterfeit medicines, which in effect creates the problem of counterfeiting itself. Contributing factors include supply chain complexity and the lack of whole-systems thinking. These two underpin the author's view that counterfeiting is a complex (i.e. wicked) problem, and that corporate, public policy and regulatory actions need to be mindful of how their actions may be causal. The paper offers a problem-based review of key components of this complexity, viz., the knowledge end-users/consumers have of medicines; whether restrictive information policies may hamper information provision to patients; the internet's direct access to consumers; internet-enabled distribution of unsafe and counterfeit medicines; whether the internet is a parallel and competitive supply chain to legitimate routes; organised crime as an emerging medicines manufacturer and supplier; and whether substandard medicines are really the bigger problem. Solutions must respect the perceived complexity of the supply chain challenges. The paper identifies the need to avoid technologically-driven solutions, calling for 'technological agnosticism'. Both regulation and public policy need to reflect the dynamic nature of the problem and avoid creating perverse incentives; it may be, for instance, that medicines pricing and reimbursement policies, which affect consumer/patient access, act as market signals to counterfeiters, since this creates a cash market in cheaper drugs.

  13. A knowledge-based tool for multilevel decomposition of a complex design problem

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    1989-01-01

    Although much work has been done in applying artificial intelligence (AI) tools and techniques to problems in different engineering disciplines, only recently has the application of these tools begun to spread to the decomposition of complex design problems. A new tool based on AI techniques has been developed to implement a decomposition scheme suitable for multilevel optimization and display of data in an N x N matrix format.

  14. Classical and Quantum Complexity of the Sturm-Liouville Eigenvalue Problem

    DTIC Science & Technology

    2005-03-03

    A study of a nonlinear continuous problem was done in [20] for ordinary differential equations, with polynomial speedups over the classical settings. ... multivariate approximation, and ordinary differential equations. Tight bit query complexity bounds are known for a number of such problems; see [14, 15, 16]. ... Linear Algebra, SIAM, Philadelphia. [12] Gary, H. (1965), Computing Eigenvalues of Ordinary Differential Equations with Finite Differences, Mathematics

  15. On the Critical Behaviour, Crossover Point and Complexity of the Exact Cover Problem

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Smelyanskiy, Vadim N.; Shumow, Daniel; Koga, Dennis (Technical Monitor)

    2003-01-01

    Research into quantum algorithms for NP-complete problems has rekindled interest in the detailed study of a broad class of combinatorial problems. A recent paper applied the quantum adiabatic evolution algorithm to the Exact Cover problem for 3-sets (EC3) and provided empirical evidence that the algorithm was polynomial. In this paper we provide a detailed study of the characteristics of the exact cover problem. We present the annealing approximation applied to EC3, which gives an over-estimate of the phase transition point, and we also identify the phase transition point empirically. We also study the complexity of two classical algorithms on this problem: Davis-Putnam and Simulated Annealing. For these algorithms, EC3 is significantly easier than 3-SAT.
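
    To make the setting concrete: an EC3 instance over n bits is a collection of 3-bit clauses, each satisfied when exactly one of its three bits equals 1, and a Simulated Annealing solver minimizes the number of violated clauses. The Python sketch below is a minimal illustration; the toy instance, cooling schedule, and step count are arbitrary choices, not those of the paper.

        import math
        import random

        def ec3_energy(bits, clauses):
            """Number of clauses not having exactly one bit set."""
            return sum(1 for i, j, k in clauses if bits[i] + bits[j] + bits[k] != 1)

        def anneal(n, clauses, steps=20000, t0=2.0):
            bits = [random.randint(0, 1) for _ in range(n)]
            energy = ec3_energy(bits, clauses)
            for s in range(steps):
                t = t0 * (1.0 - s / steps) + 1e-3          # linear cooling schedule
                i = random.randrange(n)
                bits[i] ^= 1                               # propose a single-bit flip
                new_energy = ec3_energy(bits, clauses)
                if new_energy <= energy or random.random() < math.exp((energy - new_energy) / t):
                    energy = new_energy                    # accept the move
                else:
                    bits[i] ^= 1                           # reject: undo the flip
            return bits, energy

        clauses = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]        # toy instance over 5 bits
        bits, energy = anneal(5, clauses)
        print(bits, energy)                                # energy 0 means an exact cover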

  16. Ecosystem services and cooperative fisheries research to address a complex fishery problem

    EPA Science Inventory

    The St. Louis River represents a complex fishery management problem. Current fishery management goals have to be developed taking into account bi-state commercial, subsistence and recreational fisheries which are valued for different characteristics by a wide range of anglers, as...

  17. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    ERIC Educational Resources Information Center

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  18. The Development of Complex Problem Solving in Adolescence: A Latent Growth Curve Analysis

    ERIC Educational Resources Information Center

    Frischkorn, Gidon T.; Greiff, Samuel; Wüstenberg, Sascha

    2014-01-01

    Complex problem solving (CPS) as a cross-curricular competence has recently attracted more attention in educational psychology as indicated by its implementation in international educational large-scale assessments such as the Programme for International Student Assessment. However, research on the development of CPS is scarce, and the few…

  19. Computer-Based Assessment of Complex Problem Solving: Concept, Implementation, and Application

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Holt, Daniel V.; Goldhammer, Frank; Funke, Joachim

    2013-01-01

    Complex Problem Solving (CPS) skills are essential to successfully deal with environments that change dynamically and involve a large number of interconnected and partially unknown causal influences. The increasing importance of such skills in the 21st century requires appropriate assessment and intervention methods, which in turn rely on adequate…

  20. Differential Relations between Facets of Complex Problem Solving and Students' Immigration Background

    ERIC Educational Resources Information Center

    Sonnleitner, Philipp; Brunner, Martin; Keller, Ulrich; Martin, Romain

    2014-01-01

    Whereas the assessment of complex problem solving (CPS) has received increasing attention in the context of international large-scale assessments, its fairness in regard to students' cultural background has gone largely unexplored. On the basis of a student sample of 9th-graders (N = 299), including a representative number of immigrant students (N…

  1. Assessment of Complex Problem Solving: What We Know and What We Don't Know

    ERIC Educational Resources Information Center

    Herde, Christoph Nils; Wüstenberg, Sascha; Greiff, Samuel

    2016-01-01

    Complex Problem Solving (CPS) is seen as a cross-curricular 21st century skill that has attracted interest in large-scale-assessments. In the Programme for International Student Assessment (PISA) 2012, CPS was assessed all over the world to gain information on students' skills to acquire and apply knowledge while dealing with nontransparent…

  2. Calculating Probabilistic Distance to Solution in a Complex Problem Solving Domain

    ERIC Educational Resources Information Center

    Sudol, Leigh Ann; Rivers, Kelly; Harris, Thomas K.

    2012-01-01

    In complex problem solving domains, correct solutions are often comprised of a combination of individual components. Students usually go through several attempts, each attempt reflecting an individual solution state that can be observed during practice. Classic metrics to measure student performance over time rely on counting the number of…

  3. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  4. Eighth-Grade Students Defining Complex Problems: The Effectiveness of Scaffolding in a Multimedia Program

    ERIC Educational Resources Information Center

    Zydney, Janet Mannheimer

    2005-01-01

    This pilot study investigated the effectiveness of a multimedia learning environment called "Pollution Solution" on eighth-grade students' ability to define a complex problem. Sixty students from four earth science classes taught by the same teacher in a New York City public school were included in the sample for this study. The classes…

  5. Small-Group Problem-Based Learning as a Complex Adaptive System

    ERIC Educational Resources Information Center

    Mennin, Stewart

    2007-01-01

    Small-group problem-based learning (PBL) is widely embraced as a method of study in health professions schools and at many different levels of education. Complexity science provides a different lens with which to view and understand the application of this method. It presents new concepts and vocabulary that may be unfamiliar to practitioners of…

  6. Application of the complex scaling method in solving three-body Coulomb scattering problem

    NASA Astrophysics Data System (ADS)

    Lazauskas, R.

    2017-03-01

    The three-body scattering problem in Coulombic systems is a widespread, yet unresolved, problem for mathematically rigorous methods. In this work this long-term challenge has been undertaken by combining distorted waves and Faddeev–Merkuriev equation formalisms in conjunction with the complex scaling technique to overcome the difficulties related to the boundary conditions. Contrary to common belief, it is demonstrated that the smooth complex scaling method can be applied to solve the three-body Coulomb scattering problem in a wide energy region, including the fully elastic domain and extending to energies well beyond the atom ionization threshold. The newly developed method is used to study electron scattering on the ground states of hydrogen and positronium atoms, as well as the e⁺ + H(n=1) ⇌ p + Ps(n=1) reaction. Where available, the obtained results are compared with experimental data and theoretical predictions, proving the accuracy and efficiency of the newly developed method.

  7. The complex variable reproducing kernel particle method for elasto-plasticity problems

    NASA Astrophysics Data System (ADS)

    Chen, Li; Cheng, Yumin

    2010-05-01

    On the basis of the reproducing kernel particle method (RKPM) and complex variable theory, the complex variable reproducing kernel particle method (CVRKPM) is discussed in this paper. The advantage of the CVRKPM is that the correction function of a two-dimensional problem is formed with a one-dimensional basis function when the shape function is formed. The CVRKPM is then applied to solve two-dimensional elasto-plasticity problems. The Galerkin weak form is employed to obtain the discretized system equation, and the penalty method is used to apply the essential boundary conditions. The CVRKPM formulation for two-dimensional elasto-plasticity problems is thus obtained, the corresponding formulae are derived, and the Newton-Raphson method is used in the numerical implementation. Three numerical examples are given to show that the method presented in this paper is effective for elasto-plasticity analysis.
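
    Schematically, the gain comes from collapsing the two spatial coordinates into a single complex variable, so that the trial function needs only a one-dimensional basis. A generic sketch of the resulting approximation is shown below; the reproducing-kernel correction itself follows the paper.

        z = x + \mathrm{i}\,y, \qquad
        \mathbf{p}(z) = \left(1,\; z,\; z^{2},\; \dots,\; z^{m}\right), \qquad
        u^{h}(z) = \sum_{j=0}^{m} p_{j}(z)\, a_{j}(z).

    A quadratic approximation then carries 3 basis terms in z instead of the 6 bivariate monomials (1, x, y, x², xy, y²) required by a conventional two-dimensional basis, which is where the reduction in unknowns originates.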

  8. Determining the Effects of Cognitive Style, Problem Complexity, and Hypothesis Generation on the Problem Solving Ability of School-Based Agricultural Education Students

    ERIC Educational Resources Information Center

    Blackburn, J. Joey; Robinson, J. Shane

    2016-01-01

    The purpose of this experimental study was to assess the effects of cognitive style, problem complexity, and hypothesis generation on the problem solving ability of school-based agricultural education students. Problem solving ability was defined as time to solution. Kirton's Adaption-Innovation Inventory was employed to assess students' cognitive…

  9. Analysis of elastoplasticity problems using an improved complex variable element-free Galerkin method

    NASA Astrophysics Data System (ADS)

    Cheng, Yu-Min; Liu, Chao; Bai, Fu-Nong; Peng, Miao-Juan

    2015-10-01

    In this paper, based on the conjugate of the complex basis function, a new complex variable moving least-squares approximation is discussed. Using the new approximation to obtain the shape function, an improved complex variable element-free Galerkin (ICVEFG) method is then presented for two-dimensional (2D) elastoplasticity problems. Compared with the previous complex variable moving least-squares approximation, the new approximation has greater computational precision and efficiency. Using the penalty method to apply the essential boundary conditions, and using the constrained Galerkin weak form of 2D elastoplasticity to obtain the system equations, we obtain the corresponding formulae of the ICVEFG method for 2D elastoplasticity. Three selected numerical examples are presented to show that the ICVEFG method has advantages, such as greater precision and computational efficiency, over conventional meshless methods. Project supported by the National Natural Science Foundation of China (Grant Nos. 11171208 and U1433104).

  10. Mixing Bandt-Pompe and Lempel-Ziv approaches: another way to analyze the complexity of continuous-state sequences

    NASA Astrophysics Data System (ADS)

    Zozor, S.; Mateos, D.; Lamberti, P. W.

    2014-05-01

    In this paper, we propose to mix the approach underlying Bandt-Pompe permutation entropy with Lempel-Ziv complexity, to design what we call Lempel-Ziv permutation complexity. The principle consists of two steps: (i) transformation of a continuous-state series that is intrinsically multivariate or arises from embedding into a sequence of permutation vectors, where the components are the positions of the components of the initial vector when re-arranged; (ii) performing the Lempel-Ziv complexity for this series of 'symbols', as part of a discrete finite-size alphabet. On the one hand, the permutation entropy of Bandt-Pompe aims at the study of the entropy of such a sequence; i.e., the entropy of patterns in a sequence (e.g., local increases or decreases). On the other hand, the Lempel-Ziv complexity of a discrete-state sequence aims at the study of the temporal organization of the symbols (i.e., the rate of compressibility of the sequence). Thus, the Lempel-Ziv permutation complexity aims to take advantage of both of these methods. The potential from such a combined approach - of a permutation procedure and a complexity analysis - is evaluated through the illustration of some simulated data and some real data. In both cases, we compare the individual approaches and the combined approach.
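
    A minimal Python sketch of the two steps follows. Note the hedges: the paper uses the Lempel-Ziv 1976 production complexity, for which the simpler LZ78-style phrase count below is only a stand-in, and the test signal is an arbitrary example.

        import itertools
        import numpy as np

        def permutation_symbols(x, d=3):
            """Step (i): map a scalar series to ordinal-pattern symbols of dimension d."""
            patterns = {p: k for k, p in enumerate(itertools.permutations(range(d)))}
            return [patterns[tuple(np.argsort(x[i:i + d]))]
                    for i in range(len(x) - d + 1)]

        def lz_complexity(symbols):
            """Step (ii): count distinct phrases in a left-to-right LZ78-style parse."""
            phrases, current = set(), ()
            for s in symbols:
                current += (s,)
                if current not in phrases:
                    phrases.add(current)
                    current = ()
            return len(phrases) + (1 if current else 0)

        rng = np.random.default_rng(1)
        x = np.sin(0.1 * np.arange(500)) + 0.1 * rng.standard_normal(500)
        print(lz_complexity(permutation_symbols(x, d=3)))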

  11. Generalized CNF satisfiability, local reductions and complexity of succinctly specified problems

    SciTech Connect

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Radhakrishnan, V.

    1995-02-01

    We study the complexity and efficient approximability of various decision, counting and optimization problems when instances are specified using (1) the 1-dimensional finite periodic narrow specifications of Wanke, (2) the 2-way infinite 1-dimensional narrow periodic (sometimes called dynamic) specifications of Karp and Orlin et al., and (3) the hierarchical specification language of Lengauer et al. We outline how generalized CNF satisfiability problems and local reductions can be used to obtain both hardness and easiness results for a number of decision, counting, optimization and approximate optimization problems when instances are specified as in (1), (2) or (3). As corollaries we obtain a number of new PSPACE-hardness and #PSPACE-hardness results and a number of new polynomial time approximation algorithms for natural PSPACE-hard optimization problems. In particular, assuming P ≠ PSPACE, we characterize completely the complexities of the generalized CNF satisfiability problems SAT(S) of Schaefer [Sc78], when instances are specified as in (1), (2) or (3).

  12. The lower bound on complexity of parallel branch-and-bound algorithm for subset sum problem

    NASA Astrophysics Data System (ADS)

    Kolpakov, Roman; Posypkin, Mikhail

    2016-10-01

    The subset sum problem is a particular case of the Boolean knapsack problem in which each item's price equals its weight. The problem can be stated informally as searching for the densest packing of a set of items into a box of limited capacity. Recently, coarse-grain parallelizations of the Branch-and-Bound (B&B) method have attracted attention due to the growing popularity of weakly-connected distributed computing platforms. In this paper we consider one such approach for solving the subset sum problem. In the first stage, one of the processors (the manager) performs some number of B&B steps, generating a set of subproblems. In the second stage, the generated subproblems are sent to the other processors, one subproblem per processor. The processors solve the received subproblems completely; the manager collects all the obtained solutions and chooses the optimal one. For this algorithm we formally define the parallel execution model (the frontal scheme of parallelization) and the notion of frontal scheme complexity, which we study for a series of subset sum problems.
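
    A serial Python sketch of the frontal scheme helps fix ideas. Here the manager's first-stage B&B is simplified to enumerating all assignments of the first m items, and each resulting subproblem is solved to completion as a worker would; all numbers are toy values rather than the paper's test series.

        from itertools import product

        def dfs(items, capacity, weight, start, best):
            """Depth-first B&B: maximize packed weight without exceeding capacity."""
            best = max(best, weight)
            if weight + sum(items[start:]) <= best:   # bound: cannot improve, prune
                return best
            for i in range(start, len(items)):
                if weight + items[i] <= capacity:
                    best = dfs(items, capacity, weight + items[i], i + 1, best)
            return best

        def frontal_scheme(items, capacity, m=3):
            """Manager fixes the first m items; each assignment is one subproblem."""
            best = 0
            for bits in product((0, 1), repeat=m):    # one subproblem per processor
                w0 = sum(w for w, b in zip(items[:m], bits) if b)
                if w0 <= capacity:
                    best = max(best, dfs(items, capacity, w0, m, best))
            return best

        items = [27, 14, 11, 9, 7, 5, 3]
        print(frontal_scheme(items, capacity=40))     # -> 40 (e.g. 14 + 11 + 7 + 5 + 3)

    In the parallel setting each iteration of the outer loop would run on its own processor, and the frontal scheme complexity studied in the paper characterizes, roughly, the work profile of these second-stage solves.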

  13. Extending XCS with Cyclic Graphs for Scalability on Complex Boolean Problems.

    PubMed

    Iqbal, Muhammad; Browne, Will N; Zhang, Mengjie

    2015-09-25

    A main research direction in the field of evolutionary machine learning is to develop a scalable classifier system to solve high-dimensional problems. Recently, work has begun on autonomously reusing learned building blocks of knowledge to scale from low-dimensional problems to high-dimensional ones. An XCS-based classifier system, known as XCSCFC, has been shown to be scalable, through the addition of expression-tree-like code fragments, to a limit beyond standard learning classifier systems. XCSCFC is especially beneficial if the target problem can be divided into a hierarchy of subproblems, each solvable in a bottom-up fashion. However, if the hierarchy of subproblems is too deep, XCSCFC becomes impractical because of the computational time needed and thus eventually hits a limit in problem size. A limitation of this technique is the lack of a cyclic representation, which is inherent in finite state machines (FSMs). However, the evolution of FSMs is a hard task owing to the combinatorially large number of possible states, connections, and interactions. Usually this requires supervised learning to minimize inappropriate FSMs, which for high-dimensional problems necessitates subsampling or incremental testing. To avoid these constraints, this work introduces a state-machine-based encoding scheme into XCS for the first time, termed XCSSMA. The proposed system has been tested on six complex Boolean problem domains: multiplexer, majority-on, carry, even-parity, count ones, and digital design verification problems. The proposed approach outperforms XCSCFA (an XCS that computes actions) and XCSF (an XCS that computes predictions) in three of the six problem domains, while the performance in the others is similar. In addition, XCSSMA evolved, for the first time, compact and human-readable general classifiers (i.e., solving any n-bit problems) for the even-parity and carry problem domains, demonstrating its ability to produce scalable solutions using a

  14. Complexation studies with lanthanides and humic acid analyzed by ultrafiltration and capillary electrophoresis-inductively coupled plasma mass spectrometry.

    PubMed

    Kautenburger, Ralf; Beck, Horst Philipp

    2007-08-03

    For the long-term storage of radioactive waste, detailed information about the geochemical behavior of radioactive and toxic metal ions under environmental conditions is necessary. Humic acid (HA) can play an important role in the immobilisation or mobilisation of metal ions due to complexation and colloid formation. We therefore investigate the complexation behavior of HA and its influence on the migration or retardation of selected lanthanides (europium and gadolinium, as homologues of the actinides americium and curium). Two independent speciation techniques, ultrafiltration and capillary electrophoresis coupled with inductively coupled plasma mass spectrometry (CE-ICP-MS), have been compared for the study of Eu and Gd interaction with (purified Aldrich) HA. The degree of complexation of Eu and Gd in 25 mg l⁻¹ Aldrich HA solutions was determined over a broad range of metal loading (total Eu and Gd concentrations between 10⁻⁶ and 10⁻⁴ mol l⁻¹), an ionic strength of 10 mM (NaClO4), and different pH values. From the CE-ICP-MS electropherograms, additional information on the charge of the Eu species was obtained by the use of 1-bromopropane as a neutral marker. To detect HA in the ICP-MS and to distinguish between HA-complexed and non-complexed metal ions in the CE-ICP-MS, we halogenated the HA with iodine as an ICP-MS marker.

  15. [Approaches for setting priorities and analyzing health problems from an equity perspective: experience at the local level in Venezuela].

    PubMed

    Heredia, Henny; Artmann, Elizabeth; López, Nora; Useche, Julio

    2011-03-01

    This article analyzes the application of the explanatory moment of Strategic Situational Planning (SSP) and the Analysis of the Health Situation (ASIS) as approaches that, taken together, make it possible to prioritize, from an equity perspective, local-level health problems that are amenable to intervention. A case study developed in the parish of Zuata, Aragua State, Venezuela, illustrates the application of both approaches. The main actors of the parish prioritized the low coverage of drinking water as a health problem. On analyzing the problem, the following causes were selected for the proposed action plan: scarce community participation, weak governmental plans, absence of town-planning policy, inadequate administration of public resources, and lack of awareness about the rational use of water. It is concluded that the joint SSP-ASIS approach generates inputs that, once concretized by the actors into an action plan, can contribute to reducing inequities. Moreover, the active participation of the actors makes it possible to reveal the real problems of the population and to construct a plan of demands.

  16. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem.

    PubMed

    Williams, Patricia Ah; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat.

  17. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem

    PubMed Central

    Williams, Patricia AH; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  18. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.

    PubMed

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-03-06

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what mechanisms guide online subgoal selection during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires fewer inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load.
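
    The selection principle can be caricatured in a few lines: score each candidate subgoal by the cost of reaching it plus the cost of reaching the goal from it plus a descriptive-complexity penalty, then sample rather than maximize. The Python sketch below is only a loose illustration of that idea on a four-rooms-like graph, not the authors' probabilistic inference scheme; the graph, penalty, and temperature are invented.

        import math
        import random
        from collections import deque

        def bfs_cost(graph, a, b):
            """Shortest-path length between a and b over unit-cost edges."""
            seen, queue = {a}, deque([(a, 0)])
            while queue:
                node, d = queue.popleft()
                if node == b:
                    return d
                for nxt in graph[node]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, d + 1))
            return math.inf

        def sample_subgoal(graph, start, goal, candidates, beta=1.0, penalty=len):
            """Sample a subgoal, favouring low path cost plus a complexity penalty."""
            scores = [-(bfs_cost(graph, start, g) + bfs_cost(graph, g, goal) + penalty(g))
                      for g in candidates]
            weights = [math.exp(beta * s) for s in scores]
            return random.choices(candidates, weights=weights)[0]

        # Four-rooms-like toy graph: doorways D1 and D2 are natural subgoals.
        graph = {"S": ["A"], "A": ["S", "D1"], "D1": ["A", "B"], "B": ["D1", "D2"],
                 "D2": ["B", "C"], "C": ["D2", "G"], "G": ["C"]}
        print(sample_subgoal(graph, "S", "G", ["D1", "D2"]))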

  19. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving

    PubMed Central

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-01-01

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what mechanisms guide online subgoal selection during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using fewer information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires fewer inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as lowering cognitive effort and working memory load. PMID:25652466

  20. Spatio-Temporal Complexity and Large-Scale Structures in Problems of Continuum Mechanics

    DTIC Science & Technology

    1993-07-15

    "SPATIO-TEMPORAL COMPLEXITY & LARGE SCALE STRUCTURES IN PROBLEMS OF CONTINUUM MECHANICS" (U). Authors: Drs. Basil Nicolaenko, Dieter... ...periodic orbits on the attractor. We have applied our method to two experimental data sets from Taylor-Couette flows.

  1. Solving the three-body Coulomb breakup problem using exterior complex scaling

    SciTech Connect

    McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.

    2004-05-17

    Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum have made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a ''reduction to computation'' of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.
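
    For reference, the exterior complex scaling transformation discussed here rotates each electronic radial coordinate into the complex plane only beyond a fixed radius R_0 (schematic form, with rotation angle η):

        r \;\longrightarrow\;
        \begin{cases}
          r, & r \le R_0, \\
          R_0 + (r - R_0)\, e^{\mathrm{i}\eta}, & r > R_0.
        \end{cases}

    Under this transformation purely outgoing waves decay exponentially for r > R_0, so the difficult asymptotic boundary conditions of three-body breakup are replaced by the simple requirement that the scaled wavefunction vanish at the edge of a finite volume.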

  2. TOPICAL REVIEW: Solving the three-body Coulomb breakup problem using exterior complex scaling

    NASA Astrophysics Data System (ADS)

    McCurdy, C. W.; Baertschy, M.; Rescigno, T. N.

    2004-09-01

    Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum have made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a 'reduction to computation' of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wavefunction can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.

  3. Migration of levonorgestrel IUS in a patient with complex medical problems: what should be done?

    PubMed

    Soleymani Majd, Hooman; El Hamamy, Essam; Chandrasekar, Ramya; Ismail, Lamiese

    2009-03-01

    Patients with complex medical problems should be counselled about the need for highly effective contraception, as failure resulting in pregnancy could cause significant morbidity and mortality. The LNG-IUS has gained great popularity and generally has a low side-effect profile; however, perforation of the uterus and migration of the device is a potentially serious complication known to be associated with its use. The currently accepted management is removal of the device from the abdominal cavity in order to prevent further morbidity. However, this is not always a simple matter in patients who have complex medical problems and who are deemed unfit for surgery. Each time the patient comes for renewal of the contraceptive method, clinicians need to reassess the risks and benefits. This is particularly relevant in patients who have complex medical problems, where special attention needs to be given not only to immediate risks but also to long-term ones. Careful individualised counselling and consideration are paramount, and perhaps it would have been prudent to discuss vasectomy with this patient and her husband (as the first line of contraception), as this may have avoided the ensuing complications arising from the chosen method.

  4. Analyzing Student Motivation at the Confluence of Achievement Goals and Their Underlying Reasons: An Investigation of Goal Complexes

    ERIC Educational Resources Information Center

    Hodis, Flaviu A.; Tait, Carolyn; Hodis, Georgeta M.; Hodis, Monica A.; Scornavacca, Eusebio

    2016-01-01

    This research investigated the interrelations among achievement goals and the underlying reasons for pursuing them. To do so, it utilized the framework of goal complexes, which are regulatory constructs defined at the intersection of aims and reasons. Data from two independent large samples of New Zealand university students showed that across…

  5. On the Fractality of Complex Networks: Covering Problem, Algorithms and Ahlfors Regularity

    PubMed Central

    Wang, Lihong; Wang, Qin; Xi, Lifeng; Chen, Jin; Wang, Songjing; Bao, Liulu; Yu, Zhouyu; Zhao, Luming

    2017-01-01

    In this paper, we revisit the fractality of complex networks by investigating three dimensions defined with respect to the minimum box-covering, the minimum ball-covering, and the average volume of balls. The first two dimensions are calculated through the minimum box-covering problem and the minimum ball-covering problem. For the minimum ball-covering problem, we prove its NP-completeness and propose several heuristic algorithms for constructing feasible solutions, and we also compare the performance of these algorithms. For the third dimension, we introduce the random ball-volume algorithm. We introduce the notion of Ahlfors regularity of networks and prove that the above three dimensions coincide if networks are Ahlfors regular. We also provide a class of networks satisfying Ahlfors regularity. PMID:28128289
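
    One of the simplest feasible-solution heuristics for the minimum ball-covering problem is greedy: repeatedly pick an uncovered node as a centre and remove its radius-r ball. The Python sketch below uses networkx; the random test graph is arbitrary, and this greedy rule is just one plausible heuristic, not necessarily among those compared in the paper.

        import networkx as nx

        def greedy_ball_cover(G, r):
            """Greedy feasible solution: cover the graph with radius-r balls."""
            uncovered, centres = set(G.nodes), []
            while uncovered:
                c = next(iter(uncovered))             # pick any uncovered node
                ball = set(nx.single_source_shortest_path_length(G, c, cutoff=r))
                centres.append(c)
                uncovered -= ball
            return centres

        G = nx.erdos_renyi_graph(200, 0.03, seed=1)
        # The scaling of the number of balls N_B(r) with r gives a heuristic
        # estimate of the ball-covering dimension.
        for r in (1, 2, 3):
            print(r, len(greedy_ball_cover(G, r)))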

  6. On the Fractality of Complex Networks: Covering Problem, Algorithms and Ahlfors Regularity

    NASA Astrophysics Data System (ADS)

    Wang, Lihong; Wang, Qin; Xi, Lifeng; Chen, Jin; Wang, Songjing; Bao, Liulu; Yu, Zhouyu; Zhao, Luming

    2017-01-01

    In this paper, we revisit the fractality of complex networks by investigating three dimensions defined with respect to the minimum box-covering, the minimum ball-covering, and the average volume of balls. The first two dimensions are calculated through the minimum box-covering problem and the minimum ball-covering problem. For the minimum ball-covering problem, we prove its NP-completeness and propose several heuristic algorithms for constructing feasible solutions, and we also compare the performance of these algorithms. For the third dimension, we introduce the random ball-volume algorithm. We introduce the notion of Ahlfors regularity of networks and prove that the above three dimensions coincide if networks are Ahlfors regular. We also provide a class of networks satisfying Ahlfors regularity.

  7. Analysis and formulation of a class of complex dynamic optimization problems

    NASA Astrophysics Data System (ADS)

    Kameswaran, Shivakumar

    The Direct Transcription approach, also known as the direct simultaneous approach, is a widely used solution strategy for the solution of dynamic optimization problems involving differential-algebraic equations (DAEs). Direct transcription refers to the procedure of approximating the infinite dimensional problem by a finite dimensional one, which is then solved using a nonlinear programming (NLP) solver tailored to large-scale problems. Systems governed by partial differential equations (PDEs) can also be handled by spatially discretizing the PDEs to convert them to a system of DAEs. The objective of this thesis is firstly to ensure that direct transcription using Radau collocation is provably correct, and secondly to widen the applicability of the direct simultaneous approach to a larger class of dynamic optimization and optimal control problems (OCPs). This thesis aims at addressing these issues using rigorous theoretical tools and/or characteristic examples, and at the same time at using the results for solving large-scale industrial applications to realize the benefits. The first part of this work deals with the analysis of convergence rates for direct transcription of unconstrained and final-time equality constrained optimal control problems. The problems are discretized using collocation at Radau points. Convergence is analyzed from an NLP/matrix-algebra perspective, which enables the prediction of the conditioning of the direct transcription NLP as the mesh size becomes finer. Several convergence results are presented along with tests on numerous example problems. These convergence results lead to an adjoint estimation procedure given the Lagrange multipliers for the large-scale NLP. The work also reveals the role of process control concepts such as controllability in the convergence analysis, and provides a very important link between control and optimization inside the framework of dynamic optimization. As an effort to extend the applicability of the direct
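
    As a concrete illustration of direct transcription, the sketch below discretizes a tiny optimal control problem (minimize the integral of u^2 subject to x' = u, x(0) = 0, x(1) = 1) into an NLP. Trapezoidal collocation is used here as a simple stand-in for the Radau collocation analyzed in the thesis, and scipy's general-purpose SLSQP stands in for a large-scale NLP solver.

```python
import numpy as np
from scipy.optimize import minimize

N = 20                          # number of mesh intervals
h = 1.0 / N

def unpack(z):                  # decision vector = states then controls on nodes
    return z[:N + 1], z[N + 1:]

def objective(z):
    x, u = unpack(z)
    return h * np.sum((u[:-1]**2 + u[1:]**2) / 2)   # trapezoidal quadrature

def defects(z):                 # collocation (defect) constraints, one per interval
    x, u = unpack(z)
    return x[1:] - x[:-1] - h * (u[:-1] + u[1:]) / 2

cons = [{'type': 'eq', 'fun': defects},
        {'type': 'eq', 'fun': lambda z: unpack(z)[0][0] - 0.0},   # x(0) = 0
        {'type': 'eq', 'fun': lambda z: unpack(z)[0][-1] - 1.0}]  # x(1) = 1

sol = minimize(objective, np.zeros(2 * (N + 1)), constraints=cons, method='SLSQP')
x, u = unpack(sol.x)
print(u[:3])                    # analytic optimum is u(t) = 1 everywhere
```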

  8. Beyond roots alone: Novel methodologies for analyzing complex soil and minirhizotron imagery using image processing and GIS tools

    NASA Astrophysics Data System (ADS)

    Silva, Justina A.

    Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches together numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework to expand a more comprehensive understanding of belowground dynamics.

  9. You Need to Know: There Is a Causal Relationship between Structural Knowledge and Control Performance in Complex Problem Solving Tasks

    ERIC Educational Resources Information Center

    Goode, Natassia; Beckmann, Jens F.

    2010-01-01

    This study investigates the relationships between structural knowledge, control performance and fluid intelligence in a complex problem solving (CPS) task. 75 participants received either complete, partial or no information regarding the underlying structure of a complex problem solving task, and controlled the task to reach specific goals.…

  10. Robust non-parametric tests for complex-repeated measures problems in ophthalmology.

    PubMed

    Brombin, Chiara; Midena, Edoardo; Salmaso, Luigi

    2013-12-01

    The NonParametric Combination methodology (NPC) of dependent permutation tests allows the experimenter to face many complex multivariate testing problems and represents a convincing and powerful alternative to standard parametric methods. The main advantage of this approach lies in its flexibility in handling any type of variable (categorical and quantitative, with or without missing values) while at the same time taking dependencies among those variables into account without the need of modelling them. NPC methodology makes it possible to deal with repeated measures, paired data, restricted alternative hypotheses, missing data (completely at random or not), and high-dimensional, small-sample-size data. Hence, NPC methodology can offer a significant contribution to successful research in biomedical studies with several endpoints, since it provides reasonably efficient solutions and clear interpretations of inferential results. Pesarin F. Multivariate permutation tests: with application in biostatistics. Chichester-New York: John Wiley & Sons, 2001; Pesarin F, Salmaso L. Permutation tests for complex data: theory, applications and software. Chichester, UK: John Wiley & Sons, 2010. We focus on non-parametric permutation solutions to two real-case studies in ophthalmology, concerning complex repeated-measures problems. For each data set, different analyses are presented, thus highlighting characteristic aspects of the data structure itself. Our goal is to present different solutions to multivariate complex case studies, guiding researchers/readers to choose, from various possible interpretations of a problem, the one that has the highest flexibility and statistical power under a set of less stringent assumptions. MATLAB code has been implemented to carry out the analyses.
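
    A minimal sketch of the NPC idea, assuming paired data with several endpoints: a sign-flip permutation test yields a partial p-value per endpoint, and Fisher's combining function merges them into one global test. This is a toy illustration, not the authors' MATLAB code; the rank-based computation of per-permutation p-values handles ties only approximately.

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(0)

def npc_paired_test(diffs, n_perm=4000):
    """NPC sketch for paired data with k endpoints: sign-flip permutation
    tests per endpoint, partial p-values combined with Fisher's function."""
    n, k = diffs.shape
    obs = np.abs(diffs.mean(axis=0))
    signs = rng.choice([-1.0, 1.0], size=(n_perm, n))
    perm = np.abs(signs @ diffs) / n               # permuted |mean| per endpoint
    p_obs = (perm >= obs).mean(axis=0)             # partial p-values (observed)
    p_perm = 1.0 - (rankdata(perm, axis=0) - 1) / n_perm  # same, per permutation
    t_obs = -2 * np.log(np.clip(p_obs, 1e-12, None)).sum()       # Fisher combining
    t_perm = -2 * np.log(np.clip(p_perm, 1e-12, None)).sum(axis=1)
    return (t_perm >= t_obs).mean()                # global p-value

d = rng.normal(0.4, 1.0, size=(20, 2))             # two shifted endpoints
print(npc_paired_test(d))
```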

  11. How to solve complex problems in foundry plants - future of casting simulation -

    NASA Astrophysics Data System (ADS)

    Ohnaka, I.

    2015-06-01

    Although the computer simulation of casting has progressed dramatically over the last decades, there are still many challenges and problems. This paper discusses how to solve complex engineering problems in foundry plants and what we should do in the future, in particular for casting simulation. First, problem-solving procedures, including the application of computer simulation, are demonstrated, and various difficulties are pointed out, exemplified mainly by porosity defects in sand castings of spheroidal graphite cast irons. Next, looking back at conventional scientific and engineering research aimed at understanding casting phenomena, challenges and problems are discussed from a problem-solving viewpoint, followed by a discussion of the issues we should address, such as how to integrate the huge amount of dispersed knowledge in various disciplines, the differentiation of science-oriented and engineering-oriented models, professional ethics, how to handle fluctuating materials, initial and boundary conditions, error accumulation, simulation codes as black boxes, etc. Finally, some suggestions are made on how to address these issues, such as the promotion of research on simulation based on science-oriented models and the publication of reliable data on casting phenomena in complicated-shaped castings, including reconsideration of the evaluation system.

  12. Inductive dielectric analyzer

    NASA Astrophysics Data System (ADS)

    Agranovich, Daniel; Polygalov, Eugene; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri

    2017-03-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions.

  13. Open-ITC: an alternate computational approach to analyze the isothermal titration calorimetry data of complex binding mechanisms.

    PubMed

    Krishnamoorthy, Janarthanan; Mohanty, Smita

    2011-01-01

    Isothermal titration calorimetry (ITC) is an important technique used in quantitatively analyzing the global mechanism of protein-protein or protein-ligand interactions through thermodynamic measurements. Among different binding mechanisms, the parallel and ligand-induced protein oligomerization mechanisms are technically difficult to analyze compared with a sequential binding mechanism. Here, we present a methodology implemented as a program, "Open-ITC", that eliminates the need for exact analytical expressions for the free ligand concentration [L] and the mole fraction of bound ligand θ that are required for the thermogram analysis. Adopting a genetic algorithm-based optimization, the thermodynamic parameters are determined, and their standard errors are evaluated at the global minimum by calculating the Jacobian matrix. This approach yielded statistically consistent results for a single-site and a two-site binding protein-ligand system. Further, a comparative simulation of a two-step sequential, a parallel, and a ligand-induced oligomerization model revealed that their mechanistic differences are discernable in ITC thermograms, only if the first binding step is weaker compared with the second binding step (K(1)

  14. Analyzing the tradeoff between electrical complexity and accuracy in patient-specific computational models of deep brain stimulation

    NASA Astrophysics Data System (ADS)

    Howell, Bryan; McIntyre, Cameron C.

    2016-06-01

    Objective. Deep brain stimulation (DBS) is an adjunctive therapy that is effective in treating movement disorders and shows promise for treating psychiatric disorders. Computational models of DBS have begun to be utilized as tools to optimize the therapy. Despite advancements in the anatomical accuracy of these models, there is still uncertainty as to what level of electrical complexity is adequate for modeling the electric field in the brain and the subsequent neural response to the stimulation. Approach. We used magnetic resonance images to create an image-based computational model of subthalamic DBS. The complexity of the volume conductor model was increased by incrementally including heterogeneity, anisotropy, and dielectric dispersion in the electrical properties of the brain. We quantified changes in the load of the electrode, the electric potential distribution, and stimulation thresholds of descending corticofugal (DCF) axon models. Main results. Incorporation of heterogeneity altered the electric potentials and subsequent stimulation thresholds, but to a lesser degree than incorporation of anisotropy. Additionally, the results were sensitive to the choice of method for defining anisotropy, with stimulation thresholds of DCF axons changing by as much as 190%. Typical approaches for defining anisotropy underestimate the expected load of the stimulation electrode, which led to underestimation of the extent of stimulation. More accurate predictions of the electrode load were achieved with alternative approaches for defining anisotropy. The effects of dielectric dispersion were small compared to the effects of heterogeneity and anisotropy. Significance. The results of this study help delineate the level of detail that is required to accurately model electric fields generated by DBS electrodes.

  15. Hybrid binary GA-EDA algorithms for complex “black-box” optimization problems

    NASA Astrophysics Data System (ADS)

    Sopov, E.

    2017-02-01

    Genetic Algorithms (GAs) have proved their efficiency in solving many complex optimization problems. GAs can also be applied to "black-box" problems, because they perform a "blind" search and do not require any specific information about the features of the search space or the objectives. A GA uses a trial-and-error strategy to explore the search space, and collects statistical information that is stored in the form of genes in the population. Estimation of Distribution Algorithms (EDAs) have a very similar structure to GAs, but use an explicit representation of the search experience in the form of a statistical probability distribution. In this study we discuss some approaches for improving standard GA performance by combining the binary GA with an EDA. Finally, a novel approach for large-scale global optimization is proposed. The experimental results and a comparison with some well-studied techniques are presented and discussed.
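
    A minimal sketch of one way such a hybrid can be arranged, under assumptions of our own rather than the paper's actual scheme: half of each new generation comes from GA crossover and mutation of elite parents, the other half is sampled from a PBIL-style probability vector estimated from the elite.

```python
import numpy as np

rng = np.random.default_rng(42)

def hybrid_ga_eda(fitness, n_bits=60, pop_size=40, gens=80, elite_frac=0.3):
    """Hybrid binary GA-EDA sketch: GA offspring plus offspring sampled
    from an explicit probability-vector model of the elite individuals."""
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        elite = pop[np.argsort(fit)[-max(2, int(elite_frac * pop_size)):]]
        p = elite.mean(axis=0).clip(0.05, 0.95)     # EDA distribution estimate
        ga_half = []
        for _ in range(pop_size // 2):
            a, b = elite[rng.choice(len(elite), 2, replace=False)]
            mask = rng.integers(0, 2, n_bits).astype(bool)  # uniform crossover
            child = np.where(mask, a, b)
            flip = rng.random(n_bits) < 1.0 / n_bits        # bitwise mutation
            ga_half.append(np.where(flip, 1 - child, child))
        eda_half = (rng.random((pop_size - pop_size // 2, n_bits)) < p).astype(int)
        pop = np.vstack([ga_half, eda_half])
    return max(pop, key=lambda ind: fitness(ind))

best = hybrid_ga_eda(lambda x: int(x.sum()))        # OneMax test function
print(best.sum(), "of", 60)
```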

  16. On complex roots of an equation arising in the oblique derivative problem

    NASA Astrophysics Data System (ADS)

    Kostin, A. B.; Sherstyukov, V. B.

    2017-01-01

    The paper is concerned with the eigenvalue problem for the Laplace operator in a disc under the condition that the oblique derivative vanishes on the disc boundary. In a famous article by V.A. Il'in and E.I. Moiseev (Differential Equations, 1994) it was found, in particular, that the root μ of an equation involving the Bessel function J_n(μ) determines the eigenvalue λ = μ² of the problem. In our work we correct the information about the location of the eigenvalues: an explicit description is given of the sector of the complex plane containing all of them. It is shown that all the nonzero roots of the equation are simple, and a refined description of the set where they are localized in the complex plane is given. To prove these facts we use methods of partial differential equations as well as methods from the theory of entire functions.

  17. [Present-day problems of complex hygienic evaluation of drinking water use].

    PubMed

    Tulakin, A V; Novikov, Iu V; Tsyplakova, G V; Ampleeva, G P; Shukelaĭt', A B

    2005-01-01

    The authors offer substantiated methodical approaches to the complex evaluation of the sanitary reliability of drinking water supply systems. They recommend not only evaluating drinking water quality, but also assessing the sanitary state of water sources (catchment areas), the reliability of water preparation and transportation, the standards of water supply, and the reliability of production laboratory control. A range of complex hygienic studies has demonstrated that the problems of the Voronezh interurban reservoir as a water source are caused by its multi-purpose use. Under these conditions, the insufficient hygienic efficiency of the conventional water preparation schemes and the low sanitary reliability of water transportation systems favor a negative influence of the water factor on population mortality. The offered methodical approaches give a systematic picture of the factors that determine drinking water quality. Operative administrative decisions concerning the hygienic safety of public water use may be made with these methodical approaches taken into consideration.

  18. Nanotechnology for sustainability: what does nanotechnology offer to address complex sustainability problems?

    NASA Astrophysics Data System (ADS)

    Wiek, Arnim; Foley, Rider W.; Guston, David H.

    2012-09-01

    Nanotechnology is widely associated with the promise of positively contributing to sustainability. However, this view often focuses on end-of-pipe applications, for instance, for water purification or energy efficiency, and relies on a narrow concept of sustainability. Approaching sustainability problems and solution options from a comprehensive and systemic perspective instead may yield quite different conclusions about the contribution of nanotechnology to sustainability. This study conceptualizes sustainability problems as complex constellations with several potential intervention points and amenable to different solution options. The study presents results from interdisciplinary workshops and literature reviews that appraise the contribution of the selected nanotechnologies to mitigate such problems. The study focuses exemplarily on the urban context to make the appraisals tangible and relevant. The solution potential of nanotechnology is explored not only for well-known urban sustainability problems such as water contamination and energy use but also for less obvious ones such as childhood obesity. Results indicate not only potentials but also limitations of nanotechnology's contribution to sustainability and can inform anticipatory governance of nanotechnology in general, and in the urban context in particular.

  19. Convergent validity of the aberrant behavior checklist and behavior problems inventory with people with complex needs.

    PubMed

    Hill, Jennie; Powlitch, Stephanie; Furniss, Frederick

    2008-01-01

    The current study aimed to replicate and extend Rojahn et al. [Rojahn, J., Aman, M. G., Matson, J. L., & Mayville, E. (2003). The aberrant behavior checklist and the behavior problems inventory: Convergent and divergent validity. Research in Developmental Disabilities, 24, 391-404] by examining the convergent validity of the behavior problems inventory (BPI) and the aberrant behavior checklist (ABC) for individuals presenting with multiple complex behavior problems. Data were collected from 69 children and adults with severe intellectual disabilities and challenging behavior living in residential establishments. MANCOVA analyses showed that individuals with elevated BPI stereotyped behavior subscale scores had higher scores on ABC lethargy and stereotypy subscales, while those with elevated BPI aggressive/destructive behavior subscale scores obtained higher scores on ABC irritability, stereotypy and hyperactivity subscales. Multiple regression analyses showed a corresponding pattern of results in the prediction of ABC subscale scores by BPI subscale scores. Exploratory factor analysis of the BPI data suggested a six-factor solution with an aggressive/destructive behavior factor, four factors relating to stereotypy, and one related to stereotypy and self-injury. These results, discussed with reference to Rojahn et al. [Rojahn, J., Aman, M. G., Matson, J. L., & Mayville, E. (2003). The aberrant behavior checklist and the behavior problems inventory: Convergent and divergent validity. Research in Developmental Disabilities, 24, 391-404], support the existence of relationships between specific subscales of the two instruments in addition to an overall association between total scores related to general severity of behavioral disturbance.

  20. The anatomical problem posed by brain complexity and size: a potential solution.

    PubMed

    DeFelipe, Javier

    2015-01-01

    Over the years the field of neuroanatomy has evolved considerably, but unraveling the extraordinary structural and functional complexity of the brain seems to be an unattainable goal, partly due to the fact that it is only possible to obtain an imprecise connection matrix of the brain. The reasons why reaching such a goal appears almost impossible to date are discussed here, together with suggestions of how we could overcome this anatomical problem by establishing new methodologies to study the brain and by promoting interdisciplinary collaboration. Generating a realistic computational model seems to be the solution rather than attempting to fully reconstruct the whole brain or a particular brain region.

  1. The anatomical problem posed by brain complexity and size: a potential solution

    PubMed Central

    DeFelipe, Javier

    2015-01-01

    Over the years the field of neuroanatomy has evolved considerably, but unraveling the extraordinary structural and functional complexity of the brain seems to be an unattainable goal, partly due to the fact that it is only possible to obtain an imprecise connection matrix of the brain. The reasons why reaching such a goal appears almost impossible to date are discussed here, together with suggestions of how we could overcome this anatomical problem by establishing new methodologies to study the brain and by promoting interdisciplinary collaboration. Generating a realistic computational model seems to be the solution rather than attempting to fully reconstruct the whole brain or a particular brain region. PMID:26347617

  2. A new approach to the solution of boundary value problems involving complex configurations

    NASA Technical Reports Server (NTRS)

    Rubbert, P. E.; Bussoletti, J. E.; Johnson, F. T.; Sidwell, K. W.; Rowe, W. S.; Samant, S. S.; Sengupta, G.; Weatherill, W. H.; Burkhart, R. H.; Woo, A. C.

    1986-01-01

    A new approach for solving certain types of boundary value problems about complex configurations is presented. Numerical algorithms from such diverse fields as finite elements, preconditioned Krylov subspace methods, discrete Fourier analysis, and integral equations are combined to take advantage of the memory, speed and architecture of current and emerging supercomputers. Although the approach has application to many branches of computational physics, the present effort is concentrated in areas of Computational Fluid Dynamics (CFD) such as steady nonlinear aerodynamics, time harmonic unsteady aerodynamics, and aeroacoustics. The most significant attribute of the approach is that it can handle truly arbitrary boundary geometries and eliminates the difficult task of generating surface fitted grids.

  3. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    SciTech Connect

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-12-10

    One of the most important factors in the operations of many corporations today is to maximize profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities by a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.
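
    As a sketch of the stochastic-simulation ingredient, the toy model below evaluates an age-replacement policy (preventive replacement at a fixed interval, corrective replacement on failure) by Monte Carlo under an assumed Weibull lifetime. All costs and parameters are illustrative, and the paper's genetic-algorithm and fuzzy decision layers are omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_cost(pm_interval, horizon=1000.0, shape=2.5, scale=120.0,
                  cm_cost=50.0, pm_cost=10.0, n_runs=400):
    """Monte Carlo estimate of total maintenance cost over the horizon for
    an age-replacement policy (illustrative parameter values)."""
    costs = []
    for _ in range(n_runs):
        t, cost = 0.0, 0.0
        while t < horizon:
            life = scale * rng.weibull(shape)      # time to next failure
            if life < pm_interval:
                t += life;        cost += cm_cost  # corrective maintenance
            else:
                t += pm_interval; cost += pm_cost  # preventive maintenance
        costs.append(cost)                         # renewal after each replacement
    return np.mean(costs)

for interval in (40, 80, 120, 1e9):                # 1e9 ~ "run to failure"
    print(interval, round(simulate_cost(interval), 1))
```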

  4. Analyzing Katana referral hospital as a complex adaptive system: agents, interactions and adaptation to a changing environment.

    PubMed

    Karemere, Hermès; Ribesse, Nathalie; Marchal, Bruno; Macq, Jean

    2015-01-01

    This study deals with the adaptation of Katana referral hospital in the Eastern Democratic Republic of Congo to a changing environment that has been affected for more than a decade by intermittent armed conflicts. Its objective is to generate theoretical propositions for approaching the analysis of hospital governance differently, with the aim of assessing hospital performance and how to improve it. The methodology is a case study using mixed methods (qualitative and quantitative) for data collection. It uses (1) hospital data to measure the output of the hospital, (2) a literature review to identify, among others, events and interventions recorded in the history of the hospital during the study period, and (3) information from individual interviews to validate the interpretation of the results of the previous two sources of data and to understand the responsiveness of the referral hospital's management team during times of change. The study yields four theoretical propositions: (1) interaction between key agents is a positive force driving adaptation if the actors share the same vision; (2) the strength of the interaction between agents is largely based on the nature of institutional arrangements, which in turn are shaped by the actors themselves; (3) the owner and the management team play a decisive role in the implementation of effective institutional arrangements and the establishment of positive interactions between agents; (4) analyzing the recipient population's perception of the health services provided allows the health services offered to be better tailored and adapted to the population's needs and expectations. The research shows that providing financial and technical support is not enough for a hospital to operate and adapt to a changing environment; considering that it is a complex adaptive system, it must also be animated, and this animation is nothing other than the induction of positive interaction between agents.

  5. SVD-GFD scheme to simulate complex moving body problems in 3D space

    NASA Astrophysics Data System (ADS)

    Wang, X. Y.; Yu, P.; Yeo, K. S.; Khoo, B. C.

    2010-03-01

    The present paper presents a hybrid meshfree-and-Cartesian grid method for simulating moving body incompressible viscous flow problems in 3D space. The method combines the merits of cost-efficient and accurate conventional finite difference approximations on Cartesian grids with the geometric freedom of generalized finite difference (GFD) approximations on meshfree grids. Error minimization in GFD is carried out by singular value decomposition (SVD). The Arbitrary Lagrangian-Eulerian (ALE) form of the Navier-Stokes equations on convecting nodes is integrated by a fractional-step projection method. The present hybrid grid method employs a relatively simple mode of nodal administration. Nevertheless, it has the geometrical flexibility of unstructured mesh-based finite-volume and finite element methods. Boundary conditions are precisely implemented on boundary nodes without interpolation. The present scheme is validated by a moving patch consistency test as well as against published results for 3D moving body problems. Finally, the method is applied on low-Reynolds number flapping wing applications, where large boundary motions are involved. The present study demonstrates the potential of the present hybrid meshfree-and-Cartesian grid scheme for solving complex moving body problems in 3D.
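
    A minimal sketch of the GFD building block: derivative estimates at a node are obtained from a scattered neighbor cloud by least-squares fitting a second-order Taylor expansion, solved via SVD (numpy's lstsq). The weighting and the neighbor cloud here are illustrative; the paper embeds this kernel in a Cartesian/meshfree hybrid with ALE time stepping.

```python
import numpy as np

rng = np.random.default_rng(3)

def gfd_derivatives(x0, pts, u0, u):
    """Generalized finite differences at x0 from scattered neighbors:
    weighted least-squares fit of a 2nd-order Taylor expansion, solved
    through SVD (numpy.linalg.lstsq) for robustness on irregular clouds."""
    d = pts - x0                                   # offsets, shape (n, 2)
    dx, dy = d[:, 0], d[:, 1]
    # unknowns: [u_x, u_y, u_xx, u_xy, u_yy]
    A = np.column_stack([dx, dy, dx**2 / 2, dx * dy, dy**2 / 2])
    w = 1.0 / np.linalg.norm(d, axis=1)**2         # closer nodes weigh more
    coef, *_ = np.linalg.lstsq(A * w[:, None], (u - u0) * w, rcond=None)
    return coef

x0 = np.array([0.3, 0.4])
pts = x0 + 0.05 * rng.normal(size=(12, 2))         # irregular neighbor cloud
f = lambda p: np.sin(p[..., 0]) * np.cos(p[..., 1])
ux, uy, uxx, uxy, uyy = gfd_derivatives(x0, pts, f(x0), f(pts))
print(ux, "vs exact", np.cos(x0[0]) * np.cos(x0[1]))
```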

  6. Exploring the complexity of inquiry learning in an open-ended problem space

    NASA Astrophysics Data System (ADS)

    Clarke, Jody

    Data-gathering and problem identification are key components of scientific inquiry. However, few researchers have studied how students learn these skills because historically this required a time-consuming, complicated method of capturing the details of learners' data-gathering processes. Nor are classroom settings authentic contexts in which students could exhibit problem identification skills parallel to those involved in deconstructing complex real world situations. In this study of middle school students, because of my access to an innovative technology, I simulated a disease outbreak in a virtual community as a complicated, authentic problem. As students worked through the curriculum in the virtual world, their time-stamped actions were stored by the computer in event-logs. Using these records, I tracked in detail how the student scientists made sense of the complexity they faced and how they identified and investigated the problem using science-inquiry skills. To describe the degree to which students' data collection narrowed and focused on a specific disease over time, I developed a rubric and automated the coding of records in the event-logs. I measured the ongoing development of the students' "systematicity" in investigating the disease outbreak. I demonstrated that coding event-logs is an effective yet non-intrusive way of collecting and parsing detailed information about students' behaviors in real time in an authentic setting. My principal research question was "Do students who are more thoughtful about their inquiry prior to entry into the curriculum demonstrate increased systematicity in their inquiry behavior during the experience, by narrowing the focus of their data-gathering more rapidly than students who enter with lower levels of thoughtfulness about inquiry?" My sample consisted of 403 middle-school students from public schools in the US who volunteered to participate in the River City Project in spring 2008. Contrary to my hypothesis, I found

  7. The Syllis gracilis species complex: A molecular approach to a difficult taxonomic problem (Annelida, Syllidae).

    PubMed

    Álvarez-Campos, Patricia; Giribet, Gonzalo; Riesgo, Ana

    2017-04-01

    Syllis gracilis is an emblematic member of the subfamily Syllinae (Syllidae, Annelida), which inhabits shallow, temperate coastal waters and can be found on algae, coral rubble, and sponges. Their distinctive ypsiloid chaetae, usually found in specimens from populations all around the world, led to the consideration of the species as cosmopolitan, even though four other species have similar chaetae: Syllis magellanica, S. picta, S. mayeri and S. ypsiloides. The discovery of deeply divergent lineages in the Mediterranean Sea, that were morphologically similar, questioned the cosmopolitanism of S. gracilis and suggested the possibility of it being a species complex. In order to assess the speciation patterns within the putative S. gracilis complex, we undertook species delimitation and phylogenetic analyses on 61 specimens morphologically ascribed to Syllis gracilis and closely related species using a multilocus molecular dataset (two mitochondrial and two nuclear markers). Our results suggest high levels of genetic differentiation between the S. gracilis populations analyzed, some of which have morphologically distinctive features. Five to eight distinct lineages (depending on the analysis) were identified, all with geographically restricted distributions. Although the presence of ypsiloid chaetae has been traditionally considered the main character to identify S. gracilis, we conclude that this feature is homoplastic. Instead, we propose that characters such as the degree of fusion of blades and shafts in chaetae, the morphology of the posterior chaetae or the animal color pattern should be considered to differentiate lineages within the S. gracilis species complex. Our study does not support the cosmopolitanism of S. gracilis, and instead provides morphological and molecular evidence of the existence of a complex of pseudo-cryptic species.

  8. Combined Parameter and State Estimation Problem in a Complex Domain: RF Hyperthermia Treatment Using Nanoparticles

    NASA Astrophysics Data System (ADS)

    Bermeo Varon, L. A.; Orlande, H. R. B.; Eliçabe, G. E.

    2016-09-01

    Particle filter methods have been widely used to solve inverse problems by sequential Bayesian inference in dynamic models, simultaneously estimating sequential state variables and fixed model parameters. These methods approximate the sequence of probability distributions of interest using a large set of random samples, in the presence of uncertainties in the model, the measurements and the parameters. In this paper the main focus is the solution of the combined parameter and state estimation problem in radiofrequency hyperthermia with nanoparticles in a complex domain. This domain contains different tissues, such as muscle, pancreas, lungs and small intestine, as well as a tumor loaded with iron oxide nanoparticles. The results indicate that excellent agreement between estimated and exact values is obtained.
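
    A minimal sketch of combined state-parameter estimation with a bootstrap particle filter on a toy scalar model (not the paper's bioheat model): the unknown parameter is appended to every particle and given a small artificial jitter so the filter can refine it over time.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy model: x_{t+1} = a*x_t + w,  y_t = x_t + v;  a is unknown.
a_true, q, r, T, N = 0.9, 0.1, 0.2, 100, 2000
x, ys = 1.0, []
for _ in range(T):
    x = a_true * x + q * rng.normal()
    ys.append(x + r * rng.normal())

px = rng.normal(1.0, 1.0, N)                     # state particles
pa = rng.uniform(0.0, 1.2, N)                    # parameter particles
for y in ys:
    pa = pa + 0.005 * rng.normal(size=N)         # artificial parameter jitter
    px = pa * px + q * rng.normal(size=N)        # propagate state
    w = np.exp(-0.5 * ((y - px) / r) ** 2)       # likelihood weights
    w /= w.sum()
    idx = rng.choice(N, N, p=w)                  # multinomial resampling
    px, pa = px[idx], pa[idx]

print("a estimate:", pa.mean(), " (true 0.9)")
```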

  9. Communication: overcoming the root search problem in complex quantum trajectory calculations.

    PubMed

    Zamstein, Noa; Tannor, David J

    2014-01-28

    Three new developments are presented regarding the semiclassical coherent state propagator. First, we present a conceptually different derivation of Huber and Heller's method for identifying complex root trajectories and their equations of motion [D. Huber and E. J. Heller, J. Chem. Phys. 87, 5302 (1987)]. Our method proceeds directly from the time-dependent Schrödinger equation and therefore allows various generalizations of the formalism. Second, we obtain an analytic expression for the semiclassical coherent state propagator. We show that the prefactor can be expressed in a form that requires solving significantly fewer equations of motion than in alternative expressions. Third, the semiclassical coherent state propagator is used to formulate a final value representation of the time-dependent wavefunction that avoids the root search, eliminates problems with caustics and automatically includes interference. We present numerical results for the 1D Morse oscillator showing that the method may become an attractive alternative to existing semiclassical approaches.

  10. Diagnostic and management problems in a complex case of connective tissue disease.

    PubMed

    Yeap, S S; Deighton, C M; Powell, R J; Read, R C; Finch, R G

    1995-12-01

    A 28-year-old Nigerian woman presented with persistent pyrexia, marked pruritus, eosinophilia, myalgias, flitting arthralgias, serositis and massive splenomegaly. Intensive investigation for an infective or neoplastic aetiology proved negative. Empirical treatment for helminthic infections and tuberculosis was unhelpful. Although there were no specific clues to suggest an underlying connective tissue disease, a trial of steroids and azathioprine was introduced, with no obvious response. Her condition deteriorated to a point where it was decided that intravenous immunosuppressive therapy was needed and subsequently, her condition improved remarkably. This patient illustrates the problems in the diagnosis and management of complex disorders, particularly when classical tests for connective tissue diseases are absent. Also, we would like to report that marked pruritus can be associated with connective tissue disease.

  11. Perfect absorption in Schrödinger-like problems using non-equidistant complex grids

    NASA Astrophysics Data System (ADS)

    Weinmüller, Markus; Weinmüller, Michael; Rohland, Jonathan; Scrinzi, Armin

    2017-03-01

    Two non-equidistant grid implementations of infinite-range exterior complex scaling are introduced that allow for perfect absorption in the time-dependent Schrödinger equation. Finite-element discrete-variable discretizations provide absorption as efficient as that of the corresponding finite-element discretizations. This finding is at variance with results reported in the literature [L. Tao et al., Phys. Rev. A 48, 063419 (2009)]. For finite differences, a new class of generalized Q-point schemes for non-equidistant grids is derived. Convergence of absorption is exponential, ∼ Δx^(Q-1), and numerically robust. Local relative errors ≲ 10⁻⁹ are achieved in a standard problem of strong-field ionization.
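
    Finite-difference weights on a non-equidistant Q-point stencil can be obtained from the usual Taylor (Vandermonde) conditions; the sketch below shows that generic construction, not the paper's specific generalized schemes or its complex-scaled grids.

```python
import numpy as np
from math import factorial

def fd_weights(nodes, x0, m):
    """Weights w_j with sum_j w_j f(x_j) ~ f^(m)(x0) on an arbitrary
    Q-point stencil, from the Taylor conditions
    sum_j w_j (x_j - x0)^k / k! = delta_{k,m},  k = 0..Q-1."""
    d = np.asarray(nodes, dtype=float) - x0
    q = len(d)
    A = np.array([d**k / factorial(k) for k in range(q)])
    rhs = np.zeros(q)
    rhs[m] = 1.0
    return np.linalg.solve(A, rhs)

# 5-point second-derivative weights on a stretched (non-equidistant) grid
grid = np.array([-1.0, -0.4, 0.0, 0.7, 1.5])
w = fd_weights(grid, 0.0, 2)
print(w @ np.exp(grid), "vs exact f''(0) =", 1.0)
```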

  12. Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?

    PubMed

    McDonald, Ruth

    2014-10-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving 'leadership'. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts.

  13. Class II major histocompatibility complex tetramer staining: progress, problems, and prospects

    PubMed Central

    Vollers, Sabrina S; Stern, Lawrence J

    2008-01-01

    The use of major histocompatibility complex (MHC) tetramers in the detection and analysis of antigen-specific T cells has become more widespread since its introduction 11 years ago. Early challenges in the application of tetramer staining to CD4+ T cells centred around difficulties in the expression of various class II MHC allelic variants and the detection of low-frequency T cells in mixed populations. As many of the technical obstacles to class II MHC tetramer staining have been overcome, the focus has returned to uncertainties concerning how oligomer valency and T-cell receptor/MHC affinity affect tetramer binding. Such issues have become more important with an increase in the number of studies relying on direct ex vivo analysis of antigen-specific CD4+ T cells. In this review we discuss which problems in class II MHC tetramer staining have been solved to date, and which matters remain to be considered. PMID:18251991

  14. Communication: Overcoming the root search problem in complex quantum trajectory calculations

    SciTech Connect

    Zamstein, Noa; Tannor, David J.

    2014-01-28

    Three new developments are presented regarding the semiclassical coherent state propagator. First, we present a conceptually different derivation of Huber and Heller's method for identifying complex root trajectories and their equations of motion [D. Huber and E. J. Heller, J. Chem. Phys. 87, 5302 (1987)]. Our method proceeds directly from the time-dependent Schrödinger equation and therefore allows various generalizations of the formalism. Second, we obtain an analytic expression for the semiclassical coherent state propagator. We show that the prefactor can be expressed in a form that requires solving significantly fewer equations of motion than in alternative expressions. Third, the semiclassical coherent state propagator is used to formulate a final value representation of the time-dependent wavefunction that avoids the root search, eliminates problems with caustics and automatically includes interference. We present numerical results for the 1D Morse oscillator showing that the method may become an attractive alternative to existing semiclassical approaches.

  15. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  16. Seeing around a Ball: Complex, Technology-Based Problems in Calculus with Applications in Science and Engineering-Redux

    ERIC Educational Resources Information Center

    Winkel, Brian

    2008-01-01

    A complex technology-based problem in visualization and computation for students in calculus is presented. Strategies are shown for its solution and the opportunities for students to put together sequences of concepts and skills to build for success are highlighted. The problem itself involves placing an object under water in order to actually see…

  17. An Investigation of the Interrelationships between Motivation, Engagement, and Complex Problem Solving in Game-Based Learning

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Law, Victor; Ifenthaler, Dirk; Ge, Xun; Miller, Raymond

    2014-01-01

    Digital game-based learning, especially massively multiplayer online games, has been touted for its potential to promote student motivation and complex problem-solving competency development. However, current evidence is limited to anecdotal studies. The purpose of this empirical investigation is to examine the complex interplay between learners'…

  18. Enhancements of evolutionary algorithm for the complex requirements of a nurse scheduling problem

    NASA Astrophysics Data System (ADS)

    Tein, Lim Huai; Ramli, Razamin

    2014-12-01

    Over the years, nurse scheduling has been a noticeable problem, affected by the global nurse turnover crisis: the more dissatisfied nurses are with their working environment, the more likely they are to leave. The current undesirable work schedules are partly due to that working condition. Basically, there is a lack of complementarity between the head nurse's liability and the nurses' needs. In particular, given highly varied nurse preferences, a sophisticated challenge in nurse scheduling is the failure to encourage tolerance between both parties during shift assignment in real working scenarios. Inevitably, flexibility in shift assignment is hard to achieve while satisfying nurses' diverse requests and upholding imperative ward coverage. Hence, an Evolutionary Algorithm (EA) is proposed to cater for this complexity in a nurse scheduling problem (NSP). The restrictions of the EA are discussed and, thus, enhancements to the EA operators are suggested so that the EA has the characteristics of a flexible search. This paper considers three types of constraints, namely hard, semi-hard and soft constraints, which can be handled by the EA with enhanced parent selection and specialized mutation operators. These operators, and the EA as a whole, contribute to the efficiency of constraint handling, fitness computation and flexibility in the search, corresponding to the employment of exploration and exploitation principles.
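
    A minimal sketch of the penalty-based EA idea on a toy roster, with assumed coverage rules and preferences (the paper's enhanced operators are not reproduced): hard constraints dominate the fitness, tournament selection picks parents, and one-point crossover plus reassignment mutation generate offspring.

```python
import numpy as np

rng = np.random.default_rng(5)
N_NURSES, N_DAYS, SHIFTS = 8, 7, 3          # shift 0 = off, 1 = day, 2 = night
prefs = rng.integers(0, SHIFTS, (N_NURSES, N_DAYS))   # preferred shifts (toy)

def penalty(roster):
    hard = 0
    for d in range(N_DAYS):                  # hard: daily coverage (assumed rules)
        hard += max(0, 2 - (roster[:, d] == 1).sum())   # >= 2 on day shift
        hard += max(0, 1 - (roster[:, d] == 2).sum())   # >= 1 on night shift
    soft = (roster != prefs).sum()           # soft: honor preferences
    return 1000 * hard + soft                # hard constraints dominate

def evolve(pop_size=60, gens=300, p_mut=0.1):
    pop = rng.integers(0, SHIFTS, (pop_size, N_NURSES, N_DAYS))
    for _ in range(gens):
        cost = np.array([penalty(r) for r in pop])
        nxt = [pop[cost.argmin()].copy()]    # elitism
        while len(nxt) < pop_size:
            i, j = rng.choice(pop_size, 2), rng.choice(pop_size, 2)
            a = pop[i[cost[i].argmin()]]     # tournament parent selection
            b = pop[j[cost[j].argmin()]]
            cut = rng.integers(1, N_DAYS)    # one-point crossover on days
            child = np.concatenate([a[:, :cut], b[:, cut:]], axis=1)
            mask = rng.random(child.shape) < p_mut       # reassignment mutation
            child = np.where(mask, rng.integers(0, SHIFTS, child.shape), child)
            nxt.append(child)
        pop = np.array(nxt)
    best = min(pop, key=penalty)
    return best, penalty(best)

roster, cost = evolve()
print("final penalty:", cost)
```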

  19. Fibromyalgia and disability adjudication: No simple solutions to a complex problem

    PubMed Central

    Harth, Manfred; Nielson, Warren R

    2014-01-01

    BACKGROUND: Adjudication of disability claims related to fibromyalgia (FM) syndrome can be a challenging and complex process. A commentary published in the current issue of Pain Research & Management makes suggestions for improvement. The authors of the commentary contend that: previously and currently used criteria for the diagnosis of FM are irrelevant to clinical practice; the opinions of family physicians should supersede those of experts; there is little evidence that trauma can cause FM; no formal instruments are necessary to assess disability; and many FM patients on or applying for disability are exaggerating or malingering, and tests of symptom validity should be used to identify malingerers. OBJECTIVES: To assess the assertions made by Fitzcharles et al. METHODS: A narrative review of the available research literature was performed. RESULTS: Available diagnostic criteria should be used in a medicolegal context; family physicians are frequently uncertain about FM and/or biased; there is considerable evidence that trauma can be a cause of FM; it is essential to use validated instruments to assess functional impairment; and the available tests of physical effort and symptom validity are of uncertain value in identifying malingering in FM. CONCLUSIONS: The available evidence does not support many of the suggestions presented in the commentary. Caution is advised in adopting simple solutions for disability adjudication in FM because they are generally incompatible with the inherently complex nature of the problem. PMID:25479149

  20. Decision Analysis for Environmental Problems

    EPA Science Inventory

    Environmental management problems are often complex and uncertain. A formal process with proper guidance is needed to understand the issues, identify sources of disagreement, and analyze the major uncertainties in environmental problems. This course will present a process that fo...

  1. Exploring Corn-Ethanol As A Complex Problem To Teach Sustainability Concepts Across The Science-Business-Liberal Arts Curriculum

    NASA Astrophysics Data System (ADS)

    Oches, E. A.; Szymanski, D. W.; Snyder, B.; Gulati, G. J.; Davis, P. T.

    2012-12-01

    The highly interdisciplinary nature of sustainability presents pedagogic challenges when sustainability concepts are incorporated into traditional disciplinary courses. At Bentley University, where over 90 percent of students major in business disciplines, we have created a multidisciplinary course module centered on corn ethanol that explores a complex social, environmental, and economic problem and develops basic data analysis and analytical thinking skills in several courses spanning the natural, physical, and social sciences within the business curriculum. Through an NSF-CCLI grant, Bentley faculty from several disciplines participated in a summer workshop to define learning objectives, create course modules, and develop an assessment plan to enhance interdisciplinary sustainability teaching. The core instructional outcome was a data-rich exercise for all participating courses in which students plot and analyze multiple parameters of corn planted and harvested for various purposes including food (human), feed (animal), ethanol production, and commodities exchanged for the years 1960 to present. Students then evaluate patterns and trends in the data and hypothesize relationships among the plotted data and environmental, social, and economic drivers, responses, and unintended consequences. After the central data analysis activity, students explore corn ethanol production as it relates to core disciplinary concepts in their individual classes. For example, students in Environmental Chemistry produce ethanol using corn and sugar as feedstocks and compare the efficiency of each process, while learning about enzymes, fermentation, distillation, and other chemical principles. Principles of Geology students examine the effects of agricultural runoff on surface water quality associated with extracting greater agricultural yield from mid-continent croplands. The American Government course examines the role of political institutions, the political process, and various

  2. Subspace Iteration Method for Complex Eigenvalue Problems with Nonsymmetric Matrices in Aeroelastic System

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Lung, Shun-fat

    2009-01-01

    Modern airplane design is a multidisciplinary task which combines several disciplines, such as structures, aerodynamics, flight controls, and sometimes heat transfer. Historically, analytical and experimental investigations concerning the interaction of the elastic airframe with aerodynamic and inertia loads have been conducted during the design phase to determine the existence of aeroelastic instabilities, so-called flutter. With the advent and increased usage of flight control systems, there is also a likelihood of instabilities caused by the interaction of the flight control system and the aeroelastic response of the airplane, known as aeroservoelastic instabilities. An in-house code, MPASES (Ref. 1), modified from PASES (Ref. 2), is a general-purpose digital computer program for the analysis of the closed-loop stability problem. This program used subroutines given in the International Mathematical and Statistical Library (IMSL) (Ref. 3) to compute all of the real and/or complex conjugate pairs of eigenvalues of the Hessenberg matrix. For a high-fidelity configuration, these aeroelastic system matrices are large, and computing all eigenvalues is time consuming. A subspace iteration method (Ref. 4) for complex eigenvalue problems with nonsymmetric matrices has been formulated and incorporated into the modified program for aeroservoelastic stability (the MPASES code). The subspace iteration method solves only for the lowest p eigenvalues and corresponding eigenvectors for aeroelastic and aeroservoelastic analysis. In general, the selection of p ranges from 10 for a wing flutter analysis to 50 for an entire aircraft flutter analysis. The application of this newly incorporated code is an experiment known as the Aerostructures Test Wing (ATW), which was designed by the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center, Edwards, California to research aeroelastic instabilities. Specifically, this experiment was used to study an instability
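
    A minimal sketch of subspace iteration for a nonsymmetric matrix (not the MPASES implementation): inverse orthogonal iteration with a Rayleigh-Ritz projection returns the p eigenvalues of smallest magnitude, which may arrive in complex-conjugate pairs, without computing the full spectrum.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, qr, eig

def subspace_iteration(A, p, iters=200, seed=0):
    """Inverse subspace iteration with Rayleigh-Ritz for the p eigenvalues
    of A of smallest magnitude (possibly complex-conjugate pairs)."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    lu = lu_factor(A)                        # factor once, reuse every step
    Q, _ = qr(rng.normal(size=(n, p)), mode='economic')
    for _ in range(iters):
        Z = lu_solve(lu, Q)                  # one inverse-iteration step
        Q, _ = qr(Z, mode='economic')        # re-orthonormalize the basis
    H = Q.T @ A @ Q                          # small p x p Rayleigh-Ritz matrix
    vals, vecs = eig(H)
    return vals, Q @ vecs

A = np.random.default_rng(1).normal(size=(100, 100))
vals, _ = subspace_iteration(A, 6)
print(np.sort(np.abs(vals)))                 # six smallest |eigenvalues|
```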

  3. Computational issues in complex water-energy optimization problems: Time scales, parameterizations, objectives and algorithms

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of the computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step and solving each local sub-problem through very fast linear network programming algorithms; and (c) the substantial
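
    As a sketch of item (b), the toy below poses one linearized single-step water-energy allocation as a small linear program, solved with scipy's linprog; the one-reservoir topology, the conversion rate and the demands are assumed purely for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# One local sub-problem: a single monthly step where the (linearized)
# water-energy allocation is solved as a fast LP.
# Variables: x = [release_to_demand, release_through_turbine, spill,
#                 water_shortage, energy_shortage]
energy_rate = 0.35      # MWh per unit of turbined water (assumed constant)
inflow, storage = 80.0, 120.0
water_demand, energy_demand = 60.0, 25.0

c = np.array([0, 0, 0, 1.0, 1.0])               # minimize the two shortages
A_eq = np.array([[1, 0, 0, 1, 0],               # supply + shortage = water demand
                 [0, energy_rate, 0, 0, 1]])    # energy + shortage = energy demand
b_eq = np.array([water_demand, energy_demand])
A_ub = np.array([[1, 1, 1, 0, 0]])              # total release <= available water
b_ub = np.array([inflow + storage])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 5)
print("allocations:", res.x[:3], "shortages:", res.x[3:])
```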

  4. Can fuzzy logic bring complex problems into focus? Modeling imprecise factors in environmental policy

    SciTech Connect

    McKone, Thomas E.; Deshpande, Ashok W.

    2004-06-14

    In modeling complex environmental problems, we often fail to make precise statements about inputs and outcome. In this case the fuzzy logic method native to the human mind provides a useful way to get at these problems. Fuzzy logic represents a significant change in both the approach to and outcome of environmental evaluations. Risk assessment is currently based on the implicit premise that probability theory provides the necessary and sufficient tools for dealing with uncertainty and variability. The key advantage of fuzzy methods is the way they reflect the human mind in its remarkable ability to store and process information which is consistently imprecise, uncertain, and resistant to classification. Our case study illustrates the ability of fuzzy logic to integrate statistical measurements with imprecise health goals. But we submit that fuzzy logic and probability theory are complementary and not competitive. In the world of soft computing, fuzzy logic has been widely used and has often been the "smart" behind smart machines. But it will require more effort and case studies to establish its niche in risk assessment or other types of impact assessment. Although we often hear complaints about "bright lines," could we adapt to a system that relaxes these lines to fuzzy gradations? Would decision makers and the public accept expressions of water or air quality goals in linguistic terms with computed degrees of certainty? Resistance is likely. In many regions, such as the US and European Union, it is likely that both decision makers and members of the public are more comfortable with our current system in which government agencies avoid confronting uncertainties by setting guidelines that are crisp and often fail to communicate uncertainty. But some day perhaps a more comprehensive approach that includes exposure surveys, toxicological data, epidemiological studies coupled with fuzzy modeling will go a long way in resolving some of the conflict, divisiveness
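
    A minimal sketch of relaxing crisp "bright lines" into fuzzy gradations, with made-up limits: shoulder-shaped membership functions grade each contaminant, and the fuzzy AND (minimum) combines them into an overall degree of acceptability.

```python
import numpy as np

def left_shoulder(x, full, zero):
    """Membership 1 for x <= full, ramping linearly down to 0 at x >= zero."""
    return float(np.clip((zero - x) / (zero - full), 0.0, 1.0))

# Fuzzy restatement of two crisp limits (values assumed for illustration):
# arsenic fully acceptable below 8 ug/L, unacceptable above 12; nitrate
# fully acceptable below 40 mg/L, unacceptable above 55.
def water_quality(arsenic, nitrate):
    mu_as = left_shoulder(arsenic, 8.0, 12.0)
    mu_no3 = left_shoulder(nitrate, 40.0, 55.0)
    return min(mu_as, mu_no3)          # fuzzy AND: both goals must hold

print(water_quality(9.0, 42.0))        # 0.75: "mostly acceptable"
```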

  5. Accurate gradient approximation for complex interface problems in 3D by an improved coupling interface method

    SciTech Connect

    Shu, Yu-Chen; Chern, I-Liang; Chang, Chien C.

    2014-10-15

    Most elliptic interface solvers become complicated for complex interface problems at those “exceptional points” where there are not enough neighboring interior points for high order interpolation. Such complication increases especially in three dimensions. Usually, the solvers are thus reduced to low order accuracy. In this paper, we classify these exceptional points and propose two recipes to maintain order of accuracy there, aiming at improving the previous coupling interface method [26]. Yet the idea is also applicable to other interface solvers. The main idea is to have at least first order approximations for second order derivatives at those exceptional points. Recipe 1 is to use the finite difference approximation for the second order derivatives at a nearby interior grid point, whenever this is possible. Recipe 2 is to flip domain signatures and introduce a ghost state so that a second-order method can be applied. This ghost state is a smooth extension of the solution at the exceptional point from the other side of the interface. The original state is recovered by a post-processing using nearby states and jump conditions. The choice of recipes is determined by a classification scheme of the exceptional points. The method renders the solution and its gradient uniformly second-order accurate in the entire computed domain. Numerical examples are provided to illustrate the second order accuracy of the presently proposed method in approximating the gradients of the original states for some complex interfaces which we had tested previously in two and three dimensions, and a real molecule (1D63), which is double-helix shaped and composed of hundreds of atoms.

  6. What Does (and Doesn't) Make Analogical Problem Solving Easy? A Complexity-Theoretic Perspective

    ERIC Educational Resources Information Center

    Wareham, Todd; Evans, Patricia; van Rooij, Iris

    2011-01-01

    Solving new problems can be made easier if one can build on experiences with other problems one has already successfully solved. The ability to exploit earlier problem-solving experiences in solving new problems seems to require several cognitive sub-abilities. Minimally, one needs to be able to retrieve relevant knowledge of earlier solved…

  7. Managing the Complexity of Design Problems through Studio-Based Learning

    ERIC Educational Resources Information Center

    Cennamo, Katherine; Brandt, Carol; Scott, Brigitte; Douglas, Sarah; McGrath, Margarita; Reimer, Yolanda; Vernon, Mitzi

    2011-01-01

    The ill-structured nature of design problems makes them particularly challenging for problem-based learning. Studio-based learning (SBL), however, has much in common with problem-based learning and indeed has a long history of use in teaching students to solve design problems. The purpose of this ethnographic study of an industrial design class,…

  8. Leadership and leadership development in healthcare settings – a simplistic solution to complex problems?

    PubMed Central

    McDonald, Ruth

    2014-01-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving ‘leadership’. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts. PMID:25337595

  9. Effectiveness in learning complex problem solving and salivary ion indices of psychological stress and activation.

    PubMed

    Richter, P; Hinton, J W; Reinhold, S

    1998-11-01

    Following Hinton et al. (1992, Biol. Psychol. 33, 63-71) and Richter et al. (1995, Biol. Psychol. 39, 131-142) ionic concentration of [K+] in unstimulated saliva was predicted to rise with perceived challenge, while lowered [Na+] was expected when experiencing psychological stress (PS). Subjects had to learn an engaging complex problem-solving 'game', via positive and negative feed-back on three 'games' lasting 2.5-3.0 h overall. Comparisons were made between three groups: (1) high success; (2) partial success ('strugglers'); and (3) total failure to learn. Saliva was sampled after resting and after each of three 'games'. Successful learners had a significant rise in [K+] on the first 'game' followed by a significant fall, consistent with task-challenge reaction followed by fast autonomic adaptation with successful learning. The 'strugglers' [Na+] fell significantly over the 'games', indicating mineralocorticoid-induced PS response of Na+ reabsorption. The 'total failure' subjects had generally significantly higher [K+] than the successful ones, showing raised tonic sympathetic relative to parasympathetic activity--this outcome being interpreted from interference theories. The 'failures' also had significantly higher tonic [Na+] on 'games'--indicating low PS as predicted from McGrath's (1976) theory.

  10. LETTER TO THE EDITOR: Information complexity-based regularization parameter selection for solution of ill conditioned inverse problems

    NASA Astrophysics Data System (ADS)

    Urmanov, A. M.; Gribok, A. V.; Bozdogan, H.; Hines, J. W.; Uhrig, R. E.

    2002-04-01

    We propose an information complexity-based regularization parameter selection method for the solution of ill-conditioned inverse problems. The regularization parameter is selected to be the minimizer of the Kullback-Leibler (KL) distance between the unknown data-generating distribution and the fitted distribution. The KL distance is approximated by an information complexity criterion developed by Bozdogan. The method is not limited to the white Gaussian noise case: it can be extended to correlated and non-Gaussian noise, and it can also account for possible model misspecification. We demonstrate the performance of the proposed method on a test problem from Hansen's regularization tools.
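
    The selection pattern — compute the regularized solution for a grid of candidate parameters and keep the minimizer of a statistical criterion — can be sketched as follows. Because the ICOMP criterion is too involved to reproduce here, generalized cross-validation (GCV) stands in as the score; only the structure of the search is meant to match the letter, and all names and data are illustrative:

      import numpy as np

      # Tikhonov regularization via SVD; the parameter is chosen as the
      # minimizer of a criterion evaluated over a grid (GCV as a stand-in).
      def tikhonov_gcv(A, b, lams):
          U, s, Vt = np.linalg.svd(A, full_matrices=False)
          beta = U.T @ b
          n = len(b)
          best = None
          for lam in lams:
              f = s**2 / (s**2 + lam**2)             # filter factors
              x = Vt.T @ (f * beta / s)              # regularized solution
              resid = np.linalg.norm(b - A @ x)
              gcv = n * resid**2 / (n - f.sum())**2  # GCV score
              if best is None or gcv < best[0]:
                  best = (gcv, lam, x)
          return best[1], best[2]

      # Ill-conditioned test problem: smooth Gaussian kernel plus noisy data.
      n = 64
      t = np.linspace(0, 1, n)
      A = np.exp(-(t[:, None] - t[None, :])**2 / 0.01)
      x_true = np.sin(2 * np.pi * t)
      rng = np.random.default_rng(1)
      b = A @ x_true + 1e-3 * rng.standard_normal(n)

      lam, x = tikhonov_gcv(A, b, np.logspace(-8, 0, 50))
      print(lam, np.linalg.norm(x - x_true) / np.linalg.norm(x_true))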

  11. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Major projects everywhere present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  12. Untangling the Complex Needs of People Experiencing Gambling Problems and Homelessness

    ERIC Educational Resources Information Center

    Holdsworth, Louise; Tiyce, Margaret

    2013-01-01

    People with gambling problems are now recognised among those at increased risk of homelessness, and the link between housing and gambling problems has been identified as an area requiring further research. This paper discusses the findings of a qualitative study that explored the relationship between gambling problems and homelessness. Interviews…

  13. A framework to approach problems of forensic anthropology using complex networks

    NASA Astrophysics Data System (ADS)

    Caridi, Inés; Dorso, Claudio O.; Gallo, Pablo; Somigliana, Carlos

    2011-05-01

    We have developed a method to analyze and interpret emerging structures in a set of data which lacks some information. It was conceived to be applied to the problem of obtaining information about people who disappeared in the Argentine province of Tucumán from 1974 to 1981. Even though the military dictatorship in Argentina formally began in 1976 and lasted until 1983, the disappearance and assassination of people began some months earlier. During this period several circuits of Illegal Detention Centres (IDC) were set up in different locations all over the country. In these secret centres, disappeared people were illegally kept without any sort of constitutional guarantees, and later assassinated. Even today, the final destination of most of the disappeared people’s remains is still unknown. The fundamental hypothesis in this work is that a group of people with the same political affiliation whose disappearances were closely related in time and space shared the same place of captivity (the same IDC or circuit of IDCs). This hypothesis makes sense when applied to the systematic method of repression and disappearances which was actually launched in Tucumán, Argentina (2007) [11]. In this work, the missing individuals are identified as nodes of a network and connections are established among them based on the individuals’ attributes while they were alive, using rules to link them. In order to determine which rules are the most effective in defining the network, we use other kinds of knowledge available in this problem: previous results from the anthropological point of view (based on other sources of information, both oral and written, historical and anthropological data, etc.), and information about the place (one or more IDCs) where some people were kept during their captivity. For the best rules, a prediction about these people’s possible destination is assigned (one or more IDCs where they could have been kept), and the success of the
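
    The construction lends itself to a compact sketch: nodes carry life attributes, and an edge is added whenever a candidate linking rule fires. The attributes, the rule, and its thresholds below are illustrative assumptions, not the study's actual rules or data:

      from itertools import combinations

      # Nodes are people with attributes; an edge is added when a linking rule
      # fires, e.g. "same political affiliation and disappearances close in
      # time and space". Attribute names and thresholds are made up.
      people = [
          {"id": 1, "affiliation": "A", "day": 10, "place": (0.0, 0.0)},
          {"id": 2, "affiliation": "A", "day": 12, "place": (0.5, 0.2)},
          {"id": 3, "affiliation": "B", "day": 11, "place": (0.4, 0.1)},
          {"id": 4, "affiliation": "A", "day": 90, "place": (5.0, 5.0)},
      ]

      def rule(p, q, max_days=7, max_dist=2.0):
          close_time = abs(p["day"] - q["day"]) <= max_days
          dist = sum((a - b) ** 2 for a, b in zip(p["place"], q["place"])) ** 0.5
          return p["affiliation"] == q["affiliation"] and close_time and dist <= max_dist

      edges = [(p["id"], q["id"]) for p, q in combinations(people, 2) if rule(p, q)]
      print(edges)   # [(1, 2)] under these thresholds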

  14. The Average Network Flow Problem: Shortest Path and Minimum Cost Flow Formulations, Algorithms, Heuristics, and Complexity

    DTIC Science & Technology

    2012-09-13

    Excerpts: Computational Complexity; Transportation Mode Selection… allowing the decision maker to trade off increases in the value obtained versus the number of arcs used. Computational complexity proofs for the MASP… computational complexity, and transportation mode selection. Chapter 3 is a tutorial on Value Focused Thinking for Supply Chain Applications

  15. Does Visualization Enhance Complex Problem Solving? The Effect of Causal Mapping on Performance in the Computer-Based Microworld Tailorshop

    ERIC Educational Resources Information Center

    Öllinger, Michael; Hammon, Stephanie; von Grundherr, Michael; Funke, Joachim

    2015-01-01

    Causal mapping is often recognized as a technique to support strategic decisions and actions in complex problem situations. Such drawing of causal structures is supposed to particularly foster the understanding of the interaction of the various system elements and to further encourage holistic thinking. It builds on the idea that humans make use…

  16. Learning by Preparing to Teach: Fostering Self-Regulatory Processes and Achievement during Complex Mathematics Problem Solving

    ERIC Educational Resources Information Center

    Muis, Krista R.; Psaradellis, Cynthia; Chevrier, Marianne; Di Leo, Ivana; Lajoie, Susanne P.

    2016-01-01

    We developed an intervention based on the learning by teaching paradigm to foster self-regulatory processes and better learning outcomes during complex mathematics problem solving in a technology-rich learning environment. Seventy-eight elementary students were randomly assigned to 1 of 2 conditions: learning by preparing to teach, or learning for…

  17. Linking Complex Problem Solving and General Mental Ability to Career Advancement: Does a Transversal Skill Reveal Incremental Predictive Validity?

    ERIC Educational Resources Information Center

    Mainert, Jakob; Kretzschmar, André; Neubert, Jonas C.; Greiff, Samuel

    2015-01-01

    Transversal skills, such as complex problem solving (CPS) are viewed as central twenty-first-century skills. Recent empirical findings have already supported the importance of CPS for early academic advancement. We wanted to determine whether CPS could also contribute to the understanding of career advancement later in life. Towards this end, we…

  18. Validity of the MicroDYN Approach: Complex Problem Solving Predicts School Grades beyond Working Memory Capacity

    ERIC Educational Resources Information Center

    Schweizer, Fabian; Wustenberg, Sascha; Greiff, Samuel

    2013-01-01

    This study examines the validity of the complex problem solving (CPS) test MicroDYN by investigating a) the relation between its dimensions--rule identification (exploration strategy), rule knowledge (acquired knowledge), rule application (control performance)--and working memory capacity (WMC), and b) whether CPS predicts school grades in…

  19. DIFFERENTIAL ANALYZER

    DOEpatents

    Sorensen, E.G.; Gordon, C.M.

    1959-02-10

    Improvements in analog computing machines of the class capable of evaluating differential equations, commonly termed differential analyzers, are described. In general form, the analyzer embodies a plurality of basic computer mechanisms for performing integration, multiplication, and addition, and means for directing the result of any one operation to another computer mechanism performing a further operation. In the device, numerical quantities are represented by the rotation of shafts, or the electrical equivalent of shafts.
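
    The routing of one unit's output into another is the essence of such machines. A tiny digital sketch, chaining two "integrators" to evaluate y'' = -y (purely illustrative; the patented device did this with rotating shafts, not code):

      # Two chained "integrator" units evaluating y'' = -y, i.e. y = cos(t).
      # The multiplier forms y'' from y; each integrator accumulates its input,
      # a digital analogue of a shaft-driven wheel-and-disc integrator.
      dt, t = 0.001, 0.0
      y, dy = 1.0, 0.0                 # y(0) = 1, y'(0) = 0
      while t < 3.14159:
          ddy = -y                     # "multiplier" unit forms y'' from y
          dy += ddy * dt               # first integrator: y'' -> y'
          y += dy * dt                 # second integrator: y' -> y
          t += dt
      print(y)                         # close to cos(pi) = -1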

  20. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. The spectral data are then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate the concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.
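
    The comparison step is, in essence, a linear least-squares fit under the Beer-Lambert law: absorbances across many wavelengths are modeled as a combination of per-substance calibration spectra. A minimal sketch, with synthetic data standing in for the instrument's calibration files:

      import numpy as np

      # Beer-Lambert: the measured absorbance vector a (one entry per
      # wavelength) is approximately E @ c, where column j of E is the
      # calibration spectrum of substance j and c holds the unknown
      # concentrations. Recover c by least squares. All data are synthetic.
      rng = np.random.default_rng(0)
      E = rng.uniform(0.1, 1.0, size=(200, 3))   # 200 wavelengths, 3 substances
      c_true = np.array([0.5, 1.2, 0.3])
      a = E @ c_true + 0.01 * rng.standard_normal(200)  # noisy measurement

      c_est, *_ = np.linalg.lstsq(E, a, rcond=None)
      print(c_est)   # close to c_true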

  1. Blood Analyzer

    NASA Technical Reports Server (NTRS)

    1992-01-01

    In the 1970's, NASA provided funding for development of an automatic blood analyzer for Skylab at the Oak Ridge National Laboratory (ORNL). ORNL devised "dynamic loading," which employed a spinning rotor to load, transfer, and analyze blood samples by centrifugal processing. A refined, commercial version of the system was produced by ABAXIS and is marketed as portable ABAXIS MiniLab MCA. Used in a doctor's office, the equipment can perform 80 to 100 chemical blood tests on a single drop of blood and report results in five minutes. Further development is anticipated.

  2. Pupils' Problem-Solving Processes in a Complex Computerized Learning Environment.

    ERIC Educational Resources Information Center

    Suomala, Jyrki; Alajaaski, Jarkko

    2002-01-01

    Describes a study that examined fifth-grade Finnish pupils' problem-solving processes in a LEGO/Logo technology-based learning environment. Results indicate that learning model and gender account for group differences in problem solving processes, and are interpreted as supporting the validity of discovery learning. (Author/LRW)

  3. Complex Problem Solving in Radiologic Technology: Understanding the Roles of Experience, Reflective Judgment, and Workplace Culture

    ERIC Educational Resources Information Center

    Yates, Jennifer L.

    2011-01-01

    The purpose of this research study was to explore the process of learning and development of problem solving skills in radiologic technologists. The researcher sought to understand the nature of difficult problems encountered in clinical practice, to identify specific learning practices leading to the development of professional expertise, and to…

  4. Defibrillator analyzers.

    PubMed

    1999-12-01

    Defibrillator analyzers automate the inspection and preventive maintenance (IPM) testing of defibrillators. They need to be able to test at least four basic defibrillator performance characteristics: discharge energy, synchronized-mode operation, automated external defibrillation, and ECG monitoring. We prefer that they also be able to test a defibrillator's external noninvasive pacing function--but this is not essential if a facility already has a pacemaker analyzer that can perform this testing. In this Evaluation, we tested seven defibrillator analyzers from six suppliers. All seven units accurately measure the energies of a variety of discharge waveforms over a wide range of energy levels--from 1 J for use in a neonatal intensive care unit to 360 J for use on adult patients requiring maximum discharge energy. Most of the analyzers are easy to use. However, only three of the evaluated units could perform the full range of defibrillator tests that we prefer. We rated these units Acceptable--Preferred. Three more units could perform four of the five tests; they could not test the pacing feature of a defibrillator. These units were rated Acceptable. The seventh unit could perform only discharge energy testing and synchronized-mode testing and was difficult to use. We rated that unit Acceptable--Not Recommended.

  5. The Role of Prior Knowledge and Problem Contexts in Students' Explanations of Complex System

    ERIC Educational Resources Information Center

    Barth-Cohen, Lauren April

    2012-01-01

    The purpose of this dissertation is to study students' competencies in generating scientific explanations within the domain of complex systems, an interdisciplinary area in which students tend to have difficulties. While considering students' developing explanations of how complex systems work, I investigate the role of prior knowledge…

  6. Contextual approach to technology assessment: Implications for one-factor fix solutions to complex social problems

    NASA Technical Reports Server (NTRS)

    Mayo, L. H.

    1975-01-01

    The contextual approach is discussed, which undertakes to demonstrate that technology assessment assists in identifying the full range of implications of taking a particular action and facilitates the consideration of alternative means by which the total affected social problem context might be changed by available project options. It is found that the social impacts of an application on participants, institutions, processes, and social interests, and the accompanying interactions, may not only induce modifications in the problem context delineated for examination with respect to the design, operations, regulation, and use of the posited application, but may also affect related social problem contexts.

  7. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting dielectric sensor to determine nutrient levels and analyze plant nutrient solutions in CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long-duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to changes in the dielectric properties of materials, at levels much lower than conventional sensors can detect. The analyzer has exceptional capabilities for predicting the composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications such as agricultural products, petrochemicals, and food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.

  8. Atmosphere Analyzer

    NASA Technical Reports Server (NTRS)

    1982-01-01

    California Measurements, Inc.'s model PC-2 Aerosol Particle Analyzer is produced in both airborne and ground-use versions. Originating from NASA technology, it is a quick and accurate method of detecting minute mass loadings on a quartz crystal, and it offers utility as a highly sensitive detector of fine particles suspended in air. When combined with a suitable air delivery system, it provides immediate information on the size distribution and mass concentrations of aerosols. William Chiang obtained a NASA license for multiple crystal oscillator technology and initially developed a particle analyzer for NASA use with Langley Research Center assistance. Later his company produced the modified PC-2 for commercial applications. Brunswick Corporation uses the device for atmospheric research and in studies of smoke particles in fires. The PC-2 is used by pharmaceutical and chemical companies in research on inhalation toxicology and environmental health. It is also useful in testing various filters for safety masks and nuclear installations.

  9. Oxygen analyzer

    DOEpatents

    Benner, William H.

    1986-01-01

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  10. Oxygen analyzer

    DOEpatents

    Benner, W.H.

    1984-05-08

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N₂), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N₂, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  11. MULTICHANNEL ANALYZER

    DOEpatents

    Kelley, G.G.

    1959-11-10

    A multichannel pulse analyzer having several window amplifiers, each amplifier serving one group of channels, with a single fast pulse-lengthener and a single novel interrogation circuit serving all channels is described. A pulse followed too closely timewise by another pulse is disregarded by the interrogation circuit to prevent errors due to pulse pileup. The window amplifiers are connected to the pulse lengthener output, rather than the linear amplifier output, so need not have the fast response characteristic formerly required.

  12. [Mediative behavior therapy in complex behavioral problems. Illustrative cases from a psychogeriatric nursing home].

    PubMed

    Geelen, R; Bleijenberg, G

    1999-04-01

    An application in a psychogeriatric nursing home. This article describes the application of mediative behaviour therapy in a psychogeriatric nursing home. Behavioural interventions carried out by the nursing team addressed a variety of problems: quarreling between an institutionalised woman and her visiting husband, the husband's complaints about the staff to team members of another department, and a patient who let herself drop on the floor about once a week. Special regard is given to the analysis of the problems and the learning of appropriate responses by team members, as well as to changing their cognitions and emotions about the problem behaviours. A meaningful reduction of the problem behaviours and of the burden experienced by team members was achieved.

  13. Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A miniature gas chromatograph, a system which separates a gaseous mixture into its components and measures the concentration of the individual gases, was designed for the Viking Lander. The technology was further developed, under National Institute for Occupational Safety and Health (NIOSH) sponsorship and with funding from Ames Research Center/Stanford, into a toxic gas leak detection device. Three researchers on the project later formed Microsensor Technology, Inc. to commercialize the product. It is a battery-powered system consisting of a sensing wand connected to a computerized analyzer. Marketed as the Michromonitor 500, it has a wide range of applications.

  14. Contamination Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  15. Analyzing Orientations

    NASA Astrophysics Data System (ADS)

    Ruggles, Clive L. N.

    Archaeoastronomical field survey typically involves the measurement of structural orientations (i.e., orientations along and between built structures) in relation to the visible landscape and particularly the surrounding horizon. This chapter focuses on the process of analyzing the astronomical potential of oriented structures, whether in the field or as a desktop appraisal, with the aim of establishing the archaeoastronomical "facts". It does not address questions of data selection (see instead Chap. 25, "Best Practice for Evaluating the Astronomical Significance of Archaeological Sites", 10.1007/978-1-4614-6141-8_25) or interpretation (see Chap. 24, "Nature and Analysis of Material Evidence Relevant to Archaeoastronomy", 10.1007/978-1-4614-6141-8_22). The main necessity is to determine the azimuth, horizon altitude, and declination in the direction "indicated" by any structural orientation. Normally, there are a range of possibilities, reflecting the various errors and uncertainties in estimating the intended (or, at least, the constructed) orientation, and in more formal approaches an attempt is made to assign a probability distribution extending over a spread of declinations. These probability distributions can then be cumulated in order to visualize and analyze the combined data from several orientations, so as to identify any consistent astronomical associations that can then be correlated with the declinations of particular astronomical objects or phenomena at any era in the past. The whole process raises various procedural and methodological issues and does not proceed in isolation from the consideration of corroborative data, which is essential in order to develop viable cultural interpretations.
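
    The central computation described — converting an indicated azimuth and horizon altitude into a declination — follows from the standard transformation between horizontal and equatorial coordinates. A minimal sketch, with my own variable names and with refraction corrections omitted:

      import math

      # Declination indicated by a structural orientation, from the standard
      # horizontal-to-equatorial transformation (azimuth measured from north):
      #     sin(dec) = sin(lat)*sin(alt) + cos(lat)*cos(alt)*cos(az)
      # Refraction and parallax corrections are omitted for brevity.
      def declination(azimuth_deg, altitude_deg, latitude_deg):
          A = math.radians(azimuth_deg)
          h = math.radians(altitude_deg)
          phi = math.radians(latitude_deg)
          s = math.sin(phi) * math.sin(h) + math.cos(phi) * math.cos(h) * math.cos(A)
          return math.degrees(math.asin(s))

      # An orientation toward azimuth 90 deg (due east) with a flat horizon,
      # seen from latitude 50 N, indicates declination ~0 (equinoctial sunrise).
      print(declination(90.0, 0.0, 50.0))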

  16. A Study of Theory U and Its Application to a Complex Japanese Maritime Self-Defense Force Problem

    DTIC Science & Technology

    2014-06-01

    …examples of large organizations employing systems thinking to solve problems exhibiting dynamic complexity with the help of coaching contracts with SoL… coaching programs for about 200 government agency executive officers from 2003 to 2006. SoL supports the agency's leaders to transform their… the MCH-101 was too busy, and thus tried to escape or just react only in superficial ways. The MSQ wanted to develop a proposal to solve the MCH

  17. Solving Hard Computational Problems Efficiently: Asymptotic Parametric Complexity 3-Coloring Algorithm

    PubMed Central

    Martín H., José Antonio

    2013-01-01

    Many practical problems in almost all scientific and technological disciplines have been classified as computationally hard (NP-hard or even NP-complete). In life sciences, combinatorial optimization problems frequently arise in molecular biology, e.g., genome sequencing; global alignment of multiple genomes; identifying siblings or discovery of dysregulated pathways. In almost all of these problems, there is the need for proving a hypothesis about certain property of an object that can be present if and only if it adopts some particular admissible structure (an NP-certificate) or be absent (no admissible structure), however, none of the standard approaches can discard the hypothesis when no solution can be found, since none can provide a proof that there is no admissible structure. This article presents an algorithm that introduces a novel type of solution method to “efficiently” solve the graph 3-coloring problem; an NP-complete problem. The proposed method provides certificates (proofs) in both cases: present or absent, so it is possible to accept or reject the hypothesis on the basis of a rigorous proof. It provides exact solutions and is polynomial-time (i.e., efficient) however parametric. The only requirement is sufficient computational power, which is controlled by the parameter α∈N. Nevertheless, here it is proved that the probability of requiring a value of α>k to obtain a solution for a random graph decreases exponentially: P(α>k)≤2^(-(k+1)), making tractable almost all problem instances. Thorough experimental analyses were performed. The algorithm was tested on random graphs, planar graphs and 4-regular planar graphs. The obtained experimental results are in accordance with the theoretical expected results. PMID:23349711

  18. Solving hard computational problems efficiently: asymptotic parametric complexity 3-coloring algorithm.

    PubMed

    Martín H, José Antonio

    2013-01-01

    Many practical problems in almost all scientific and technological disciplines have been classified as computationally hard (NP-hard or even NP-complete). In life sciences, combinatorial optimization problems frequently arise in molecular biology, e.g., genome sequencing; global alignment of multiple genomes; identifying siblings or discovery of dysregulated pathways. In almost all of these problems, there is the need for proving a hypothesis about certain property of an object that can be present if and only if it adopts some particular admissible structure (an NP-certificate) or be absent (no admissible structure), however, none of the standard approaches can discard the hypothesis when no solution can be found, since none can provide a proof that there is no admissible structure. This article presents an algorithm that introduces a novel type of solution method to "efficiently" solve the graph 3-coloring problem; an NP-complete problem. The proposed method provides certificates (proofs) in both cases: present or absent, so it is possible to accept or reject the hypothesis on the basis of a rigorous proof. It provides exact solutions and is polynomial-time (i.e., efficient) however parametric. The only requirement is sufficient computational power, which is controlled by the parameter α∈N. Nevertheless, here it is proved that the probability of requiring a value of α>k to obtain a solution for a random graph decreases exponentially: P(α>k)≤2^(-(k+1)), making tractable almost all problem instances. Thorough experimental analyses were performed. The algorithm was tested on random graphs, planar graphs and 4-regular planar graphs. The obtained experimental results are in accordance with the theoretical expected results.
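
    For concreteness, the problem the two records above address can be stated in a few lines of code. The classical backtracking search below finds a proper 3-coloring when one exists but, unlike the paper's parametric method, offers no succinct certificate of absence beyond exhausting the search tree; it is purely illustrative:

      # Plain backtracking search for a proper 3-coloring. This is the
      # classical exponential-time approach, NOT the paper's algorithm.
      def three_color(adj):
          """adj: dict vertex -> iterable of neighbors. Returns a
          vertex->color dict (colors 0,1,2) or None if none exists."""
          vertices = list(adj)
          colors = {}

          def backtrack(i):
              if i == len(vertices):
                  return True
              v = vertices[i]
              for c in range(3):
                  if all(colors.get(u) != c for u in adj[v]):
                      colors[v] = c
                      if backtrack(i + 1):
                          return True
                      del colors[v]
              return False

          return colors if backtrack(0) else None

      # A 5-cycle is 3-colorable; the complete graph K4 is not.
      cycle5 = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
      k4 = {i: [j for j in range(4) if j != i] for i in range(4)}
      print(three_color(cycle5))   # some proper coloring
      print(three_color(k4))       # None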

  19. Optical analyzer

    DOEpatents

    Hansen, A.D.

    1987-09-28

    An optical analyzer wherein a sample of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter is placed in a combustion tube, and light from a light source is passed through the sample. The temperature of the sample is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample is detected as the temperature is raised. A data processor, differentiator and a two pen recorder provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample. These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample. Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters. 7 figs.

  20. Speech analyzer

    NASA Technical Reports Server (NTRS)

    Lokerson, D. C. (Inventor)

    1977-01-01

    A speech signal is analyzed by applying the signal to formant filters which derive first, second and third signals respectively representing the frequency of the speech waveform in the first, second and third formants. A first pulse train having approximately a pulse rate representing the average frequency of the first formant is derived; second and third pulse trains having pulse rates respectively representing zero crossings of the second and third formants are derived. The first formant pulse train is derived by establishing N signal level bands, where N is an integer at least equal to two. Adjacent ones of the signal bands have common boundaries, each of which is a predetermined percentage of the peak level of a complete cycle of the speech waveform.
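
    The derivation of the second and third pulse trains from formant zero crossings has a direct digital analogue. A minimal sketch, with a synthetic stand-in for a formant-filter output (sample rate, frame length, and names are illustrative):

      import numpy as np

      # Pulse train from positive-going zero crossings of a band-limited
      # signal, a digital analogue of the patent's second- and third-formant
      # pulse trains. A 1200 Hz sine stands in for a formant-filter output.
      fs = 8000                                # sample rate, Hz
      t = np.arange(0, 0.1, 1 / fs)            # 100 ms frame
      formant = np.sin(2 * np.pi * 1200 * t)

      positive = formant >= 0
      pulses = np.flatnonzero(~positive[:-1] & positive[1:])  # -,+ transitions
      print(len(pulses) / t[-1])               # pulses/second, roughly 1200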

  1. Extension of the tridiagonal reduction (FEER) method for complex eigenvalue problems in NASTRAN

    NASA Technical Reports Server (NTRS)

    Newman, M.; Mann, F. I.

    1978-01-01

    As in the case of real eigenvalue analysis, the eigensolutions closest to a selected point in the eigenspectrum were extracted from a reduced, symmetric, tridiagonal eigenmatrix whose order was much lower than that of the full size problem. The reduction process was effected automatically, and thus avoided the arbitrary lumping of masses and other physical quantities at selected grid points. The statement of the algebraic eigenvalue problem admitted mass, damping, and stiffness matrices which were unrestricted in character, i.e., they might be real, symmetric or nonsymmetric, singular or nonsingular.
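
    The underlying strategy — extracting the eigensolutions nearest a selected point from a much smaller reduced (tridiagonal/Hessenberg) problem — survives in modern shift-invert Krylov solvers. A sketch using SciPy's ARPACK wrapper as a stand-in for NASTRAN's FEER module (the matrix and shift are illustrative):

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import eigs

      # Shift-invert extraction of the eigenvalues nearest a chosen point
      # sigma, via a reduced Krylov problem built internally by ARPACK.
      n = 1000
      A = diags([np.arange(1.0, n + 1.0)], [0], format="csc")  # eigenvalues 1..n

      sigma = 500.3                  # selected point in the eigenspectrum
      vals, vecs = eigs(A, k=4, sigma=sigma, which="LM")
      print(np.sort(vals.real))      # 499, 500, 501, 502 (nearest to 500.3)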

  2. Complexity and approximability for a problem of intersecting of proximity graphs with minimum number of equal disks

    NASA Astrophysics Data System (ADS)

    Kobylkin, Konstantin

    2016-10-01

    Computational complexity and approximability are studied for the problem of intersecting a set of straight line segments with the smallest-cardinality set of disks of fixed radii r > 0, where the set of segments forms a straight-line embedding of a possibly non-planar geometric graph. This problem arises in physical network security analysis for telecommunication, wireless and road networks represented by specific geometric graphs defined by Euclidean distances between their vertices (proximity graphs). It can be formulated as the known Hitting Set problem over a set of Euclidean r-neighbourhoods of segments. Although of interest, the computational complexity and approximability of Hitting Set over such structured sets of geometric objects have not received much attention in the literature. Strong NP-hardness of the problem is reported for special classes of proximity graphs, namely Delaunay triangulations, some of their connected subgraphs, half-θ6 graphs and non-planar unit disk graphs, and APX-hardness is given for non-planar geometric graphs at different scales of r with respect to the longest graph edge length. A simple constant-factor approximation algorithm is presented for the case where r is at the same scale as the longest edge length.

  3. The Complex Relationship between Students' Critical Thinking and Epistemological Beliefs in the Context of Problem Solving

    ERIC Educational Resources Information Center

    Hyytinen, Heidi; Holma, Katariina; Toom, Auli; Shavelson, Richard J.; Lindblom-Ylänne, Sari

    2014-01-01

    The study utilized a multi-method approach to explore the connection between critical thinking and epistemological beliefs in a specific problem-solving situation. Data drawn from a sample of ten third-year bioscience students were collected using a combination of a cognitive lab and a performance task from the Collegiate Learning Assessment…

  4. New ecology education: Preparing students for the complex human-environmental problems of dryland East Asia

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Present-day environmental problems of Dryland East Asia are serious, and future prospects look especially disconcerting owing to current trends in population growth and economic development. Land degradation and desertification, invasive species, biodiversity losses, toxic waste and air pollution, a...

  5. Convergent Validity of the Aberrant Behavior Checklist and Behavior Problems Inventory with People with Complex Needs

    ERIC Educational Resources Information Center

    Hill, Jennie; Powlitch, Stephanie; Furniss, Frederick

    2008-01-01

    The current study aimed to replicate and extend Rojahn et al. [Rojahn, J., Aman, M. G., Matson, J. L., & Mayville, E. (2003). "The aberrant behavior checklist and the behavior problems inventory: Convergent and divergent validity." "Research in Developmental Disabilities", 24, 391-404] by examining the convergent validity of the behavior problems…

  6. Analogize This! The Politics of Scale and the Problem of Substance in Complexity-Based Composition

    ERIC Educational Resources Information Center

    Roderick, Noah R.

    2012-01-01

    In light of recent enthusiasm in composition studies (and in the social sciences more broadly) for complexity theory and ecology, this article revisits the debate over how much composition studies can or should align itself with the natural sciences. For many in the discipline, the science debate--which was ignited in the 1970s, both by the…

  7. The Species Problem and the Value of Teaching and the Complexities of Species

    ERIC Educational Resources Information Center

    Chung, Carl

    2004-01-01

    Discussions of species taxa refer directly to a range of complex biological phenomena. Given these phenomena, biologists have developed, and continue to appeal to, a series of species concepts rather than a single clear definition, as each species concept tells part of the story or helps biologists to explain and understand a subset of…

  8. Simplifying the model of a complex heat-transfer system for solving the relay control problem

    NASA Astrophysics Data System (ADS)

    Shilin, A. A.; Bukreev, V. G.

    2014-09-01

    A method for approximating the high-dimensionality model of a complex heat-transfer system with time delay by a nonlinear second-order differential equation is proposed. The modeling results confirming adequacy of the nonlinear properties of the reduced and initial models and their correspondence to the controlled plant actual data are presented.

  9. Foucault as Complexity Theorist: Overcoming the Problems of Classical Philosophical Analysis

    ERIC Educational Resources Information Center

    Olssen, Mark

    2008-01-01

    This article explores the affinities and parallels between Foucault's Nietzschean view of history and models of complexity developed in the physical sciences in the twentieth century. It claims that Foucault's rejection of structuralism and Marxism can be explained as a consequence of his own approach which posits a radical ontology whereby the…

  10. Integral Solution for Diffraction Problems Involving Conducting Surfaces with Complex Geometries. 1. Theory

    DTIC Science & Technology

    1988-02-01

    …involving conducting surfaces with complex geometries. I. Theory. Mohamed F. El-Hewle and Richard I. Cook, Frank J. Seiler Research Laboratory, U.S. Air… …mittance are then employed with the new geometry- and… …can be generalized to a surface of an arbitrary coordinate function as follows. At a surface…

  11. ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM

    EPA Science Inventory

    ETD QA CORE TEAM: AN ELOQUENT SOLUTION TO A COMPLEX PROBLEM. Thomas J. Hughes, QA and Records Manager, Experimental Toxicology Division (ETD), National Health and Environmental Effects Research Laboratory (NHEERL), ORD, U.S. EPA, RTP, NC 27709

    ETD is the largest health divis...

  12. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-01-01

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  13. Optical analyzer

    DOEpatents

    Hansen, Anthony D.

    1989-02-07

    An optical analyzer (10) wherein a sample (19) of particulate matter, and particularly of organic matter, which has been collected on a quartz fiber filter (20) is placed in a combustion tube (11), and light from a light source (14) is passed through the sample (19). The temperature of the sample (19) is raised at a controlled rate and in a controlled atmosphere. The magnitude of the transmission of light through the sample (19) is detected (18) as the temperature is raised. A data processor (23), differentiator (28) and a two pen recorder (24) provide a chart of the optical transmission versus temperature and the rate of change of optical transmission versus temperature signatures (T and D) of the sample (19). These signatures provide information as to physical and chemical processes and a variety of quantitative and qualitative information about the sample (19). Additional information is obtained by repeating the run in different atmospheres and/or different rates of heating with other samples of the same particulate material collected on other filters.

  14. Dynamic Modeling as a Cognitive Regulation Scaffold for Developing Complex Problem-Solving Skills in an Educational Massively Multiplayer Online Game Environment

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Ge, Xun; Ifenthaler, Dirk; Law, Victor

    2011-01-01

    Following a design-based research framework, this article reports two empirical studies with an educational MMOG, called "McLarin's Adventures," on facilitating 9th-grade students' complex problem-solving skill acquisition in interdisciplinary STEM education. The article discusses the nature of complex and ill-structured problem solving…

  15. ABSORPTION ANALYZER

    DOEpatents

    Brooksbank, W.A. Jr.; Leddicotte, G.W.; Strain, J.E.; Hendon, H.H. Jr.

    1961-11-14

    A means was developed for continuously computing and indicating the isotopic assay of a process solution and for automatically controlling the process output of isotope separation equipment to provide a continuous output of the desired isotopic ratio. A counter tube is surrounded with a sample to be analyzed so that the tube is exactly in the center of the sample. A source of fast neutrons is provided and is spaced from the sample. The neutrons from the source are thermalized by passing them through a neutron moderator, and the neutrons are allowed to diffuse radially through the sample to actuate the counter. A reference counter in a known sample of pure solvent is also actuated by the thermal neutrons from the neutron source. The number of neutrons which actuate the detectors is a function of the concentration of the elements in solution and their neutron absorption cross sections. The pulses produced by the detectors in response to each neutron passing therethrough are amplified and counted. The respective times required to accumulate a selected number of counts are measured by associated timing devices. The concentration of a particular element in solution may be determined by utilizing the following relation: T2/T1 = BCR, where B is a constant proportional to the absorption cross sections, T2 is the time of count collection for the unknown solution, T1 is the time of count collection for the pure solvent, R is the isotopic ratio, and C is the molar concentration of the element to be determined. Knowing the slope constant B for any element, the isotopic concentration may be readily determined when the chemical concentration is known, and conversely, when the isotopic ratio is known, the chemical concentration may be determined. (AEC)
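
    Rearranging the stated relation gives the chemical concentration directly. A short worked example, with made-up numbers rather than values from the patent:

      # Rearranging T2/T1 = B*C*R: with the slope constant B known for an
      # element and the isotopic ratio R known, the molar concentration
      # follows as C = (T2/T1) / (B*R). All values below are illustrative.
      B, R = 0.8, 1.5          # slope constant and isotopic ratio
      T1, T2 = 10.0, 14.4      # count-collection times: pure solvent, unknown
      C = (T2 / T1) / (B * R)
      print(C)                 # 1.2, the molar concentration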

  16. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    PubMed

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

    Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids, together with a battery of related non-polar and low-molecular-mass compounds, may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems, owing to the differing composition of the analytical matrix and interfering compounds, and therefore proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that were recently applied for quantification of hopanoid-like compounds from different samples. Main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on the experimental papers published within the last two years, in which a significant increase in hopanoid research was noticed. The second aim of this review is to describe the latest research trends concerning determination of hopanoids and related low-molecular-mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves

  17. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  18. Simulations for Complex Fluid Flow Problems from Berkeley Lab's Center for Computational Sciences and Engineering (CCSE)

    DOE Data Explorer

    The Center for Computational Sciences and Engineering (CCSE) develops and applies advanced computational methodologies to solve large-scale scientific and engineering problems arising in the Department of Energy (DOE) mission areas involving energy, environmental, and industrial technology. The primary focus is the application of structured-grid finite difference methods on adaptive grid hierarchies for compressible, incompressible, and low Mach number flows. The diverse range of scientific applications that drive the research typically involve a large range of spatial and temporal scales (e.g. turbulent reacting flows) and require the use of extremely large computing hardware, such as the 153,000-core computer, Hopper, at NERSC. The CCSE approach to these problems centers on the development and application of advanced algorithms that exploit known separations in scale; for many of the application areas this results in algorithms that are several orders of magnitude more efficient than traditional simulation approaches.

  19. Criteria for assessing problem solving and decision making in complex environments

    NASA Technical Reports Server (NTRS)

    Orasanu, Judith

    1993-01-01

    Training crews to cope with unanticipated problems in high-risk, high-stress environments requires models of effective problem solving and decision making. Existing decision theories use the criteria of logical consistency and mathematical optimality to evaluate decision quality. While these approaches are useful under some circumstances, the assumptions underlying these models frequently are not met in dynamic time-pressured operational environments. Also, applying formal decision models is both labor and time intensive, a luxury often lacking in operational environments. Alternate approaches and criteria are needed. Given that operational problem solving and decision making are embedded in ongoing tasks, evaluation criteria must address the relation between those activities and satisfaction of broader task goals. Effectiveness and efficiency become relevant for judging reasoning performance in operational environments. New questions must be addressed: What is the relation between the quality of decisions and overall performance by crews engaged in critical high risk tasks? Are different strategies most effective for different types of decisions? How can various decision types be characterized? A preliminary model of decision types found in air transport environments will be described along with a preliminary performance model based on an analysis of 30 flight crews. The performance analysis examined behaviors that distinguish more and less effective crews (based on performance errors). Implications for training and system design will be discussed.

  20. Outcomes, moderators, and mediators of empathic-emotion recognition training for complex conduct problems in childhood.

    PubMed

    Dadds, Mark Richard; Cauchi, Avril Jessica; Wimalaweera, Subodha; Hawes, David John; Brennan, John

    2012-10-30

    Impairments in emotion recognition skills are a trans-diagnostic indicator of early mental health problems and may be responsive to intervention. We report on a randomized controlled trial of "Emotion-recognition-training" (ERT) versus treatment-as-usual (TAU) with N=195 mixed diagnostic children (mean age 10.52 years) referred for behavioral/emotional problems measured at pre- and 6 months post-treatment. We tested overall outcomes plus moderation and mediation models, whereby diagnostic profile was tested as a moderator of change. ERT had no impact on the group as a whole. Diagnostic status of the child did not moderate outcomes; however, levels of callous-unemotional (CU) traits moderated outcomes such that children with high CU traits responded less well to TAU, while ERT produced significant improvements in affective empathy and conduct problems in these children. Emotion recognition training has potential as an adjunctive intervention specifically for clinically referred children with high CU traits, regardless of their diagnostic status.

  1. OBSESSIVE COMPULSIVE DISORDER: IS IT A PROBLEM OF COMPLEX MOTOR PROGRAMMING?*

    PubMed Central

    Khanna, Sumant; Mukundan, C.R.; Channabasavanna, S.M.

    1987-01-01

    SUMMARY: 44 subjects with obsessive compulsive disorder (OCD) and 40 normals were compared using an experimental paradigm involving recording of the Bereitschaftspotential. A decreased onset latency and an increased amplitude were found in the OCD sample as compared to normals. A neurophysiological substrate for the Bereitschaftspotential has been proposed. The implications of these findings in OCD, as compared to Gilles de la Tourette syndrome, and for a focal neurophysiological dysfunction have also been discussed. The findings of this study implicate a dysfunction in complex motor programming in OCD, with the possibility of this dysfunction being in the prefrontal area. PMID:21927207

  2. Infinite-range exterior complex scaling as a perfect absorber in time-dependent problems

    SciTech Connect

    Scrinzi, Armin

    2010-05-15

    We introduce infinite range exterior complex scaling (irECS) which provides for complete absorption of outgoing flux in numerical solutions of the time-dependent Schroedinger equation with strong infrared fields. This is demonstrated by computing high harmonic spectra and wave-function overlaps with the exact solution for a one-dimensional model system and by three-dimensional calculations for the H atom and an Ne atom model. We lay out the key ingredients for correct implementation and identify criteria for efficient discretization.

  3. Human Problem Solving in Dynamic Environments. Understanding and Supporting Operators in Large-Scale, Complex Systems

    DTIC Science & Technology

    1987-10-01

    Richard L. Henneman and William B. Rouse, contract MDA903-2-C-0145… …measure of quality. The literature review [Henneman and Rouse 1986] also suggested that an appropriate dependent measure of complexity is the… Henneman, R.L., and W.B. Rouse. Measures of human performance in fault diagnosis tasks. IEEE Transactions on Systems, Man, and Cybernetics, SMC-14(1):99

  4. Traveling salesman problems with PageRank Distance on complex networks reveal community structure

    NASA Astrophysics Data System (ADS)

    Jiang, Zhongzhou; Liu, Jing; Wang, Shuai

    2016-12-01

    In this paper, we propose a new algorithm for community detection problems (CDPs) based on traveling salesman problems (TSPs), labeled as TSP-CDA. Since TSPs need to find a tour with minimum cost, cities close to each other are usually clustered in the tour. This inspired us to model CDPs as TSPs by taking each vertex as a city. Then, in the final tour, the vertices in the same community tend to cluster together, and the community structure can be obtained by cutting the tour into a couple of paths. There are two challenges. The first is to define a suitable distance between each pair of vertices which can reflect the probability that they belong to the same community. The second is to design a suitable strategy to cut the final tour into paths which can form communities. In TSP-CDA, we deal with these two challenges by defining a PageRank Distance and an automatic threshold-based cutting strategy. The PageRank Distance is designed with the intrinsic properties of CDPs in mind, and can be calculated efficiently. In the experiments, benchmark networks with 1000-10,000 nodes and varying structures are used to test the performance of TSP-CDA. A comparison is also made between TSP-CDA and two well-established community detection algorithms. The results show that TSP-CDA can find accurate community structure efficiently and outperforms the two existing algorithms.
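
    The pipeline can be sketched compactly: a pairwise distance intended to be small within communities, a cheap nearest-neighbour tour, and cuts at unusually long tour edges. Since the paper's exact PageRank Distance is its own construction, this sketch substitutes the Euclidean distance between personalized-PageRank vectors — an assumption of the sketch, not the authors' definition:

      import numpy as np

      # (1) pairwise "PageRank-style" distance, (2) greedy nearest-neighbour
      # TSP tour, (3) cut tour edges that are unusually long. Illustrative only.
      def personalized_pagerank(A, s, d=0.85, iters=200):
          n = len(A)
          P = A / A.sum(axis=1, keepdims=True)   # row-stochastic transitions
          r = np.full(n, 1.0 / n)
          e = np.zeros(n); e[s] = 1.0
          for _ in range(iters):
              r = (1 - d) * e + d * (P.T @ r)    # restart at s with prob. 1-d
          return r

      def tsp_cda(A):
          n = len(A)
          R = np.array([personalized_pagerank(A, s) for s in range(n)])
          D = np.linalg.norm(R[:, None, :] - R[None, :, :], axis=2)
          tour, seen = [0], {0}                  # greedy nearest-neighbour tour
          while len(tour) < n:
              nxt = min(set(range(n)) - seen, key=lambda v: D[tour[-1], v])
              tour.append(nxt); seen.add(nxt)
          gaps = np.array([D[tour[i], tour[i + 1]] for i in range(n - 1)])
          cuts = gaps > gaps.mean() + gaps.std() # unusually long tour edges
          communities, current = [], [tour[0]]
          for i in range(n - 1):
              if cuts[i]:
                  communities.append(current); current = []
              current.append(tour[i + 1])
          communities.append(current)
          return communities

      # Two 4-cliques joined by one edge separate into two communities.
      A = np.zeros((8, 8))
      for block in (range(4), range(4, 8)):
          for i in block:
              for j in block:
                  if i != j:
                      A[i, j] = 1.0
      A[3, 4] = A[4, 3] = 1.0
      print(tsp_cda(A))   # e.g. [[0, 1, 2, 3], [4, 5, 6, 7]]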

  5. Wolves, dogs, rearing and reinforcement: complex interactions underlying species differences in training and problem-solving performance.

    PubMed

    Frank, Harry

    2011-11-01

    Frank and Frank et al. (1982-1987) administered a series of age-graded training and problem-solving tasks to samples of Eastern timber wolf (C. lupus lycaon) and Alaskan Malamute (C. familiaris) pups to test Frank's (Zeitschrift für Tierpsychologie 53:389-399, 1980) model of the evolution of information processing under conditions of natural and artificial selection. Results confirmed the model's prediction that wolves should perform better than dogs on problem-solving tasks and that dogs should perform better than wolves on training tasks. Further data collected at the University of Connecticut in 1983 revealed a more complex and refined picture, indicating that species differences can be mediated by a number of factors influencing wolf performance, including socialization regimen (hand-rearing vs. mother-rearing), interactive effects of socialization on the efficacy of both rewards and punishments, and the flexibility to select learning strategies that experimenters might not anticipate.

  6. A Different Trolley Problem: The Limits of Environmental Justice and the Promise of Complex Moral Assessments for Transportation Infrastructure.

    PubMed

    Epting, Shane

    2016-12-01

    Transportation infrastructure tremendously affects the quality of life for urban residents, influences public and mental health, and shapes social relations. Historically, the topic is rich with social and political controversy and the resultant transit systems in the United States cause problems for minority residents and issues for the public. Environmental justice frameworks provide a means to identify and address harms that affect marginalized groups, but environmental justice has limits that cannot account for the mainstream population. To account for this condition, I employ a complex moral assessment measure that provides a way to talk about harms that affect the public.

  7. Content-Adaptive Finite Element Mesh Generation of 3-D Complex MR Volumes for Bioelectromagnetic Problems.

    PubMed

    Lee, W; Kim, T-S; Cho, M; Lee, S

    2005-01-01

    In studying bioelectromagnetic problems, the finite element method offers several advantages over other conventional methods such as the boundary element method. It allows truly volumetric analysis and incorporation of material properties such as anisotropy. Mesh generation is the first requirement in finite element analysis, and there are many different approaches to mesh generation. However, conventional approaches offered by commercial packages and various algorithms do not generate content-adaptive meshes, resulting in numerous elements in the smaller volume regions, thereby increasing computational load and demand. In this work, we present an improved content-adaptive mesh generation scheme that is efficient and fast, along with options to change the contents of meshes. For demonstration, mesh models of the head from a volume MRI are presented in 2-D and 3-D.

  8. An unstructured-grid software system for solving complex aerodynamic problems

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Pirzadeh, Shahyar; Parikh, Paresh

    1995-01-01

    A coordinated effort has been underway over the past four years to elevate unstructured-grid methodology to a mature level. The goal of this endeavor is to provide a validated capability to non-expert users for performing rapid aerodynamic analysis and design of complex configurations. The Euler component of the system is well developed, and is impacting a broad spectrum of engineering needs with capabilities such as rapid grid generation and inviscid flow analysis, inverse design, interactive boundary layers, and propulsion effects. Progress is also being made in the more tenuous Navier-Stokes component of the system. A robust grid generator is under development for constructing quality thin-layer tetrahedral grids, along with a companion Navier-Stokes flow solver. This paper presents an overview of this effort, along with a perspective on the present and future status of the methodology.

  9. Effective descriptions of complex quantum systems: path integrals and operator ordering problems

    NASA Astrophysics Data System (ADS)

    Eckern, U.; Gruber, M. J.; Schwab, P.

    2005-09-01

    [Dedicated to Bernhard Mühlschlegel on the occasion of his 80th birthday] We study certain aspects of the effective, occasionally called collective, description of complex quantum systems within the framework of the path integral formalism, in which the environment is integrated out. Generalising the standard Feynman-Vernon Caldeira-Leggett model to include a non-linear coupling between particle and environment, and considering a particular spectral density of the coupling, a coordinate-dependent mass (or velocity-dependent potential) is obtained. The related effective quantum theory, which depends on the proper discretisation of the path integral, is derived and discussed. As a result, we find that in general a simple effective low-energy Hamiltonian, in which only the coordinate-dependent mass enters, cannot be formulated. The quantum theory of weakly coupled superconductors and the quantum dynamics of vortices in Josephson junction arrays are physical examples where these considerations, in principle, are of relevance.

  10. Complex angular momenta approach for scattering problems in the presence of both monopoles and short range potentials

    NASA Astrophysics Data System (ADS)

    Canfora, Fabrizio

    2016-10-01

    I analyze the quantum mechanical scattering off a topological defect (such as a Dirac monopole) as well as a Yukawa-like potential representing the typical effects of strong interactions. This system, due to the presence of a short-range potential, can be analyzed using the powerful technique of complex angular momenta which, so far, has not been employed in the presence of monopoles (nor of other topological solitons). Due to the fact that spatial spherical symmetry is achieved only up to internal rotations, the partial wave expansion becomes very similar to the Jacob-Wick helicity amplitudes for particles with spin. However, since the angular-momentum operator has an extra "internal" contribution, fixed cuts in the complex angular momentum plane appear. Correspondingly, the background integral in the Regge formula does not decrease for large values of |cos θ| (namely, large values of the Mandelstam variable s). Hence, the experimental observation of this kind of behavior could be a direct signal of nontrivial topological structures in strong interactions. The possible relations of these results with the soft Pomeron are briefly analyzed.

  11. Review of the Uruguayan Kidney Allocation System: the solution to a complex problem, preliminary data.

    PubMed

    Bengochea, M; Alvarez, I; Toledo, R; Carretto, E; Forteza, D

    2010-01-01

    The National Kidney Transplant Program with cadaveric donors is based on a centralized and unique waiting list, serum bank, and allocation criteria, approved by the Instituto Nacional de Donación y Trasplante (INDT) in agreement with the clinical teams. The median donor rate over the last 3 years is 20 per million population, and the median number of waitlist candidates is 450. The increasing number of waiting list patients and the rapid aging of our population demanded strategies for donor acceptance and candidate assignment, and the analysis of more efficient and equitable allocation models. The objectives of the new national allocation system were to improve posttransplant patient and graft survival, allow equal access to transplantation, and reduce waitlist times. The objective of this study was to analyze variables in our current allocation system and to create a mathematical/simulation model to evaluate a new allocation system. We compared candidates and transplanted patients for gender, age, ABO blood group, human leukocyte antigens (HLA), percentage of reactive antibodies (PRA), and waiting list and dialysis times. Only 2 factors showed differences: highly sensitized patients and patients >65 years old (Bernoulli test). An agreement between the INDT and the Engineering Faculty opened a major field of study. During 2008 the data analysis and model building began. The waiting list data of the last decade of donors and transplants were processed to develop a virtual model. We used inputs of candidates and donors, with outputs and the structure of the simulation system, to evaluate the proposed changes. Currently, the INDT and the Mathematics and Statistics Institute are working to develop a simulation model that is able to analyze our new national allocation system.

  12. ATSDR evaluation of health effects of chemicals. IV. Polycyclic aromatic hydrocarbons (PAHs): understanding a complex problem.

    PubMed

    Mumtaz, M M; George, J D; Gold, K W; Cibulas, W; DeRosa, C T

    1996-01-01

    Polycyclic Aromatic Hydrocarbons (PAHs) are a group of chemicals that are formed during the incomplete burning of coal, oil, gas, wood, garbage, or other organic substances, such as tobacco and charbroiled meat. There are more than 100 PAHs. PAHs generally occur as complex mixtures (for example, as part of products such as soot), not as single compounds. PAHs are found throughout the environment in the air, water, and soil. As part of its mandate, the Agency for Toxic Substances and Disease Registry (ATSDR) prepares toxicological profiles on hazardous chemicals, including PAHs (ATSDR, 1995), found at facilities on the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) National Priorities List (NPL) and which pose the most significant potential threat to human health, as determined by ATSDR and the Environmental Protection Agency (EPA). These profiles include information on health effects of chemicals from different routes and durations of exposure, their potential for exposure, regulations and advisories, and the adequacy of the existing database. Assessing the health effects of PAHs is a major challenge because environmental exposures to these chemicals are usually to complex mixtures of PAHs with other chemicals. The biological consequences of human exposure to mixtures of PAHs depend on the toxicity, carcinogenic and noncarcinogenic, of the individual components of the mixture, the types of interactions among them, and confounding factors that are not thoroughly understood. Also identified are components of exposure and health effects research needed on PAHs that will allow estimation of realistic human health risks posed by exposures to PAHs. The exposure assessment component of research should focus on (1) development of reliable analytical methods for the determination of bioavailable PAHs following ingestion, (2) estimation of bioavailable PAHs from environmental media, particularly the determination of particle-bound PAHs, (3

  13. Complexity.

    PubMed

    Gómez-Hernández, J Jaime

    2006-01-01

    It is difficult to define complexity in modeling. Complexity is often associated with uncertainty since modeling uncertainty is an intrinsically difficult task. However, modeling uncertainty does not require, necessarily, complex models, in the sense of a model requiring an unmanageable number of degrees of freedom to characterize the aquifer. The relationship between complexity, uncertainty, heterogeneity, and stochastic modeling is not simple. Aquifer models should be able to quantify the uncertainty of their predictions, which can be done using stochastic models that produce heterogeneous realizations of aquifer parameters. This is the type of complexity addressed in this article.

  14. On the Parameterized Complexity of Some Optimization Problems Related to Multiple-Interval Graphs

    NASA Astrophysics Data System (ADS)

    Jiang, Minghui

    We show that for any constant t ≥ 2, k-Independent Set and k-Dominating Set in t-track interval graphs are W[1]-hard. This settles an open question recently raised by Fellows, Hermelin, Rosamond, and Vialette. We also give an FPT algorithm for k-Clique in t-interval graphs, parameterized by both k and t, with running time max{t^{O(k)}, 2^{O(k log k)}} · poly(n), where n is the number of vertices in the graph. This slightly improves the previous FPT algorithm by Fellows, Hermelin, Rosamond, and Vialette. Finally, we use the W[1]-hardness of k-Independent Set in t-track interval graphs to obtain the first parameterized intractability result for a recent bioinformatics problem called Maximal Strip Recovery (MSR). We show that MSR-d is W[1]-hard for any constant d ≥ 4 when the parameter is either the total length of the strips, or the total number of adjacencies in the strips, or the number of strips in the optimal solution.

  15. Combination Therapies for Lysosomal Storage Diseases: A Complex Answer to a Simple Problem

    PubMed Central

    Macauley, Shannon L

    2017-01-01

    Lysosomal storage diseases (LSDs) are a group of 40–50 rare monogenic disorders that result in disrupted lysosomal function and subsequent lysosomal pathology. Depending on the protein or enzyme deficiency associated with each disease, LSDs affect an array of organ systems and elicit a complex set of secondary disease mechanisms that make many of these disorders difficult to fully treat. The etiology of most LSDs is known and the innate biology of lysosomal enzymes favors therapeutic intervention, yet most attempts at treating LSDs with enzyme replacement strategies fall short of being curative. Even with the advent of more sophisticated approaches, like substrate reduction therapy, pharmacologic chaperones, gene therapy or stem cell therapy, comprehensive treatments for LSDs have yet to be achieved. Given the limitations of individual therapies, recent research has focused on using a combination approach to treat LSDs. By coupling protein-, cell-, and gene-based therapies with small molecule drugs, researchers have found greater success in eradicating the clinical features of disease. This review discusses the positives and negatives of the single therapies used to treat LSDs, and how, in combination, studies have demonstrated a more holistic benefit on pathological and functional parameters. By optimizing routes of delivery, therapeutic timing, and the targeting of secondary disease mechanisms, combination therapy represents the future for LSD treatment. PMID:27491211

  16. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
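
    As a toy illustration of the Bayesian-network mechanics described above (the structure and every number below are invented; the real FLAP model is far larger and populated from SME data), latent factors can be chained through to an accident probability:

        # Toy two-link Bayesian-network calculation in the spirit of FLAP:
        # complacency -> automation error -> loss-of-control (LOC) accident.
        p_complacency = 0.2                         # assumed prior
        p_err_given_c = {True: 0.15, False: 0.03}   # assumed CPT entries
        p_loc_given_e = {True: 0.02, False: 0.001}  # assumed CPT entries

        p_err = sum(p_err_given_c[c] * p
                    for c, p in [(True, p_complacency), (False, 1 - p_complacency)])
        p_loc = sum(p_loc_given_e[e] * p
                    for e, p in [(True, p_err), (False, 1 - p_err)])
        print(f"P(automation error) = {p_err:.3f}, P(LOC) = {p_loc:.5f}")

    Inserting a mitigation technology then amounts to rescaling the relevant conditional-probability entries and recomputing the marginal risk.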

  17. Application of a low order panel method to complex three-dimensional internal flow problems

    NASA Technical Reports Server (NTRS)

    Ashby, D. L.; Sandlin, D. R.

    1986-01-01

    An evaluation of the ability of a low order panel method to predict complex three-dimensional internal flow fields was made. The computer code VSAERO was used as a basis for the evaluation. Guidelines for modeling internal flow geometries were determined, and the effects of varying the boundary conditions and of using numerical approximations on solution accuracy were studied. Several test cases were run and the results were compared with theoretical or experimental results. Modeling an internal flow geometry as a closed box with normal velocities specified on an inlet and exit face provided accurate results and gave the user control over the boundary conditions. The values of the boundary conditions greatly influenced the amount of leakage an internal flow geometry suffered and could be adjusted to eliminate leakage. The use of the far-field approximation to reduce computation time influenced the accuracy of a solution and was coupled with the values of the boundary conditions needed to eliminate leakage. The error induced in the influence coefficients by using the far-field approximation was found to be dependent on the type of influence coefficient, the far-field radius, and the aspect ratio of the panels.

  18. Social and ethical dimension of the natural sciences, complex problems of the age, interdisciplinarity, and the contribution of education

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2008-09-01

    In view of the complex problems of this age, the question of the socio-ethical dimension of science acquires particular importance. We approach this matter from a philosophical and sociological standpoint, looking at such focal concerns as the motivation, purposes and methods of scientific activity, the ambivalence of scientific research and the concomitant risks, and the conflict between research freedom and external socio-political intervention. We then point out the impediments to the effectiveness of cross-disciplinary or broader meetings for addressing these complex problems and managing the associated risks, given the difficulty in communication between experts in different fields and non-experts, difficulties that education is challenged to help resolve. We find that the social necessity of informed decision-making on the basis of cross-disciplinary collaboration is reflected in the newer curricula, such as that of Greece, in aims like the acquisition of cross-subject knowledge and skills, and the ability to make decisions on controversial issues involving value conflicts. The interest and the reflections of the science education community in these matters increase its—traditionally limited—contribution to the theoretical debate on education and, by extension, the value of science education in the education system.

  19. A new approach to the problem of multiple comparisons in the genetic dissection of complex traits.

    PubMed Central

    Weller, J I; Song, J Z; Heyen, D W; Lewin, H A; Ron, M

    1998-01-01

    Saturated genetic marker maps are being used to map individual genes affecting quantitative traits. Controlling the "experimentwise" type-I error severely lowers power to detect segregating loci. For preliminary genome scans, we propose controlling the "false discovery rate," that is, the expected proportion of true null hypotheses within the class of rejected null hypotheses. Examples are given based on a granddaughter design analysis of dairy cattle and simulated backcross populations. By controlling the false discovery rate, power to detect true effects is not dependent on the number of tests performed. If no detectable genes are segregating, controlling the false discovery rate is equivalent to controlling the experimentwise error rate. If quantitative loci are segregating in the population, statistical power is increased as compared to control of the experimentwise type-I error. The difference between the two criteria increases with the increase in the number of false null hypotheses. The false discovery rate can be controlled at the same level whether the complete genome or only part of it has been analyzed. Additional levels of contrasts, such as multiple traits or pedigrees, can be handled without the necessity of a proportional decrease in the critical test probability. PMID:9832544
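
    The false-discovery-rate control described here is commonly implemented with the Benjamini-Hochberg step-up procedure; a minimal sketch follows (illustrative only; the record's granddaughter-design analysis is not reproduced):

        # Benjamini-Hochberg step-up procedure: reject the k smallest p-values,
        # where k is the largest rank with p_(k) <= k * q / m.
        def benjamini_hochberg(pvalues, q=0.05):
            m = len(pvalues)
            order = sorted(range(m), key=lambda i: pvalues[i])
            k_max = 0
            for rank, i in enumerate(order, start=1):
                if pvalues[i] <= rank * q / m:
                    k_max = rank
            return sorted(order[:k_max])   # indices of rejected hypotheses

        print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.74]))

    Unlike an experimentwise (e.g., Bonferroni) threshold of q/m for every test, the rejection threshold here adapts to the number of small p-values, which is why power does not collapse as the number of tests grows.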

  20. BioDMET: a physiologically based pharmacokinetic simulation tool for assessing proposed solutions to complex biological problems.

    PubMed

    Graf, John F; Scholz, Bernhard J; Zavodszky, Maria I

    2012-02-01

    We developed a detailed, whole-body physiologically based pharmacokinetic (PBPK) modeling tool for calculating the distribution of pharmaceutical agents in the various tissues and organs of a human or animal as a function of time. Ordinary differential equations (ODEs) represent the circulation of body fluids through organs and tissues at the macroscopic level, and the biological transport mechanisms and biotransformations within cells and their organelles at the molecular scale. Each major organ in the body is modeled as composed of one or more tissues. Tissues are made up of cells and fluid spaces. The model accounts for the circulation of arterial and venous blood as well as lymph. Since its development was fueled by the need to accurately predict the pharmacokinetic properties of imaging agents, BioDMET is more complex than most PBPK models. The anatomical details of the model are important for the imaging simulation endpoints. Model complexity has also been crucial for quickly adapting the tool to different problems without the need to generate a new model for every problem. When simpler models are preferred, the non-critical compartments can be dynamically collapsed to reduce unnecessary complexity. BioDMET has been used for imaging feasibility calculations in oncology, neurology, cardiology, and diabetes. For this purpose, the time concentration data generated by the model is inputted into a physics-based image simulator to establish imageability criteria. These are then used to define agent and physiology property ranges required for successful imaging. BioDMET has lately been adapted to aid the development of antimicrobial therapeutics. Given a range of built-in features and its inherent flexibility to customization, the model can be used to study a variety of pharmacokinetic and pharmacodynamic problems such as the effects of inter-individual differences and disease-states on drug pharmacokinetics and pharmacodynamics, dosing optimization, and inter
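
    BioDMET itself models whole-body anatomy in detail; the following is only a minimal sketch of the generic PBPK building block such tools are made of, a flow-limited organ compartment exchanging with plasma (all parameter values are invented for illustration):

        # Two-compartment, flow-limited PBPK-style sketch: plasma + one organ.
        from scipy.integrate import solve_ivp

        Q, V_p, V_t = 1.2, 3.0, 1.5   # organ blood flow (L/min), volumes (L); assumed
        Kp, CL = 4.0, 0.3             # tissue:plasma partition coeff., clearance; assumed

        def rhs(t, y):
            Cp, Ct = y                                    # plasma / tissue conc.
            dCp = (Q * (Ct / Kp - Cp) - CL * Cp) / V_p    # organ return minus elimination
            dCt = Q * (Cp - Ct / Kp) / V_t                # flow-limited tissue uptake
            return [dCp, dCt]

        sol = solve_ivp(rhs, (0.0, 120.0), y0=[10.0, 0.0], max_step=0.5)
        print(sol.y[:, -1])   # concentrations after 120 min

    A whole-body model stacks many such organ compartments along the arterial, venous, and lymphatic circulation, which is what the ODE system in BioDMET does at much finer anatomical and subcellular resolution.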

  1. Analyzing Bilingual Education Costs.

    ERIC Educational Resources Information Center

    Bernal, Joe J.

    This paper examines the particular problems involved in analyzing the costs of bilingual education and suggests that cost analysis of bilingual education requires a fundamentally different approach than that followed in other recent school finance studies. Focus of the discussion is the Intercultural Development Research Association's (IDRA)…

  2. Improving and validating 3D models for the leaf energy balance in canopy-scale problems with complex geometry

    NASA Astrophysics Data System (ADS)

    Bailey, B.; Stoll, R., II; Miller, N. E.; Pardyjak, E.; Mahaffee, W.

    2014-12-01

    Plants cover the majority of Earth's land surface, and thus play a critical role in the surface energy balance. Within individual plant communities, the leaf energy balance is a fundamental component of most biophysical processes. Absorbed radiation drives the energy balance and provides the means by which plants produce food. Available energy is partitioned into sensible and latent heat fluxes to determine surface temperature, which strongly influences rates of metabolic activity and growth. The energy balance of an individual leaf is coupled with other leaves in the community through longwave radiation emission and advection through the air. This complex coupling can make scaling models from leaves to whole-canopies difficult, specifically in canopies with complex, heterogeneous geometries. We present a new three-dimensional canopy model that simultaneously resolves sub-tree to whole-canopy scales. The model provides spatially explicit predictions of net radiation exchange, boundary-layer and stomatal conductances, evapotranspiration rates, and ultimately leaf surface temperature. The radiation model includes complex physics such as anisotropic emission and scattering. Radiation calculations are accelerated by leveraging graphics processing unit (GPU) technology, which allows canopy-scale problems to be performed on a standard desktop workstation. Since validating the three-dimensional distribution of leaf temperature can be extremely challenging, we used several independent measurement techniques to quantify errors in measured and modeled values. When compared with measured leaf temperatures, the model gave a mean error of about 2°C, which was close to the estimated measurement uncertainty.
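
    For reference, the per-leaf balance that such models resolve can be written in a common textbook form (not necessarily the exact formulation used in this model):

        R_{\mathrm{net}} = H + \lambda E,
        \qquad
        H = \rho c_p \, \frac{T_s - T_a}{r_a},
        \qquad
        \lambda E = \frac{\rho c_p}{\gamma} \, \frac{e_s(T_s) - e_a}{r_a + r_s},

    where T_s is leaf temperature, T_a and e_a are air temperature and vapor pressure, r_a and r_s are the boundary-layer and stomatal resistances, and γ is the psychrometric constant. Because R_net (through longwave emission, proportional to T_s^4) and e_s(T_s) both depend on T_s, the balance must be solved iteratively for every leaf, and the longwave coupling links each leaf to its neighbors.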

  3. Theoretical, spectroscopic, and mechanistic studies on transition-metal dinitrogen complexes: implications to reactivity and relevance to the nitrogenase problem.

    PubMed

    Studt, Felix; Tuczek, Felix

    2006-09-01

    Dinitrogen complexes of transition metals exhibit different binding geometries of N2 (end-on terminal, end-on bridging, side-on bridging, side-on end-on bridging), which are investigated by spectroscopy and DFT calculations, analyzing their electronic structure and reactivity. For comparison, a bis(μ-nitrido) complex, where the N-N bond has been split, has been studied as well. Most of these systems are highly covalent, and have strong metal-nitrogen bonds. In the present review, particular emphasis is put on a consideration of the activation of the coordinated dinitrogen ligand, making it susceptible to protonation, reactions with electrophiles or cleavage. In this context, theoretical, structural, and spectroscopic data giving information on the amount of charge on the N2 unit are presented. The orbital interactions leading to a charge transfer from the metals to the dinitrogen ligand and the charge distribution within the coordinated N2 group are analyzed. Correlations between the binding mode and the observed reactivity of N2 are discussed.

  4. Eddy covariance measurements in complex terrain with a new fast response, closed-path analyzer: spectral characteristics and cross-system comparisons

    EPA Science Inventory

    In recent years, a new class of enclosed, closed-path gas analyzers suitable for eddy covariance applications has come to market, designed to combine the advantages of traditional closed-path systems (small density corrections, good performance in poor weather) and open-path syst...

  5. The Cauchy Problem in Local Spaces for the Complex Ginzburg-Landau Equation. II. Contraction Methods

    NASA Astrophysics Data System (ADS)

    Ginibre, J.; Velo, G.

    We continue the study of the initial value problem for the complex Ginzburg-Landau equation (with a > 0, b > 0, g ≥ 0) in ℝⁿ initiated in a previous paper [I]. We treat the case where the initial data and the solutions belong to local uniform spaces, more precisely to spaces of functions satisfying local regularity conditions and uniform bounds in local norms, but no decay conditions (or arbitrarily weak decay conditions) at infinity in ℝⁿ. In [I] we used compactness methods and an extended version of recent local estimates [3] and proved in particular the existence of solutions globally defined in time with local regularity of the initial data corresponding to the spaces L^r for r ≥ 2 or H^1. Here we treat the same problem by contraction methods. This allows us in particular to prove that the solutions obtained in [I] are unique under suitable subcriticality conditions, and to obtain for them additional regularity properties and uniform bounds. The method extends some of those previously applied to the nonlinear heat equation in global spaces to the framework of local uniform spaces.

  6. A system for measuring complex dielectric properties of thin films at submillimeter wavelengths using an open hemispherical cavity and a vector network analyzer

    NASA Astrophysics Data System (ADS)

    Rahman, Rezwanur; Taylor, P. C.; Scales, John A.

    2013-08-01

    Quasi-optical (QO) methods of dielectric spectroscopy are well established in the millimeter and submillimeter frequency bands. These methods exploit standing wave structure in the sample produced by a transmitted Gaussian beam to achieve accurate, low-noise measurement of the complex permittivity of the sample [e.g., J. A. Scales and M. Batzle, Appl. Phys. Lett. 88, 062906 (2006); R. N. Clarke and C. B. Rosenberg, J. Phys. E 15, 9 (1982); T. M. Hirovnen, P. Vainikainen, A. Lozowski, and A. V. Raisanen, IEEE Trans. Instrum. Meas. 45, 780 (1996)]. In effect the sample itself becomes a low-Q cavity. On the other hand, for optically thin samples (films of thickness much less than a wavelength) or extremely low loss samples (loss tangents below 10^-5) the QO approach tends to break down due to loss of signal. In such a case it is useful to put the sample in a high-Q cavity and measure the perturbation of the cavity modes. Provided that the average mode frequency divided by the shift in mode frequency is less than the Q (quality factor) of the mode, then the perturbation should be resolvable. Cavity perturbation techniques are not new, but there are technological difficulties in working in the millimeter/submillimeter wave region. In this paper we will show applications of cavity perturbation to the dielectric characterization of semi-conductor thin films of the type used in the manufacture of photovoltaics in the 100 and 350 GHz range. We measured the complex optical constants of hot-wire chemical deposition grown 1-μm thick amorphous silicon (a-Si:H) film on borosilicate glass substrate. The real part of the refractive index and dielectric constant of the glass substrate varies from frequency-independent to linearly frequency-dependent. We also see power-law behavior of the frequency-dependent optical conductivity from 316 GHz (9.48 cm^-1) down to 104 GHz (3.12 cm^-1).

  7. A system for measuring complex dielectric properties of thin films at submillimeter wavelengths using an open hemispherical cavity and a vector network analyzer.

    PubMed

    Rahman, Rezwanur; Taylor, P C; Scales, John A

    2013-08-01

    Quasi-optical (QO) methods of dielectric spectroscopy are well established in the millimeter and submillimeter frequency bands. These methods exploit standing wave structure in the sample produced by a transmitted Gaussian beam to achieve accurate, low-noise measurement of the complex permittivity of the sample [e.g., J. A. Scales and M. Batzle, Appl. Phys. Lett. 88, 062906 (2006); R. N. Clarke and C. B. Rosenberg, J. Phys. E 15, 9 (1982); T. M. Hirovnen, P. Vainikainen, A. Lozowski, and A. V. Raisanen, IEEE Trans. Instrum. Meas. 45, 780 (1996)]. In effect the sample itself becomes a low-Q cavity. On the other hand, for optically thin samples (films of thickness much less than a wavelength) or extremely low loss samples (loss tangents below 10(-5)) the QO approach tends to break down due to loss of signal. In such a case it is useful to put the sample in a high-Q cavity and measure the perturbation of the cavity modes. Provided that the average mode frequency divided by the shift in mode frequency is less than the Q (quality factor) of the mode, then the perturbation should be resolvable. Cavity perturbation techniques are not new, but there are technological difficulties in working in the millimeter/submillimeter wave region. In this paper we will show applications of cavity perturbation to the dielectric characterization of semi-conductor thin films of the type used in the manufacture of photovoltaics in the 100 and 350 GHz range. We measured the complex optical constants of hot-wire chemical deposition grown 1-μm thick amorphous silicon (a-Si:H) film on borosilicate glass substrate. The real part of the refractive index and dielectric constant of the glass-substrate varies from frequency-independent to linearly frequency-dependent. We also see power-law behavior of the frequency-dependent optical conductivity from 316 GHz (9.48 cm(-1)) down to 104 GHz (3.12 cm(-1)).
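
    For context, the small-perturbation relations behind such cavity measurements can be written in a standard textbook form (the authors' treatment of films on substrates is more involved):

        \frac{\Delta f}{f_0} \approx
        -\frac{(\varepsilon' - 1) \int_{V_s} |E_0|^2 \, dV}
              {2 \int_{V_c} |E_0|^2 \, dV},
        \qquad
        \Delta\!\left(\frac{1}{Q}\right) \approx
        \frac{\varepsilon'' \int_{V_s} |E_0|^2 \, dV}
             {\int_{V_c} |E_0|^2 \, dV},

    where V_s and V_c are the sample and cavity volumes and E_0 is the unperturbed mode field. Consistent with the resolvability criterion quoted in the abstract, the shift Δf can be resolved roughly when f_0/Δf is smaller than the mode's Q.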

  8. Problem Solving & Comprehension. Fourth Edition.

    ERIC Educational Resources Information Center

    Whimbey, Arthur; Lochhead, Jack

    This book shows how to increase one's power to analyze and comprehend problems. First, it outlines and illustrates the methods that good problem solvers use in attacking complex ideas. Then it gives some practice in applying these methods to a variety of questions in comprehension and reasoning. Chapters include: (1) "Test Your Mind--See How…

  9. Promoting Experimental Problem-Solving Ability in Sixth-Grade Students through Problem-Oriented Teaching of Ecology: Findings of an Intervention Study in a Complex Domain

    ERIC Educational Resources Information Center

    Roesch, Frank; Nerb, Josef; Riess, Werner

    2015-01-01

    Our study investigated whether problem-oriented designed ecology lessons with phases of direct instruction and of open experimentation foster the development of cross-domain and domain-specific components of "experimental problem-solving ability" better than conventional lessons in science. We used a paper-and-pencil test to assess…

  10. A comparison of two approaches for teaching complex, authentic mathematics problems to adolescents in remedial math classes.

    PubMed

    Bottge, B A; Hasselbring, T S

    1993-05-01

    Two groups of adolescents with learning difficulties in mathematics were compared on their ability to generate solutions to a contextualized problem after being taught problem-solving skills under two conditions, one involving standard word problems, the other involving a contextualized problem on videodisc. All problems focused on adding and subtracting fractions in relation to money and linear measurement. Both groups of students improved their performance on solving word problems, but students in the contextualized problem group did significantly better on the contextualized problem posttest and were able to use their skills in two transfer tasks that followed instruction.

  11. "No more amputations": a complex scientific problem and a challenge for effective preventive strategy implementation on vascular field.

    PubMed

    Kolossváry, Endre; Farkas, Katalin; Colgan, Mary P; Edmonds, Michael; Fitzgerald, Hannah P; Fox, Martin; Pécsvárady, Zsolt; Wautrecht, Jean C; Catalano, Mariella

    2017-04-01

    Lower limb vascular amputations represent a serious problem in vascular care. Amputation, usually a consequence of critical limb ischemia and often associated with diabetes, is a central concern for health care services aiming at the prevention of limb loss. Understanding the nature and complexity of the amputation scenario is paramount for effective preventive strategy planning and implementation. Reported amputation incidence and trend data show high variability internationally. Variability is also remarkable in more granular, regional comparisons. Different calculation methods for the incidence fraction; varying epidemiological and demographic features of the populations; different socio-economic and cultural backgrounds; and disparities in vascular care are the main factors contributing to this variability in reports. Lower limb amputations can be considered a valuable healthcare quality indicator, with some limitations. One of these limitations is lower actionability, that is, the reduced ability of health care providers to intervene to influence the burden of amputations. Lower limb vascular amputation is a lifetime risk; therefore, beyond effective revascularization, the importance of early recognition of peripheral arterial disease, prompt referral to specialist vascular care, effective vascular risk prevention, and collaboration in multidisciplinary teams should also be emphasized.

  12. Supporting Shared Decision-making for Children's Complex Behavioral Problems: Development and User Testing of an Option Grid™ Decision Aid.

    PubMed

    Barnett, Erin R; Boucher, Elizabeth A; Daviss, William B; Elwyn, Glyn

    2017-04-11

    There is a lack of research to guide collaborative treatment decision-making for children who have complex behavioral problems, despite the extensive use of mental health services in this population. We developed and pilot-tested a one-page Option Grid™ patient decision aid to facilitate shared decision-making for these situations. An editorial team of parents, child psychiatrists, researchers, and other stakeholders developed the scope and structure of the decision aid. Researchers included information about a carefully chosen number of psychosocial and pharmacological treatment options, using descriptions based on the best available evidence. Using semi-structured qualitative interviews (n = 18), we conducted user testing with four parents and four clinical prescribers and field testing with four parents, four clinical prescribers, and two clinic administrators. The researchers coded and synthesized the interview responses using mixed inductive and deductive methods. Parents, clinicians, and administrators felt the Option Grid had significant value, although they reported that additional training and other support would be required in order to successfully implement the Option Grid and achieve shared decision-making in clinical practice.

  13. Structural factoring approach for analyzing stochastic networks

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
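
    The essence of conditional factoring is to condition on the state of one stochastic arc at a time and solve deterministic subproblems at the leaves of the decomposition. The sketch below shows only the conditioning idea on a toy network with discrete arc-length distributions; it degenerates to complete enumeration because it omits the structural reductions that give the paper's algorithm its savings.

        # Exact shortest-path length distribution by conditioning on each
        # stochastic arc (toy version: no structural reductions).
        import itertools, heapq
        from collections import defaultdict

        def shortest_path(arcs, s, t):
            # plain Dijkstra once every arc length has been conditioned/fixed
            adj = defaultdict(list)
            for (u, v), w in arcs.items():
                adj[u].append((v, w))
            best, pq = {s: 0.0}, [(0.0, s)]
            while pq:
                dist, u = heapq.heappop(pq)
                if u == t:
                    return dist
                if dist > best.get(u, float("inf")):
                    continue
                for v, w in adj[u]:
                    if dist + w < best.get(v, float("inf")):
                        best[v] = dist + w
                        heapq.heappush(pq, (best[v], v))
            return float("inf")

        def sp_distribution(stochastic_arcs, s, t):
            # stochastic_arcs: {(u, v): [(length, prob), ...]}
            arcs = list(stochastic_arcs.items())
            dist = defaultdict(float)
            for combo in itertools.product(*(pmf for _, pmf in arcs)):
                fixed = {uv: w for (uv, _), (w, _) in zip(arcs, combo)}
                p = 1.0
                for _, pw in combo:
                    p *= pw
                dist[shortest_path(fixed, s, t)] += p
            return dict(dist)

        net = {(0, 1): [(1, 0.5), (3, 0.5)],
               (1, 2): [(1, 1.0)],
               (0, 2): [(3, 0.7), (5, 0.3)]}
        print(sp_distribution(net, 0, 2))   # {2: 0.5, 3: 0.35, 4: 0.15}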

  14. Fractional channel multichannel analyzer

    DOEpatents

    Brackenbush, Larry W.; Anderson, Gordon A.

    1994-01-01

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels, thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse height-to-pulse width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board to achieve the effect of radically expanding the number of channels without changing the circuit board.

  15. Fractional channel multichannel analyzer

    DOEpatents

    Brackenbush, L.W.; Anderson, G.A.

    1994-08-23

    A multichannel analyzer incorporating the features of the present invention obtains the effect of fractional channels thus greatly reducing the number of actual channels necessary to record complex line spectra. This is accomplished by using an analog-to-digital converter in the asynchronous mode, i.e., the gate pulse from the pulse height-to-pulse width converter is not synchronized with the signal from a clock oscillator. This saves power and reduces the number of components required on the board to achieve the effect of radically expanding the number of channels without changing the circuit board. 9 figs.

  16. Collab-Analyzer: An Environment for Conducting Web-Based Collaborative Learning Activities and Analyzing Students' Information-Searching Behaviors

    ERIC Educational Resources Information Center

    Wu, Chih-Hsiang; Hwang, Gwo-Jen; Kuo, Fan-Ray

    2014-01-01

    Researchers have found that students might get lost or feel frustrated while searching for information on the Internet to deal with complex problems without real-time guidance or supports. To address this issue, a web-based collaborative learning system, Collab-Analyzer, is proposed in this paper. It is not only equipped with a collaborative…

  17. Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions

    DTIC Science & Technology

    2015-07-14

    …of the algorithm also performed well as an optimization algorithm when solving a variety of hard numerical problems. Identification of the impact of team structure on problem solving behavior under changing conditions…

  18. Cognitive Analysis of U.S. and Chinese Students' Mathematical Performance on Tasks Involving Computation, Simple Problem Solving, and Complex Problem Solving.

    ERIC Educational Resources Information Center

    Cai, Jinfa

    1995-01-01

    This document is 7th in the Journal for Research in Mathematics Education monograph series. The mathematical performance of (n=250) U.S. 6th-grade students from both private and public schools and (n=425) Chinese 6th-graders from both key and common schools was examined on multiple-choice tasks assessing computation and simple problem solving, and…

  19. Towards efficient uncertainty quantification in complex and large-scale biomechanical problems based on a Bayesian multi-fidelity scheme.

    PubMed

    Biehler, Jonas; Gee, Michael W; Wall, Wolfgang A

    2015-06-01

    …Additionally, the employed approach results in a tremendous reduction of computational costs, rendering uncertainty quantification with complex patient-specific nonlinear biomechanical models practical for the first time. Second, we also analyze the impact of the uncertainty in the input parameter on mechanical quantities typically related to abdominal aortic aneurysm rupture potential, such as von Mises stress, von Mises strain and strain energy. We thus provide first estimates of the variability of these mechanical quantities due to an uncertain constitutive parameter, and reveal the potential error made by assuming population-averaged mean values in patient-specific simulations of abdominal aortic aneurysms. Moreover, the influence of the correlation length of the random field is investigated in a parameter study using MC.

  20. The Computer-Based Assessment of Complex Problem Solving and How It Is Influenced by Students' Information and Communication Technology Literacy

    ERIC Educational Resources Information Center

    Greiff, Samuel; Kretzschmar, André; Müller, Jonas C.; Spinath, Birgit; Martin, Romain

    2014-01-01

    The 21st-century work environment places strong emphasis on nonroutine transversal skills. In an educational context, complex problem solving (CPS) is generally considered an important transversal skill that includes knowledge acquisition and its application in new and interactive situations. The dynamic and interactive nature of CPS requires a…

  1. The ESPAT tool: a general-purpose DSS shell for solving stochastic optimization problems in complex river-aquifer systems

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury

    2015-04-01

    Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles to their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, which are usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded using GAMS, and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results. Therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPAT_R module, which performs stochastic optimization in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization in small-size surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments. Its size makes it possible to use either the SDP or
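
    To make the "standard SDP approach" of the ESPAT_SDP module concrete, here is a minimal textbook-style backward recursion for a single reservoir (the states, inflow distribution and benefit function are all illustrative assumptions, not data from the tool or the Mijares basin):

        # Stochastic dynamic programming for one reservoir: Bellman recursion
        # V_t(s) = max over release r of E_q[ benefit(r) + V_{t+1}(s - r + q) ].
        STAGES = 12                               # monthly stages
        STORAGE = range(5)                        # discretized storage states
        INFLOW = [(0, 0.3), (1, 0.4), (2, 0.3)]   # (inflow, probability); assumed
        S_MAX = 4

        def benefit(release):
            return release ** 0.5                 # assumed concave user benefit

        V = {s: 0.0 for s in STORAGE}             # terminal value function
        for t in reversed(range(STAGES)):
            V = {s: max(sum(p * (benefit(r) + V[min(s - r + q, S_MAX)])
                            for q, p in INFLOW)   # spill above capacity
                        for r in range(s + 1))    # feasible releases 0..s
                 for s in STORAGE}

        print(V)   # expected optimal benefit-to-go from each storage state

    SDDP, used by the other modules, replaces this exhaustive state enumeration with sampled forward passes and piecewise-linear outer approximations of V, which is what makes multi-reservoir systems tractable.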

  2. Providing Formative Assessment to Students Solving Multipath Engineering Problems with Complex Arrangements of Interacting Parts: An Intelligent Tutor Approach

    ERIC Educational Resources Information Center

    Steif, Paul S.; Fu, Luoting; Kara, Levent Burak

    2016-01-01

    Problems faced by engineering students involve multiple pathways to solution. Students rarely receive effective formative feedback on handwritten homework. This paper examines the potential for computer-based formative assessment of student solutions to multipath engineering problems. In particular, an intelligent tutor approach is adopted and…

  3. Training Preschool Children to Use Visual Imagining as a Problem-Solving Strategy for Complex Categorization Tasks

    ERIC Educational Resources Information Center

    Kisamore, April N.; Carr, James E.; LeBlanc, Linda A.

    2011-01-01

    It has been suggested that verbally sophisticated individuals engage in a series of precurrent behaviors (e.g., covert intraverbal behavior, grouping stimuli, visual imagining) to solve problems such as answering questions (Palmer, 1991; Skinner, 1953). We examined the effects of one problem solving strategy--visual imagining--on increasing…

  4. X-ray absorption spectroscopy to analyze nuclear geometry and electronic structure of biological metal centers--potential and questions examined with special focus on the tetra-nuclear manganese complex of oxygenic photosynthesis.

    PubMed

    Dau, Holger; Liebisch, Peter; Haumann, Michael

    2003-07-01

    X-ray absorption spectroscopy (XAS) has become a prominent tool for the element-specific analysis of transition metals at the catalytic center of metalloenzymes. In the present study the information content of X-ray spectra with respect to the nuclear geometry and, in particular, to the electronic structure of the protein-bound metal ions is explored using the manganese complex of photosystem II (PSII) as a model system. The EXAFS range carries direct information on the number and distances of ligands as well as on the chemical type of the ligand donor function. For first-sphere ligands and second-sphere metals (in multinuclear complexes), the determination of precise distances is mostly straightforward, whereas the determination of coordination numbers clearly requires more effort. The EXAFS section starts with an exemplifying discussion of a PSII spectrum data set with focus on the coordination number problem. Subsequently, the method of linear dichroism EXAFS spectroscopy is introduced and it is shown how the EXAFS data lead to an atomic resolution model of the tetra-manganese complex of PSII. In the XANES section the following aspects are considered: (1) Alternative approaches are evaluated for determination of the metal oxidation state by comparison with a series of model compounds. (2) The interpretation of XANES spectra in terms of molecular orbitals (MOs) is approached by comparative multiple-scattering calculations and MO calculations. (3) The underlying reasons for the oxidation-state dependence of the XANES spectra are explored. Furthermore, the potential of modern XANES theory is demonstrated by presenting first simulations of the dichroism in the XANES spectra of the PSII manganese complex.
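
    The distance and coordination-number information discussed here enters through the standard single-scattering EXAFS equation (quoted for orientation; the linear-dichroism analysis in the study goes beyond it):

        \chi(k) = \sum_j \frac{N_j \, S_0^2 \, f_j(k)}{k R_j^2} \;
        e^{-2 k^2 \sigma_j^2} \, e^{-2 R_j / \lambda(k)} \,
        \sin\!\big( 2 k R_j + \phi_j(k) \big),

    where, for each shell j, N_j is the coordination number, R_j the absorber-scatterer distance, σ_j² the mean-square spread in R_j (Debye-Waller-like factor), f_j(k) and φ_j(k) the scattering amplitude and phase, and λ(k) the photoelectron mean free path.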

  5. The management of cognitive load during complex cognitive skill acquisition by means of computer-simulated problem solving.

    PubMed

    Kester, Liesbeth; Kirschner, Paul A; van Merriënboer, Jeroen J G

    2005-03-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition was integrated in the circuit diagram. It was hypothesized that learners in the integrated format would achieve better test results than the learners in the split-source format. Equivalent-test problem and transfer-test problem performance were studied. Transfer-test scores confirmed the hypothesis, though no differences were found on the equivalent-test scores.

  6. Bourbaki's structure theory in the problem of complex systems simulation models synthesis and model-oriented programming

    NASA Astrophysics Data System (ADS)

    Brodsky, Yu. I.

    2015-01-01

    The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. The application of Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems: model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the model-component. A model-component is endowed with a more elaborate structure than, for example, an object in object-oriented analysis. This structure provides the model-component with independent behavior: the ability to respond in a standard way to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of model-components into complexes. This allows one, first, to construct fractal models of any complexity and, second, to implement the computational process for such constructions uniformly, by a single universal program. In addition, the proposed paradigm makes it possible to avoid imperative programming and to generate computer code with a high degree of parallelism.
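
    A hedged Python sketch of this idea (all names are invented for illustration): a complex of model-components is itself a model-component with the same standard behavior, so one universal routine can drive arbitrarily nested, "fractal" model constructions.

        # Model-components answer standard requests; a Complex of components
        # presents the very same interface, so composition nests uniformly.
        class ModelComponent:
            def respond(self, request, t):
                raise NotImplementedError

        class Atomic(ModelComponent):
            def __init__(self, name):
                self.name = name
            def respond(self, request, t):
                return f"{self.name} handles {request!r} at t={t}"

        class Complex(ModelComponent):
            def __init__(self, parts):
                self.parts = parts
            def respond(self, request, t):
                # the same universal program runs at every nesting level;
                # parts are independent, so this loop is parallelizable
                return [p.respond(request, t) for p in self.parts]

        model = Complex([Atomic("pump"), Complex([Atomic("valve"), Atomic("tank")])])
        print(model.respond("step", 0))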

  7. Advising a Bus Company on Number of Needed Buses: How High-School Physics Students' Deal With a "Complex Problem"?

    ERIC Educational Resources Information Center

    Balukovic, Jasmina; Slisko, Josip; Hadzibegovic, Zalkida

    2011-01-01

    Since 2003, the international project PISA has evaluated 15-year-old students in solving problems that include "decision taking", "analysis and design of systems" and "trouble-shooting". This article presents the results of a pilot study conducted with 215 students from first to fourth grade of a high school in Sarajevo…

  8. The Management of Cognitive Load During Complex Cognitive Skill Acquisition by Means of Computer-Simulated Problem Solving

    ERIC Educational Resources Information Center

    Kester, Liesbeth; Kirschner, Paul A.; van Merrienboer, Jeroen J.G.

    2005-01-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition…

  9. The Application of Linguistics to the Problem of Teaching Pupils to Translate Complex Latin Sentences Into English.

    ERIC Educational Resources Information Center

    Schofield, Harry

    1968-01-01

    Remedies are suggested for difficulties encountered in Latin to English translations by pupils in the fourth and fifth forms of English Grammar schools. Reading skills proficiency is seen as a prerequisite for effective translation, and stave analysis is suggested as a method of solving the problem of gross error in pupils' translations of complex…

  10. Retrospective Discourse Discussions: How Teacher Talk Enables One Novice Literacy Teacher to Make Sense of Complex Teaching Problems

    ERIC Educational Resources Information Center

    Bausch, Linda S.; Voorhees, Susan C.

    2008-01-01

    In this article, the authors describe a retrospective discourse discussions approach that was developed in a graduate literacy education course. This method represents a reconceptualization of supervising and coaching graduate students where meanings are constructed, problems are reframed, and beginning professionals can develop more nuanced…

  11. Simple Solutions to Complex Problems: Moral Panic and the Fluid Shift from "Equity" to "Quality" in Education

    ERIC Educational Resources Information Center

    Mockler, Nicole

    2014-01-01

    Education is increasingly conceptualised by governments and policymakers in western democracies in terms of productivity and human capital, emphasising elements of individualism and competition over concerns around democracy and equity. More and more, solutions to intransigent educational problems related to equity are seen in terms of quality and…

  12. The Use of the Solihull Approach with Children with Complex Neurodevelopmental Difficulties and Sleep Problems: A Case Study

    ERIC Educational Resources Information Center

    Williams, Laura; Newell, Reetta

    2013-01-01

    The following article introduces the Solihull Approach, a structured framework for intervention work with families (Douglas, "Solihull resource pack; the first five years." Cambridge: Jill Rogers Associates, 2001) and aims to demonstrate the usefulness of this approach in working with school-age children with complex neurodevelopmental…

  13. A case-based, problem-based learning approach to prepare master of public health candidates for the complexities of global health.

    PubMed

    Leon, Juan S; Winskell, Kate; McFarland, Deborah A; del Rio, Carlos

    2015-03-01

    Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013-2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health-Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned.

  14. A Case-Based, Problem-Based Learning Approach to Prepare Master of Public Health Candidates for the Complexities of Global Health

    PubMed Central

    Winskell, Kate; McFarland, Deborah A.; del Rio, Carlos

    2015-01-01

    Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013–2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health–Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned. PMID:25706029

  15. An integrated in silico approach to analyze the involvement of single amino acid polymorphisms in FANCD1/BRCA2-PALB2 and FANCD1/BRCA2-RAD51 complex.

    PubMed

    Doss, C George Priya; Nagasundaram, N

    2014-11-01

    Fanconi anemia (FA) is an autosomal recessive human disease characterized by genomic instability and a marked increase in cancer risk. The importance of the FANCD1 gene is manifested by the fact that deleterious amino acid substitutions were found to confer susceptibility to hereditary breast and ovarian cancers. Attaining experimental knowledge about the possible disease-associated substitutions is laborious and time consuming. The recent introduction of in silico tools for analyzing genome variation has made it possible to identify deleterious variants efficiently. In this study, we conducted in silico analysis of deleterious non-synonymous SNPs at both the functional and structural level in the breast cancer and FA susceptibility gene BRCA2/FANCD1. To identify and characterize deleterious mutations, five in silico tools based on two different prediction methods were used: pathogenicity prediction (SIFT, PolyPhen, and PANTHER) and protein stability prediction (I-Mutant 2.0 and MuStab). Based on the deleterious scores that overlap across these in silico approaches, and on the availability of three-dimensional structures, structural analysis was carried out for the major mutations occurring in the native protein coded by the FANCD1/BRCA2 gene. In this work, we report the results of the first molecular dynamics (MD) simulation study performed to analyze the structural-level changes over time for the native and mutated protein complexes (G25R, W31C, W31R in FANCD1/BRCA2-PALB2, and F1524V, V1532F in FANCD1/BRCA2-RAD51). Analysis of the MD trajectories indicated that the predicted deleterious variants alter the structural behavior of the BRCA2-PALB2 and BRCA2-RAD51 protein complexes. In addition, statistical analysis was employed to test the significance of these in silico tool predictions. Based on these predictions, we conclude that the identification of disease-related SNPs by in silico methods, in combination with MD…
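
    The consensus step described above, carrying forward only variants that several predictors agree on, can be sketched in a few lines. The sketch below is illustrative only: the tool names match the study, but the call formats, the vote threshold, and the neutral variant A100T are invented for the example.

```python
# Minimal sketch: flag variants that a majority of pathogenicity and
# stability predictors call deleterious/destabilizing. Tool names match
# the study; score formats and thresholds here are illustrative only.

# Hypothetical per-variant outputs collected from the five tools
# (A100T is a made-up neutral example, not from the source).
predictions = {
    "G25R":  {"SIFT": "deleterious", "PolyPhen": "probably damaging",
              "PANTHER": "deleterious", "I-Mutant2.0": "decrease", "MuStab": "decrease"},
    "W31C":  {"SIFT": "deleterious", "PolyPhen": "probably damaging",
              "PANTHER": "deleterious", "I-Mutant2.0": "decrease", "MuStab": "increase"},
    "A100T": {"SIFT": "tolerated", "PolyPhen": "benign",
              "PANTHER": "neutral", "I-Mutant2.0": "increase", "MuStab": "increase"},
}

DAMAGING = {"deleterious", "probably damaging", "possibly damaging", "decrease"}

def consensus_deleterious(calls, min_votes=4):
    """A variant is carried forward when at least min_votes tools agree."""
    votes = sum(1 for call in calls.values() if call in DAMAGING)
    return votes >= min_votes

for variant, calls in predictions.items():
    verdict = "candidate for MD follow-up" if consensus_deleterious(calls) else "filtered out"
    print(variant, "->", verdict)
```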

  16. Problem Solving and Comprehension. Third Edition.

    ERIC Educational Resources Information Center

    Whimbey, Arthur; Lochhead, Jack

    This book is directed toward increasing students' ability to analyze problems and comprehend what they read and hear. It outlines and illustrates the methods that good problem solvers use in attacking complex ideas, and provides practice in applying these methods to a variety of questions involving comprehension and reasoning. Chapter I includes a…

  17. Testing the Usability of Interactive Visualizations for Complex Problem-Solving: Findings Related to Improving Interfaces and Help.

    ERIC Educational Resources Information Center

    Mirel, Barbara

    2001-01-01

    Conducts a scenario-based usability test with 10 data analysts using visual querying (visually analyzing data with interactive graphics). Details a range of difficulties found in visual selection that, at times, gave rise to inaccurate selections, invalid conclusions, and misguided decisions. Argues that support for visual selection must be built…

  18. Downhole Fluid Analyzer Development

    SciTech Connect

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components (light sources, sensors, detection electronics and software) will be located at the surface and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.
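
    The abstract does not state how the spectra are converted into phase fractions; a common approach for this kind of NIR measurement is linear unmixing against pure-component reference spectra. Below is a minimal sketch of that idea using non-negative least squares; the reference spectra and the measured sample are synthetic stand-ins, not data from the device.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic pure-component NIR absorbance spectra (columns: oil, water, gas)
# sampled at 6 wavelengths. Real reference spectra would be measured.
A = np.array([
    [0.90, 0.10, 0.02],
    [0.70, 0.30, 0.03],
    [0.20, 0.80, 0.01],
    [0.10, 0.95, 0.02],
    [0.40, 0.50, 0.05],
    [0.05, 0.15, 0.60],
])

true_fractions = np.array([0.55, 0.40, 0.05])      # oil, water, gas
measured = A @ true_fractions + 0.005 * np.random.default_rng(0).standard_normal(6)

x, _ = nnls(A, measured)        # non-negative mixing coefficients
fractions = x / x.sum()         # normalize so the phase fractions sum to 1
print(dict(zip(["oil", "water", "gas"], fractions.round(3))))
```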

  19. A 3D Interface-Enriched Generalized Finite Element Method for Weakly Discontinuous Problems with Complex Internal Geometries

    DTIC Science & Technology

    2012-01-04

    Article history: received 24 July 2011; received in revised form 14 November 2011; accepted 27 December 2011; available online 4 January 2012. Keywords: GFEM/XFEM; weak discontinuity; convection–diffusion equation; heat transfer problem; convergence study; heterogeneous materials.

  20. Experiences with explicit finite-difference schemes for complex fluid dynamics problems on STAR-100 and CYBER-203 computers

    NASA Technical Reports Server (NTRS)

    Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.

    1982-01-01

    Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and data base structure for three-dimensional computer codes, which will eliminate or reduce page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data at each step. As a result, the number of in-core grid points was increased by 50% to 150,000, with a 10% increase in execution time. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage limited than compute-rate limited, but advances in both storage and speed are essential to realistically calculate three-dimensional flow.
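
    As a reminder of what an explicit finite-difference update looks like, and why it vectorizes so naturally on machines of this class, here is a minimal 1D heat-equation sketch. It is not the Navier-Stokes codes described above; the grid size, step count, and coefficients are illustrative.

```python
import numpy as np

# Explicit finite-difference update for the 1D heat equation:
# u_i^{n+1} = u_i^n + r * (u_{i+1}^n - 2 u_i^n + u_{i-1}^n),  r = alpha*dt/dx^2.
# The whole-array (vectorized) sweep mirrors how explicit schemes map onto
# vector processors; the actual codes solved the full Navier-Stokes equations.

nx, nsteps = 101, 500
alpha, dx = 1.0, 1.0 / (nx - 1)
dt = 0.4 * dx**2 / alpha            # satisfies the explicit stability limit r <= 1/2
r = alpha * dt / dx**2

u = np.zeros(nx)
u[nx // 2] = 1.0                    # initial heat pulse
for _ in range(nsteps):
    u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])   # one vectorized sweep
print("peak temperature after diffusion:", u.max())
```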

  1. Megacities in the coastal zone: Using a driver-pressure-state-impact-response framework to address complex environmental problems

    NASA Astrophysics Data System (ADS)

    Sekovski, Ivan; Newton, Alice; Dennison, William C.

    2012-01-01

    The purpose of this study was to elaborate on the role of coastal megacities in environmental degradation and their contribution to global climate change. Although less than 4 percent of the world's total population resides in coastal megacities, their impact on the environment is significant due to their rapid development, high population densities, and the high consumption rates of their residents. This study was carried out by implementing a Drivers-Pressures-States-Impacts-Responses (DPSIR) framework. This analytical framework was chosen because of its potential to link existing data, gathered from various previous studies, in causal relationships. In this text, coastal megacities have been defined as cities exceeding 10 million inhabitants situated in the "near-coastal zone". Their high rates of consumption of food, water, space and energy were observed and linked to the high activity rates of the related economic sectors (industry, transportation, power generation, agriculture and water extraction). In many of the studied coastal megacities, deteriorating air and water quality was observed, which, in combination with global warming, can lead to health problems and to economic and social disturbance among residents. The extent of the problems varied between developing and developed countries, with higher rates of population growth and of certain harmful emissions in megacities of developing countries, as well as more problems regarding food and water shortages, sanitation, and health care support. Although certain projections predict a slowdown of growth in most coastal megacities, their future impact on the environment remains unclear due to uncertainties regarding future climate change and trajectories of consumption patterns.

  2. TRAINING PRESCHOOL CHILDREN TO USE VISUAL IMAGINING AS A PROBLEM-SOLVING STRATEGY FOR COMPLEX CATEGORIZATION TASKS

    PubMed Central

    Kisamore, April N; Carr, James E; LeBlanc, Linda A

    2011-01-01

    It has been suggested that verbally sophisticated individuals engage in a series of precurrent behaviors (e.g., covert intraverbal behavior, grouping stimuli, visual imagining) to solve problems such as answering questions (Palmer, 1991; Skinner, 1953). We examined the effects of one problem solving strategy—visual imagining—on increasing responses to intraverbal categorization questions. Participants were 4 typically developing preschoolers between the ages of 4 and 5 years. Visual imagining training was insufficient to produce a substantial increase in target responses. It was not until the children were prompted to use the visual imagining strategy that a large and immediate increase in the number of target responses was observed. The number of prompts did not decrease until the children were given a rule describing the use of the visual imagining strategy. Within-session response patterns indicated that none of the children used visual imagining prior to being prompted to do so and that use of the strategy continued after introduction of the rule. These results were consistent for 3 of 4 children. Within-session response patterns suggested that the 4th child occasionally imagined when prompted to do so, but the gains were not maintained. The results are discussed in terms of Skinner's analysis of problem solving and the development of visual imagining. PMID:21709783

  3. Analyzing Software Piracy in Education.

    ERIC Educational Resources Information Center

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  4. Self-adaptive difference method for the effective solution of computationally complex problems of boundary layer theory

    NASA Technical Reports Server (NTRS)

    Schoenauer, W.; Daeubler, H. G.; Glotz, G.; Gruening, J.

    1986-01-01

    An implicit difference procedure for the solution of the equations for a chemically reacting hypersonic boundary layer is described. Difference forms of arbitrary error order in the x-y coordinate plane were used to derive estimates of the discretization error. Computational complexity and time were minimized by the use of this difference method, and the iteration of the nonlinear boundary layer equations was regulated by the discretization error. Velocity and temperature profiles are presented for Mach 20.14 and Mach 18.5; the reported variables include velocity profiles, temperature profiles, mass flow factor, Stanton number, and friction drag coefficient, and three figures include numeric data.

  5. Inverse problem of the multislice method in retrieving projected complex potentials from the exit-wave function.

    PubMed

    Lin, Fang; Jin, Chuanhong

    2014-03-01

    We propose a new algorithm that retrieves the projected potentials from the exit wave (EW) of an object. The algorithm is based on the traditional multislice method, which involves a convolution operation in the calculation. The retrieved potential is complex, including both the electrostatic and absorptive components. Tests with the simulated exit waves of a 200 K InP crystal show the algorithm to be effective for objects over a wide thickness range. For thick specimens, where dynamical electron diffraction prevails, the retrieved potential can still present structural and chemical information about the object by completely mapping an atom's scattering potential during its interaction with the incident electrons.
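
    The retrieval algorithm itself is not spelled out in the abstract, but the forward multislice step that it inverts is standard. A minimal sketch of that forward step follows; the grid, wavelength, slice thickness, interaction constant, and toy potential are all assumed values for illustration.

```python
import numpy as np

# Forward multislice step (the operation the retrieval algorithm inverts):
# the wave is multiplied by a slice transmission function, then convolved
# with a Fresnel propagator via FFT. Units and parameters are illustrative.

n, dx = 256, 0.2                    # grid points and sampling (angstrom)
lam, dz, sigma = 0.025, 2.0, 0.01   # wavelength, slice thickness, interaction const.

k = np.fft.fftfreq(n, d=dx)
kx, ky = np.meshgrid(k, k)
propagator = np.exp(-1j * np.pi * lam * dz * (kx**2 + ky**2))

def multislice_step(psi, V_slice):
    """Advance the wave through one slice of projected potential V_slice."""
    t = np.exp(1j * sigma * V_slice)              # complex transmission function
    return np.fft.ifft2(np.fft.fft2(psi * t) * propagator)

psi = np.ones((n, n), dtype=complex)              # incident plane wave
V = np.zeros((n, n))
V[100:140, 100:140] = 5.0                         # toy projected potential
exit_wave = multislice_step(psi, V)
print("exit-wave mean intensity:", np.abs(exit_wave).mean().round(4))
```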

  6. You Can't Get There From Here! Problems and Potential Solutions in Developing New Classes of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    The explosion of capabilities and new products within the sphere of Information Technology (IT) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions of quality and correctness of software. These assumptions are encouraged by software producers and vendors, who have yet to find an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts that involve large swarms of small spacecraft that will engage cooperatively to achieve science goals. Such missions entail levels of complexity that beg for new methods of system development far beyond today's methods, which are inadequate for ensuring correct behavior of large numbers of interacting intelligent mission elements. New system development techniques recently devised through NASA-led research will offer some innovative approaches to achieving correctness in complex system development, including autonomous swarm missions that exhibit emergent behavior, as well as general software products created by the computing industry.

  7. Zero kinetic energy spectroscopy: mass-analyzed threshold ionization spectra of chromium sandwich complexes with alkylbenzenes, (η(6)-RPh)(2)Cr (R = Me, Et, i-Pr, t-Bu).

    PubMed

    Ketkov, Sergey Y; Selzle, Heinrich L; Cloke, F Geoffrey N; Markin, Gennady V; Shevelev, Yury A; Domrachev, Georgy A; Schlag, Edward W

    2010-10-28

    For over 25 years, zero kinetic energy (ZEKE) spectroscopy has yielded a rich foundation of high-resolution results on molecular ions, building on the discovery, in the late 1960s, of long-lived ion states throughout the ionization continuum. Here, an example is chosen from another fundamental system pioneered at this university. The mass-analyzed threshold ionization (MATI) spectra of jet-cooled chromium bisarene complexes (η(6)-RPh)(2)Cr (R = Me (1), Et (2), i-Pr (3), and t-Bu (4)) have been measured and interpreted on the basis of DFT calculations. The MATI spectra of complexes 1 and 2 appear to reveal features arising from ionization of the isomers formed by rotation of one arene ring relative to the other. The MATI spectra of 1 and 2 show two intense peaks corresponding to the 0(0)(0) ionizations, with inverse intensity ratios. As indicated by the DFT calculations, the change in intensity ratio on going from 1 to 2 results from different isomers contributing to each MATI peak. The ionization energies corresponding to the 0(0)(0) peaks are 42746 ± 5 and 42809 ± 5 cm(-1) for compound 1 and 42379 ± 5 and 42463 ± 5 cm(-1) for complex 2. The spectra of 1 and 2 also show weaker features representing transitions to vibrationally excited cationic levels; the signals of individual rotamers were detected and assigned on the basis of calculated vibrational frequencies. The MATI spectra of compounds 3 and 4 reveal only one strong peak because the ionization potentials of the contributing isomers are close. The ionization energies of 3 and 4 are 42104 ± 5 and 41917 ± 5 cm(-1), respectively. The precise ionization energies obtained from the MATI spectra reveal a nonlinear dependence of the IE on the number of Me groups in the alkyl substituents of (η(6)-RPh)(2)Cr. This can be explained by an increase in the molecular zero-point energies on methylation of the substituents.

  8. Ultrasonic simulation—Imagine3D and SimScan: Tools to solve the inverse problem for complex turbine components

    NASA Astrophysics Data System (ADS)

    Mair, H. D.; Ciorau, P.; Owen, D.; Hazelton, T.; Dunning, G.

    2000-05-01

    Two ultrasonic simulation packages, Imagine 3D and SIMSCAN, have been developed specifically to solve the inverse problem for the blade roots and rotor steeples of low-pressure turbines. The software was integrated with the 3D drawings of the inspected parts and with the dimensions of linear phased-array probes. SIMSCAN simulates the inspection scenario under either condition: defect location or probe movement/refracted-angle range. The results are displayed in Imagine 3D with a variety of options: rendering, 1:1 display, grid, and the generated UT beam. The results are very useful to procedure developers, for training, and for optimizing the phased-array probe inspection sequence. A spreadsheet is generated to correlate the defect coordinates with the UT data (probe position, skew and refracted angle, UT path, and probe movement). The simulation models were validated in experimental work with phased-array systems. The accuracy in probe position is ±1 mm, and the refracted/skew angle is within ±0.5°. Representative examples of phased-array focal laws and probe movement for a specific defect location are also included.

  9. Attention-deficit hyperactivity disorder (ADHD), substance use disorders, and criminality: a difficult problem with complex solutions.

    PubMed

    Knecht, Carlos; de Alvaro, Raquel; Martinez-Raga, Jose; Balanza-Martinez, Vicent

    2015-05-01

    The association between attention-deficit hyperactivity disorder (ADHD) and criminality has been increasingly recognized as an important societal concern. Studies conducted in different settings have revealed high rates of ADHD among adolescent offenders. The risk for criminal behavior among individuals with ADHD is increased when there is psychiatric comorbidity, particularly conduct disorder and substance use disorder. The present report aims to systematically review the literature on the epidemiological, neurobiological, and other risk factors contributing to this association, as well as the key aspects of the assessment, diagnosis, and treatment of ADHD among offenders. A systematic literature search of electronic databases (PubMed, EMBASE, and PsycINFO) was conducted to identify potentially relevant studies published in English in peer-reviewed journals. Studies conducted in various settings within the judicial system and in many different countries suggest that the rate of adolescent and adult inmates with ADHD far exceeds that reported in the general population; however, underdiagnosis is common. Similarly, follow-up studies of children with ADHD have revealed high rates of criminal behaviors, arrests, convictions, and imprisonment in adolescence and adulthood. Assessment of ADHD and comorbid conditions requires an ongoing and careful process. When treating offenders or inmates with ADHD, who commonly present with other comorbid psychiatric disorders, complex, comprehensive, and tailored interventions combining pharmacological and psychosocial strategies are likely to be needed.

  10. Explorations of the Concept of Local Capacity for Problem Solving: An Introduction to a Series of Papers Analyzing Nine School Improvement Projects. Draft. Documentation and Technical Assistance in Urban Schools.

    ERIC Educational Resources Information Center

    Wilson, Stephen H.

    A model for enhancing the local capacity of urban schools for solving problems by restructuring school settings is the subject of this paper. In identifying the strength and weaknesses of such a concept, the paper reviews data from nine sites studied by the Documentation and Technical Assistance (DTA) project for their applicability to other…

  11. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence

    PubMed Central

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H.

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students’ CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence. PMID:26283992

  12. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence.

    PubMed

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students' CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence.

  13. Portable automatic blood analyzer

    NASA Technical Reports Server (NTRS)

    Coleman, R. L.

    1975-01-01

    Analyzer employs chemical-sensing electrodes for determination of blood gas and ion concentrations. It is rugged, easily serviced, and comparatively simple to operate. System can analyze up to eight parameters and can be modified to measure other blood constituents, including nonionic species such as urea, glucose, and oxygen.

  14. Analyzing Peace Pedagogies

    ERIC Educational Resources Information Center

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  15. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input…

  16. Rigged or rigorous? Partnerships for research and evaluation of complex social problems: Lessons from the field of violence against women and girls.

    PubMed

    Zimmerman, Cathy; Michau, Lori; Hossain, Mazeda; Kiss, Ligia; Borland, Rosilyne; Watts, Charlotte

    2016-09-01

    There is growing demand for robust evidence to address complex social phenomena such as violence against women and girls (VAWG). Research partnerships between scientists and non-governmental or international organizations (NGO/IO) are increasingly popular, but can pose challenges, including concerns about potential conflicts of interest. Drawing on our experience collaborating on VAWG research, we describe challenges and contributions that NGO/IO and academic partners can make at different stages of the research process and the effects that collaborations can have on scientific inquiry. Partners may struggle with differing priorities and misunderstandings about roles, limitations, and intentions. Benefits of partnerships include a shared vision of study goals, differing and complementary expertise, mutual respect, and a history of constructive collaboration. Our experience suggests that when investigating multi-faceted social problems, instead of 'rigging' study results, research collaborations can strengthen scientific rigor and offer the greatest potential for impact in the communities we seek to serve.

  17. Generating and Analyzing Data.

    ERIC Educational Resources Information Center

    Stevens, Jill

    1993-01-01

    Presents activities in which students develop and analyze scatterplots on graphing calculators to model corn growth, decay, a box of maximum volume, and weather prediction. Provides reproducible worksheets. (MDH)

  18. Analyzing Microarray Data.

    PubMed

    Hung, Jui-Hung; Weng, Zhiping

    2017-03-01

    Because there is no widely used software for analyzing RNA-seq data that has a graphical user interface, this protocol provides an example of analyzing microarray data using Babelomics. This analysis entails performing quantile normalization and then detecting differentially expressed genes associated with the transgenesis of a human oncogene c-Myc in mice. Finally, hierarchical clustering is performed on the differentially expressed genes using the Cluster program, and the results are visualized using TreeView.
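
    Babelomics is a web application, so its internals are not shown here, but the quantile normalization step it performs is a standard algorithm. A minimal numpy sketch follows; the random matrix stands in for expression data and is not the c-Myc mouse dataset from the protocol.

```python
import numpy as np

# Standard quantile normalization: force every array (column) to share the
# same empirical distribution by replacing each value with the mean of the
# values at its rank across arrays (ties ignored for simplicity).

def quantile_normalize(X):
    """Rows are genes/probes; columns are arrays."""
    order = np.argsort(X, axis=0)                      # per-array ranking
    mean_quantiles = np.sort(X, axis=0).mean(axis=1)   # reference distribution
    Xn = np.empty_like(X)
    for j in range(X.shape[1]):
        Xn[order[:, j], j] = mean_quantiles            # write means back at each rank
    return Xn

rng = np.random.default_rng(1)
X = rng.lognormal(mean=2.0, sigma=1.0, size=(1000, 4))  # synthetic genes x arrays
Xn = quantile_normalize(X)
# After normalization, all arrays have identical sorted values:
print(np.allclose(np.sort(Xn, axis=0), np.sort(Xn, axis=0)[:, [0]]))
```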

  19. Portable Fuel Quality Analyzer

    DTIC Science & Technology

    2014-01-27

    Report documentation page (SF-298) only; no abstract available. Project: Portable Fuel Quality Analyzer. Contract Number: W56HZV-13-C-0296. PI: Dr. Stuart Farquharson (860-635-9800, stu@rta.biz). Company: Real-Time Analyzers.

  20. Soil Rock Analyzer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A redesigned version of a soil/rock analyzer developed by Martin Marietta under a Langley Research Center contract is being marketed by Aurora Tech, Inc. Known as the Aurora ATX-100, it has self-contained power, an oscilloscope, a liquid crystal readout, and a multichannel spectrum analyzer. It measures energy emissions to determine which elements a sample contains and in what percentages. It is lightweight and may be used for mineral exploration, pollution monitoring, etc.

  1. A new multi-frequency approach based on Padé approximants for the treatment of transient dynamics problems with the variational theory of complex rays

    NASA Astrophysics Data System (ADS)

    Rouzaud, C.; Gatuingt, F.; Hervé, G.; Dorival, O.

    2017-03-01

    Frequency-based methods were developed to circumvent the limits that discretization imposes on classical finite element methods in fast-dynamics simulations. In this approach the dynamic loading is shifted into the frequency domain by FFT, treated by the Variational Theory of Complex Rays (VTCR), and the time response is then reconstructed through an IFFT. This strategy proved very efficient owing to the very low CPU cost of the VTCR. However, in the case of a loading with a large spectrum, this frequency-by-frequency approach can seriously degrade the computational performance of the strategy. This paper addresses this point by proposing the use of Padé approximants to limit the number of frequencies at which the response must be calculated. The Padé approximation is applied to the overall VTCR system based on its frequency dependency. Finally, as simulations on a simple academic case and on a civil engineering structure show, the method is very efficient for interpolating the frequency response functions of a complex structure. This is a key point in preserving the efficiency of the complete VTCR strategy for transient dynamic problems.
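
    To make the idea concrete, the sketch below builds a Padé [L/M] approximant from the Taylor coefficients of a toy frequency response (a damped resonance with a small delay) and evaluates it away from the expansion point. This is a generic illustration of Padé interpolation of an FRF, not the paper's VTCR formulation; every parameter is an assumed value.

```python
import numpy as np
from math import factorial

def pade(c, L, M):
    """Pade [L/M] approximant from Taylor coefficients c[0..L+M] (b0 = 1)."""
    A = np.array([[c[L + k - j] if L + k - j >= 0 else 0.0
                   for j in range(1, M + 1)] for k in range(1, M + 1)])
    rhs = -np.array([c[L + k] for k in range(1, M + 1)])
    b = np.concatenate(([1.0], np.linalg.solve(A, rhs)))           # denominator
    a = np.array([sum(b[j] * c[i - j] for j in range(min(i, M) + 1))
                  for i in range(L + 1)])                          # numerator
    return a, b

# Taylor coefficients of H(w) = exp(-0.2j*w) / (1 + 0.1j*w - w^2) about w = 0.
N = 10
g = np.zeros(N + 1, dtype=complex)
g[0] = 1.0
for n in range(1, N + 1):                 # series of 1/(1 + 0.1j*w - w^2)
    g[n] = -(0.1j * g[n - 1] - (g[n - 2] if n >= 2 else 0.0))
e = np.array([(-0.2j) ** n / factorial(n) for n in range(N + 1)])  # exp series
c = np.array([sum(e[k] * g[n - k] for k in range(n + 1)) for n in range(N + 1)])

a, b = pade(c, 5, 5)
w = 0.5
exact = np.exp(-0.2j * w) / (1.0 + 0.1j * w - w**2)
approx = np.polyval(a[::-1], w) / np.polyval(b[::-1], w)
print("abs error at w = 0.5:", abs(exact - approx))   # small: few samples suffice
```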

  2. Analyzing Mode Confusion via Model Checking

    NASA Technical Reports Server (NTRS)

    Luettgen, Gerald; Carreno, Victor

    1999-01-01

    Mode confusion is one of the most serious problems in aviation safety. Today's complex digital flight decks make it difficult for pilots to maintain awareness of the actual states, or modes, of the flight deck automation. NASA Langley leads an initiative to explore how formal techniques can be used to discover possible sources of mode confusion. As part of this initiative, a flight guidance system was previously specified as a finite Mealy automaton, and the theorem prover PVS was used to reason about it. The objective of the present paper is to investigate whether state-exploration techniques, especially model checking, are better able to achieve this task than theorem proving and also to compare several verification tools for the specific application. The flight guidance system is modeled and analyzed in Murphi, SMV, and Spin. The tools are compared regarding their system description language, their practicality for analyzing mode confusion, and their capabilities for error tracing and for animating diagnostic information. It turns out that their strengths are complementary.

  3. Total organic carbon analyzer

    NASA Astrophysics Data System (ADS)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  4. Electrosurgical unit analyzers.

    PubMed

    1998-07-01

    Electrosurgical unit (ESU) analyzers automate the testing and inspection of the output circuits and safety features of ESUs. They perform testing that would otherwise require several other pieces of equipment, as well as considerably more time and greater technician expertise. They are used largely by clinical engineering departments for routine inspection and preventive maintenance (IPM) procedures and, less often, for accident investigations and troubleshooting. In this Evaluation, we tested three ESU analyzers from three suppliers. We rated all three analyzers Acceptable and ranked them in two groupings. In ranking the units, we placed the greatest weight on ease of use for routine ESU inspections, and gave additional consideration to versatility for advanced applications such as ESU research. The unit in Group 1 was the easiest to use, especially for infrequent users. The units in Group 2 were satisfactory but require more frequent use to maintain proficiency and to avoid user errors.

  5. Total organic carbon analyzer

    NASA Technical Reports Server (NTRS)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    1991-01-01

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  6. Using Networks to Visualize and Analyze Process Data for Educational Assessment

    ERIC Educational Resources Information Center

    Zhu, Mengxiao; Shu, Zhan; von Davier, Alina A.

    2016-01-01

    New technology enables interactive and adaptive scenario-based tasks (SBTs) to be adopted in educational measurement. At the same time, it is a challenging problem to build appropriate psychometric models to analyze data collected from these tasks, due to the complexity of the data. This study focuses on process data collected from SBTs. We…

  7. List mode multichannel analyzer

    DOEpatents

    Archer, Daniel E.; Luke, S. John; Mauger, G. Joseph; Riot, Vincent J.; Knapp, David A.

    2007-08-07

    A digital list mode multichannel analyzer (MCA) built around a programmable FPGA device for onboard data analysis and on-the-fly modification of system detection/operating parameters, and capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or in list mode as a list mode MCA.
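
    A minimal sketch of the distinction between the two modes: list mode keeps (timestamp, channel) pairs that can be rebinned arbitrarily after the fact, while histogramming mode accumulates a spectrum directly. The events below are synthetic; bin widths and channel counts are illustrative.

```python
import numpy as np

# List mode stores (timestamp, pulse-height channel) pairs; histogramming
# mode accumulates counts per channel, optionally in sub-millisecond time
# bins. Events below are synthetic stand-ins for detector pulses.

rng = np.random.default_rng(42)
n_events = 10_000
timestamps = np.sort(rng.uniform(0.0, 1.0, n_events))                # seconds
channels = rng.normal(662, 30, n_events).astype(int).clip(0, 1023)   # toy photopeak

# Histogramming mode: one spectrum over the whole acquisition.
spectrum, _ = np.histogram(channels, bins=1024, range=(0, 1024))

# List-mode replay: rebin the same events into 0.5 ms time slices afterwards.
time_bin = 0.5e-3
slice_ids = (timestamps // time_bin).astype(int)
counts_per_slice = np.bincount(slice_ids)
print("peak channel:", spectrum.argmax(),
      "| busiest 0.5 ms slice:", counts_per_slice.max(), "events")
```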

  8. Electronic sleep analyzer

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.

    1970-01-01

    Electronic instrument automatically monitors the stages of sleep of a human subject. The analyzer provides a series of discrete voltage steps with each step corresponding to a clinical assessment of level of consciousness. It is based on the operation of an EEG and requires very little telemetry bandwidth or time.

  9. Analyzing Workforce Education. Monograph.

    ERIC Educational Resources Information Center

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  10. Analyzing Stereotypes in Media.

    ERIC Educational Resources Information Center

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  11. Analyzing Faculty Workload

    ERIC Educational Resources Information Center

    Holliman, Juanita M.

    1977-01-01

    Describes a step-by-step method for analyzing faculty workload which the author notes can determine exactly how a faculty member's time is spent and whether the hours available for teaching equal the hours required for teaching. Suggested uses for the method are noted, e.g., organizing the total work force based on desired curriculum changes. (SH)

  12. Micro acoustic spectrum analyzer

    DOEpatents

    Schubert, W. Kent; Butler, Michael A.; Adkins, Douglas R.; Anderson, Larry F.

    2004-11-23

    A micro acoustic spectrum analyzer for determining the frequency components of a fluctuating sound signal comprises a microphone to pick up the fluctuating sound signal and produce an alternating current electrical signal; at least one microfabricated resonator, each resonator having a different resonant frequency, that vibrates in response to the alternating current electrical signal; and at least one detector to detect the vibration of the microfabricated resonators. The micro acoustic spectrum analyzer can further comprise a mixer to mix a reference signal with the alternating current electrical signal from the microphone to shift the frequency spectrum to a frequency range that is better matched to the resonant frequencies of the microfabricated resonators. The micro acoustic spectrum analyzer can be designed specifically for portability, size, cost, accuracy, speed, power requirements, and use in a harsh environment. The micro acoustic spectrum analyzer is particularly suited for applications where size, accessibility, and power requirements are limited, such as the monitoring of industrial equipment and processes, detection of security intrusions, or evaluation of military threats.
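
    In software, the per-frequency detector that a bank of narrow mechanical resonators implements is closely analogous to the Goertzel algorithm: one cheap second-order recursion per frequency of interest. The sketch below shows that analogy only, with assumed resonator frequencies and a synthetic test signal, not the device's actual signal path.

```python
import numpy as np

# Each microfabricated resonator responds to a narrow band around its
# resonant frequency. The Goertzel recursion below is the software analogue:
# one second-order filter per 'resonator' frequency of interest.

def goertzel_power(x, f, fs):
    """Signal power of x at frequency f (Hz), sample rate fs (Hz)."""
    w = 2.0 * np.pi * f / fs
    coeff = 2.0 * np.cos(w)
    s1 = s2 = 0.0
    for sample in x:
        s0 = sample + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

fs = 8000.0
t = np.arange(0, 0.1, 1.0 / fs)
signal = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 1200 * t)

resonator_bank = [440.0, 880.0, 1200.0, 2000.0]   # illustrative resonant frequencies
for f in resonator_bank:
    print(f"{f:7.1f} Hz -> {goertzel_power(signal, f, fs):12.1f}")
```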

  13. Analyzing HVAC piping systems

    SciTech Connect

    Smith, W.W. )

    1993-10-01

    This article describes requirements and considerations for a software tool for analyzing both the hydraulic and heat transfer characteristics of an HVAC system to help in selecting system components and predicting their performance. The topics of the article include analysis of installed system evolution, selection and analysis of pumps and valves, heat transfer in heating and cooling coils, and capacity to handle large systems.

  14. Development of BWR plant analyzer

    SciTech Connect

    Wulff, W.; Cheng, H.S.; Lekach, S.V.; Stritar, A.; Mallen, A.N.

    1984-01-01

    The BWR Plant Analyzer has been developed for realistic and accurate simulations of normal and severe abnormal transients in BWR power plants at high simulation speeds, low capital and operating costs and with outstanding user conveniences. The simulation encompasses neutron kinetics, heat conduction in fuel structures, nonequilibrium, nonhomogeneous coolant dynamics, steam line acoustics, and the dynamics of turbines, condensers, feedwater pumps and heaters, of the suppression pool, the control systems and the plant protection systems. These objectives have been achieved. Advanced modeling, using extensively analytical integration and dynamic evaluation of analytical solutions, has been combined with modern minicomputer technology for high-speed simulation of complex systems. The High-Speed Interactive Plant Analyzer code HIPA-BWR has been implemented on the AD10 peripheral parallel processor.

  15. Analyzing EUV mask costs

    NASA Astrophysics Data System (ADS)

    Lercel, Michael; Kasprowicz, Bryan

    2016-10-01

    The introduction of Extreme Ultraviolet Lithography (EUV) as a replacement for multiple patterning is based on improvements in cycle time, yield, and cost. Earlier cost studies relied on the simple assumption that EUV masks (being more complex, with the multilayer-coated blank) are not more than three times as expensive as advanced ArFi (ArF immersion) masks. EUV masks are expected to be more expensive during the ramp of the technology because of the added cost of the complex mask blank, the use of EUV-specific mask tools, and a ramp of yield learning relative to the more mature technologies. This study concludes that, within a range of scenarios, the hypothesis that EUV mask costs are not more than three times those of advanced ArFi masks is valid and conservative.

  16. PULSE AMPLITUDE ANALYZER

    DOEpatents

    Greenblatt, M.H.

    1958-03-25

    This patent pertains to pulse amplitude analyzers for sorting and counting a series of pulses, and specifically discloses an analyzer which is simple in construction and presents the pulse height distribution visually on an oscilloscope screen. According to the invention, the pulses are applied to the vertical deflection plates of an oscilloscope and trigger the horizontal sweep. Each pulse starts at the same point on the screen and has a maximum amplitude substantially along the same vertical line. A mask is placed over the screen except for a slot running along the line where the maximum amplitudes of the pulses appear. After the slot has been scanned by a photocell in combination with a slotted rotating disk, the photocell signal is displayed on an auxiliary oscilloscope as vertical deflection along a horizontal time base to portray the pulse amplitude distribution.

  17. Soft Decision Analyzer

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
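
    A minimal sketch of why soft decisions carry more information than the hard-decision BER: with a known transmitted sequence, as in a closed-loop test, the soft symbols also yield the received amplitude and noise spread, hence an SNR estimate. This is BPSK over AWGN with synthetic data, an illustration only, not the SDA's implementation.

```python
import numpy as np

# Hard decisions yield only a bit error rate (BER); soft decisions also
# reveal signal amplitude and noise spread, i.e. the link margin.

rng = np.random.default_rng(7)
n = 100_000
bits = rng.integers(0, 2, n)
tx = 2.0 * bits - 1.0                        # BPSK mapping: 0 -> -1, 1 -> +1
sigma = 10.0 ** (-6.0 / 20.0)                # noise std for a 6 dB SNR
soft = tx + sigma * rng.standard_normal(n)   # receiver soft decisions

ber = np.mean((soft > 0.0) != (bits == 1))   # hard-decision statistic

amp = np.mean(soft * tx)                     # correlate against known symbols
noise = np.std(soft - amp * tx)              # residual spread
snr_db = 20.0 * np.log10(amp / noise)
print(f"BER: {ber:.2e}   estimated SNR: {snr_db:.2f} dB")
```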

  18. PULSE AMPLITUDE ANALYZER

    DOEpatents

    Gray, G.W.; Jensen, A.S.

    1957-10-22

    A pulse-height analyzer system of improved design for sorting and counting a series of pulses, such as provided by a scintillation detector in nuclear radiation measurements, is described. The analyzer comprises a main transmission line, a cathode-ray tube for each section of the line with its deflection plates acting as the line capacitance; means to bias the respective cathode ray tubes so that the beam strikes a target only when a prearranged pulse amplitude is applied, with each tube progressively biased to respond to smaller amplitudes; pulse generating and counting means associated with each tube to respond when the beam is deflected; a control transmission line having the same time constant as the first line per section with pulse generating means for each tube for initiating a pulse on the second transmission line when a pulse triggers the tube of corresponding amplitude response, the former pulse acting to prevent successive tubes from responding to the pulse under test. This arrangement permits greater deflection sensitivity in the cathode ray tube and overcomes many of the disadvantages of prior art pulse-height analyzer circuits.

  19. RELAP5 desktop analyzer

    SciTech Connect

    Beelman, R.J.; Grush, W.H.; Mortensen, G.A.; Snider, D.M.; Wagner, K.L.

    1989-01-01

    The previously mainframe-bound RELAP5 reactor safety computer code has been installed on a microcomputer. A simple color-graphic display driver has been developed to enable the user to view the code results as the calculation advances. In order to facilitate future interactive desktop applications, the Nuclear Plant Analyzer (NPA), also previously mainframe-bound, is being redesigned to encompass workstation applications. The marriage of RELAP5 simulation capabilities with NPA interactive graphics on a desktop workstation promises to revolutionize reactor safety analysis methodology. 8 refs.

  20. Mineral/Water Analyzer

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An x-ray fluorescence spectrometer developed for the Viking Landers by Martin Marietta was modified for geological exploration, water quality monitoring, and aircraft engine maintenance. The aerospace system was highly miniaturized and used very little power. It irradiates the sample causing it to emit x-rays at various energies, then measures the energy levels for sample composition analysis. It was used in oceanographic applications and modified to identify element concentrations in ore samples, on site. The instrument can also analyze the chemical content of water, and detect the sudden development of excessive engine wear.

  1. Electrodynamic thermogravimetric analyzer

    NASA Astrophysics Data System (ADS)

    Spjut, R. Erik; Bar-Ziv, Ezra; Sarofim, Adel F.; Longwell, John P.

    1986-08-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), were examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined.

  2. Analyzing the "correct" endpoint.

    PubMed

    Atherton, Pamela J; Novotny, Paul J; Tan, Angelina D

    2006-01-01

    The choice of QOL endpoints for a study should be based on which score will most likely change if the treatment is favorable. How the QOL change is calculated should be based on the expected amount of missing data, how many time points data will be collected, and whether extreme outliers in the scores impact results. The study should have sufficient power to detect a meaningful difference between arms (typically 10 points on a 0-100 point scale) in the chosen QOL endpoint. At the conclusion of a study, several secondary endpoints can be analyzed which can provide additional information and confirm primary endpoint results.
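
    The 10-point detectable difference mentioned above translates into a sample size via the standard two-sample normal-approximation formula, sketched below. The standard deviation of 20 points is an assumed value for illustration, not taken from the source.

```python
from math import ceil
from scipy.stats import norm

# Per-arm sample size for a two-sample comparison of mean QOL scores:
# n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sd^2 / delta^2.
# delta = 10 points on a 0-100 scale (from the text); sd = 20 is assumed.

alpha, power = 0.05, 0.80
delta, sd = 10.0, 20.0

z_a = norm.ppf(1 - alpha / 2)
z_b = norm.ppf(power)
n_per_arm = 2 * (z_a + z_b) ** 2 * sd ** 2 / delta ** 2
print("patients per arm:", ceil(n_per_arm))   # ~63 under these assumptions
```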

  3. Fluorescence analyzer for lignin

    DOEpatents

    Berthold, John W.; Malito, Michael L.; Jeffers, Larry

    1993-01-01

    A method and apparatus for measuring lignin concentration in a sample of wood pulp or black liquor comprises a light emitting arrangement for emitting an excitation light through optical fiber bundles into a probe which has an undiluted sensing end facing the sample. The excitation light causes the lignin to produce fluorescent emission light, which is then conveyed through the probe to analyzing equipment that measures the intensity of the emission light. This invention was made with Government support under Contract Number DE-FC05-90CE40905 awarded by the Department of Energy (DOE). The Government has certain rights in this invention.

  4. Portable Gas Analyzer

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The Michromonitor M500 universal gas analyzer contains a series of miniature modules, each of which is a complete gas chromatograph, an instrument which separates a gaseous mixture into its components and measures the concentrations of each gas in the mixture. The system is manufactured by Microsensor Technology, and is used for environmental analysis, monitoring for gas leaks and chemical spills, compliance with pollution laws, etc. The technology is based on a Viking attempt to detect life on Mars. Ames/Stanford miniaturized the system and NIOSH funded further development. Three Stanford researchers commercialized the technology, which can be operated by unskilled personnel.

  5. Multiple capillary biochemical analyzer

    DOEpatents

    Dovichi, Norman J.; Zhang, Jian Z.

    1995-01-01

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibres to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands.

  6. Field Deployable DNA analyzer

    SciTech Connect

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step in making an autonomous deployable instrument. We perform sample clean-up and concentration in a flow-through packed bed. For small initial samples, whole genome amplification is performed in the packed bed, resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left the protein complement behind as a waste stream; we sought to determine whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof-of-principle assay.

  7. Multiple capillary biochemical analyzer

    DOEpatents

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  8. Plutonium solution analyzer

    SciTech Connect

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  9. Analyzing the platelet proteome.

    PubMed

    García, Angel; Zitzmann, Nicole; Watson, Steve P

    2004-08-01

    During the last 10 years, mass spectrometry (MS) has become a key tool for protein analysis and has underpinned the emerging field of proteomics. Using high-throughput tandem MS (MS/MS) following protein separation, it is potentially possible to analyze hundreds to thousands of proteins in a sample at a time. This technology can be used to analyze the protein content (i.e., the proteome) of any cell or tissue and complements the powerful field of genomics. The technology is particularly suitable for platelets because of the absence of a nucleus. Cellular proteins can be separated by gel-based methods, such as two-dimensional gel electrophoresis or one-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis followed by liquid chromatography (LC)-MS/MS, or by multidimensional LC-MS/MS. Prefractionation techniques, such as subcellular fractionations or immunoprecipitations, can be used to improve the analysis. Each method has particular advantages and disadvantages. Proteomics can be used to compare the proteome of basal and diseased platelets, helping to reveal information on the molecular basis of the disease.

  10. Ring Image Analyzer

    NASA Technical Reports Server (NTRS)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software was developed for the needs of an R&TD-funded project and has become an important asset for future research proposals to NASA and other agencies.
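    The fitting step at the heart of such a tool can be illustrated with standard image-processing libraries. Below is a minimal sketch using OpenCV's contour and ellipse-fitting routines; it illustrates the general technique only, not the NASA software itself, and the input file name and thresholding choices are assumptions.

    ```python
    # Sketch of elliptical-fringe fitting with OpenCV (illustrative only;
    # not the Ring Image Analyzer code). The input file name is hypothetical.
    import cv2

    img = cv2.imread("fringes.png", cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

    # Fit an ellipse to the largest contour (cv2.fitEllipse needs >= 5 points).
    contour = max(contours, key=cv2.contourArea)
    (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(contour)
    ratio = min(ax1, ax2) / max(ax1, ax2)

    print(f"centroid=({cx:.1f}, {cy:.1f}), axis ratio={ratio:.3f}, tilt={angle:.1f} deg")
    ```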

  11. Analyzing crime scene videos

    NASA Astrophysics Data System (ADS)

    Cunningham, Cindy C.; Peloquin, Tracy D.

    1999-02-01

    Since late 1996 the Forensic Identification Services Section of the Ontario Provincial Police has been actively involved in state-of-the-art image capture and the processing of video images extracted from crime scene videos. The benefits and problems of this technology for video analysis are discussed. All analysis is being conducted on SUN Microsystems UNIX computers, networked to a digital disk recorder that is used for video capture. The primary advantage of this system over traditional frame grabber technology is reviewed. Examples from actual cases are presented and the successes and limitations of this approach are explored. Suggestions to companies implementing security technology plans for various organizations (banks, stores, restaurants, etc.) will be made. Future directions for this work and new technologies are also discussed.

  12. Analyzing Water's Optical Absorption

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  13. Analyzing geographic clustered response

    SciTech Connect

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and to refine the algorithm. 21 refs., 15 figs., 2 tabs.

  14. UCNB_Analyzer

    SciTech Connect

    Broussard, Leah J

    2016-01-17

    The purpose of this software is to interpret and analyze data taken using the NI PXIe-5171R digitizer based data acquisition system for the UCNB and Nab experiments. The detection and data acquisition systems are identical for the 2 experiments, with some differences in analysis requirements. The software converts raw binary files produced by the NI DAQ into ROOT TTree format, performs waveform analysis using trapezoidal filter algorithms, pulse fitting, and noise analysis routines, and applies variable criteria to identify valid events in the data stream. The software will be used to perform analysis of the events for multi-channel coincidences, timing and energy studies, and event rates under different experimental conditions.

  15. Analyzing Next to Nothing

    NASA Astrophysics Data System (ADS)

    Taylor, G. J.

    2000-04-01

    Analytical techniques have advanced so far that it is possible to slice up a sample only 10 micrometers across (with a mass of only a billionth of a gram) so that a dozen microanalytical techniques can be used to extract fascinating, crucial information about the sample's history. This astonishing ability is useful in analyzing interplanetary dust collected in the stratosphere, tiny interstellar grains in meteorites, sparse and wispy weathering products in Martian meteorites, and samples to be collected and returned to Earth by current and future sample return missions from comets, asteroids, Martian moons, and Mars. The importance of the array of techniques available to cosmochemists has been documented by Michael Zolensky (Johnson Space Center), Carle Pieters (Brown University), Benton Clark (Lockheed Martin Astronautics, Denver), and James Papike (University of New Mexico), with special attention to sample-return missions.

  16. Analyzing a Cometary 'Sneeze'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Figure 1: Analyzing a Cometary 'Sneeze'

    This display shows highly processed images of the outburst of comet Tempel 1 between June 22 and 23, 2005. The pictures were taken by Deep Impact's medium-resolution camera. An average image of the comet has been subtracted from each picture to provide an enhanced view of the outburst. The intensity has also been stretched to show the faintest parts. This processing enables measurement of the outflow speed and the details of the dissipation of the outburst. The left image was taken when the comet was very close to its normal, non-bursting state, so almost nothing is visible.

  17. Moving particle composition analyzer

    NASA Technical Reports Server (NTRS)

    Auer, S. O. (Inventor)

    1976-01-01

    A mass spectrometry apparatus for analyzing the composition of moving microscopic particles is introduced. The apparatus includes a capacitor with a front electrode upon which the particles impinge, a back electrode, and a solid dielectric sandwiched between the front and back electrodes. In one embodiment, the electrodes and dielectric are arcuately shaped as concentric peripheral segments of different spheres having a common center and different radii. The front electrode and dielectric together have a thickness such that an impinging particle can penetrate them. In a second embodiment, the capacitor has planar, parallel electrodes, in which case the ejected positive ions are deflected downstream of a planar grid by a pair of spaced, arcuate capacitor plates having a region between them through which the ejected ions travel.

  18. Motion detector and analyzer

    DOEpatents

    Unruh, W.P.

    1987-03-23

    Method and apparatus are provided for deriving positive and negative Doppler spectra to enable analysis of objects in motion, particularly objects having rotary motion. First and second returned radar signals are mixed with internal signals to obtain an in-phase process signal and a quadrature process signal. A broad-band phase shifter shifts the quadrature signal through 90 degrees relative to the in-phase signal over a predetermined frequency range. A pair of signals is output from the broad-band phase shifter, which are then combined to provide a first sideband signal which is functionally related to a negative Doppler shift spectrum. The distinct positive and negative Doppler spectra may then be analyzed for the motion characteristics of the object being examined.
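    In software terms, combining the in-phase channel with the 90-degree-shifted quadrature channel into a complex signal makes the sign of each Doppler shift recoverable from the sign of the frequency bin. A minimal NumPy sketch of this separation follows; the sample rate and simulated Doppler components are assumptions for illustration.

    ```python
    # Sketch: separating positive and negative Doppler spectra from I/Q data.
    import numpy as np

    fs = 1000.0                          # sample rate, Hz (assumed)
    t = np.arange(0, 1.0, 1 / fs)
    # Simulated return with an approaching (+75 Hz) and a receding (-40 Hz)
    # component, as opposite edges of a rotating object would produce.
    z = np.exp(2j * np.pi * 75 * t) + 0.5 * np.exp(-2j * np.pi * 40 * t)
    i_sig, q_sig = z.real, z.imag        # the two mixer output channels

    # The complex combination I + jQ yields distinct positive/negative bins.
    spectrum = np.fft.fftshift(np.fft.fft(i_sig + 1j * q_sig))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(t), 1 / fs))
    peak = freqs[np.abs(spectrum).argmax()]
    print(f"dominant Doppler component: {peak:+.1f} Hz")   # expect +75 Hz
    ```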

  19. ROBOT TASK SCENE ANALYZER

    SciTech Connect

    William R. Hamel; Steven Everett

    2000-08-01

    Environmental restoration and waste management (ER and WM) challenges in the United States Department of Energy (DOE), and around the world, involve radiation or other hazards which will necessitate the use of remote operations to protect human workers from dangerous exposures. Remote operations carry the implication of greater costs since remote work systems are inherently less productive than contact human work due to the inefficiencies/complexities of teleoperation. To reduce costs and improve quality, much attention has been focused on methods to improve the productivity of combined human operator/remote equipment systems; the achievements to date are modest at best. The most promising avenue in the near term is to supplement conventional remote work systems with robotic planning and control techniques borrowed from manufacturing and other domains where robotic automation has been used. Practical combinations of teleoperation and robotic control will yield telerobotic work systems that outperform currently available remote equipment. It is believed that practical telerobotic systems may increase remote work efficiencies significantly. Increases of 30% to 50% have been conservatively estimated for typical remote operations. It is important to recognize that the basic hardware and software features of most modern remote manipulation systems can readily accommodate the functionality required for telerobotics. Further, several of the additional system ingredients necessary to implement telerobotic control--machine vision, 3D object and workspace modeling, automatic tool path generation and collision-free trajectory planning--already exist.

  20. Analyzing costs of space debris mitigation methods

    NASA Astrophysics Data System (ADS)

    Wiedemann, C.; Krag, H.; Bendisch, J.; Sdunnus, H.

    The steadily increasing number of space objects poses a considerable hazard to all kinds of spacecraft. To reduce the risks to future space missions, different debris mitigation measures and spacecraft protection techniques have been investigated in recent years. However, the economic efficiency has not yet been considered in this context. This economic background is not always clear to satellite operators and the space industry. Current studies have the objective of evaluating the mission costs due to space debris in a business-as-usual (no mitigation) scenario compared to the mission costs considering debris mitigation. The aim is an estimation of the time until the investment in debris mitigation will lead to an effective reduction of mission costs. This paper presents the results of investigations on the key problems of cost estimation for spacecraft and the influence of debris mitigation and shielding on cost. The shielding of a satellite can be an effective method to protect the spacecraft against debris impact. Mitigation strategies like the reduction of orbital lifetime and de- or re-orbit of non-operational satellites are methods to control the space debris environment. These methods result in an increase of costs. In a first step, the overall costs of different types of unmanned satellites are analyzed. The key problem is that it is not possible to provide a simple cost model that can be applied to all types of satellites. Unmanned spacecraft differ very much in mission, complexity of design, payload, and operational lifetime. It is important to classify relevant cost parameters and investigate their influence on the respective mission. The theory of empirical cost estimation and existing cost models are discussed. A selected cost model is simplified and generalized for application to all operational satellites. In a next step, the influence of space debris on cost is treated, if the implementation of mitigation strategies is considered.

  1. Analyzing Atmospheric Neutrino Oscillations

    SciTech Connect

    Escamilla, J.; Ernst, D. J.; Latimer, D. C.

    2007-10-26

    We provide a pedagogic derivation of the formula needed to analyze atmospheric data and then derive, for the subset of the data that are fully-contained events, an analysis tool that is quantitative and numerically efficient. Results for the full set of neutrino oscillation data are then presented. We find the following preliminary results: 1) the sub-dominant approximation provides reasonable values for the best-fit parameters δ32, θ23, and θ13 but does not quantitatively provide the errors for these three parameters; 2) the size of the MSW effect is suppressed in the sub-dominant approximation; 3) the MSW effect somewhat reduces the extracted error for δ32, more so for θ23 and θ13; 4) atmospheric data alone constrain the allowed values of θ13 only in the sub-dominant approximation; the full three-neutrino calculation requires CHOOZ to get a clean constraint; 5) the linear-in-θ13 terms are not negligible; and 6) the minimum value of θ13 is found to be negative, but at a statistically insignificant level.
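    For orientation, the sub-dominant approximation discussed above reduces at leading order to the standard two-flavor survival probability, a textbook formula (with L in km, E in GeV, and the mass-squared splitting in eV²); the θ13-dependent pieces enter as corrections to it:

    ```latex
    % Two-flavor muon-neutrino survival probability (leading order):
    P(\nu_\mu \to \nu_\mu) \approx 1 - \sin^2(2\theta_{23})\,
        \sin^2\!\left( \frac{1.27\,\Delta m^2_{32}\, L}{E} \right)
    ```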

  2. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.

  3. PULSE HEIGHT ANALYZER

    DOEpatents

    Goldsworthy, W.W.

    1958-06-01

    A differential pulse-height discriminator circuit is described which is readily adaptable for operation in a single-channel pulse-height analyzer. The novel aspect of the circuit lies in the specific arrangement of the differential pulse-height discriminator, which includes two pulse-height discriminators having a common input and an anticoincidence circuit having two interconnected vacuum tubes with a common cathode resistor. Pulses from the output of one discriminator circuit are delayed and coupled to the grid of one of the anticoincidence tubes by a resistor. The output pulses from the other discriminator circuit are coupled through a cathode follower circuit, which has a cathode resistor of such value as to provide a long time constant with the interelectrode capacitance of the tube, to lengthen the output pulses. The pulses are then fed to the grid of the other anticoincidence tube. With such connections of the circuits, only when the incoming pulse has a peak value between the operating levels of the two discriminators does an output pulse occur from the anticoincidence circuit.
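    The anticoincidence logic described above, accepting a pulse only if it crosses the lower discriminator level but not the upper one, is easy to state in modern terms. A minimal digital sketch follows; the window levels are assumptions.

    ```python
    # Digital sketch of a single-channel (differential) pulse-height analyzer:
    # count only pulses whose peak lies between two discriminator levels --
    # the anticoincidence condition the tube circuit implements in analog form.
    LOWER, UPPER = 1.0, 2.0   # discriminator levels (assumed units)

    def accept(pulse_peaks):
        """Return the peaks that fall inside the analyzer window."""
        return [p for p in pulse_peaks if LOWER <= p < UPPER]

    print(accept([0.4, 1.3, 1.9, 2.5, 1.05]))   # -> [1.3, 1.9, 1.05]
    ```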

  4. Analyzing nocturnal noise stratification.

    PubMed

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel; Gómez Escobar, Valentín

    2014-05-01

    Pollution associated with traffic can be considered one of the most relevant pollution sources in our cities; noise is one of the major components of traffic pollution; thus, efforts are necessary to find adequate noise assessment methods and low-pollution city designs. Different methods have been proposed for the evaluation of noise in cities, including the categorization method, which is based on the functionality concept. Until now, this method has only been studied (with encouraging results) for short-term, diurnal measurements, but nocturnal noise presents clearly different behavior with respect to the diurnal one. In this work, 45 continuous measurements of approximately one week each in duration are statistically analyzed to identify differences between the proposed categories. The results show that the five proposed categories highlight the noise stratification of the studied city in each period of the day (day, evening, and night). A comparison of the continuous measurements with previous short-term measurements indicates that the latter can be a good approximation of the former in the diurnal period, reducing the resource expenditure for noise evaluation. Annoyance estimated from the measured noise levels was compared with the response of the population obtained from a questionnaire, with good agreement. The categorization method can yield good information about the distribution of a pollutant associated with traffic in our cities in each period of the day and, therefore, is a powerful tool for town planning and the design of pollution prevention policies.

  5. Lorentz force particle analyzer

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Thess, André; Moreau, René; Tan, Yanqing; Dai, Shangjun; Tao, Zhen; Yang, Wenzhi; Wang, Bo

    2016-07-01

    A new contactless technique is presented for the detection of micron-sized insulating particles in the flow of an electrically conducting fluid. A transverse magnetic field brakes this flow and tends to become entrained in the flow direction by a Lorentz force, whose reaction force on the magnetic-field-generating system can be measured. Insulating particles suspended in the fluid produce changes in this Lorentz force, generating pulses in it; these pulses enable the particles to be counted and sized. A two-dimensional numerical model that employs a moving mesh method demonstrates the measurement principle when such a particle is present. Two prototypes and a three-dimensional numerical model are used to demonstrate the feasibility of a Lorentz force particle analyzer (LFPA). This study concludes that such an LFPA, which offers contactless and on-line quantitative measurements, can be applied to an extensive range of applications, including measurements of the cleanliness of high-temperature and aggressive molten metal, such as aluminum and steel alloys, and the clean manufacturing of semiconductors.

  6. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in the preparation of reporting for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to view the model selectively in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  7. Analyzing Human Communication Networks in Organizations: Applications to Management Problems.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Danowski, James A.

    Investigating the networks of communication in organizations leads to an understanding of efficient and inefficient information dissemination as practiced in large systems. Most important in organizational communication is the role of the "liaison person"--the coordinator of intercommunication. When functioning efficiently, coordinators maintain…

  8. Analyzing Performance Problems; or "You Really Oughta Wanna".

    ERIC Educational Resources Information Center

    Mager, Robert F.; Pipe, Peter

    When faced with a discrepancy between the actual and the desired performance of a student, employee, or acquaintance, the usual course of action is to "train, transfer, or terminate" the individual. The authors believe that while these may sometimes be appropriate solutions appropriately applied, more often they are not. They offer a procedure for…

  9. Digital Microfluidics Sample Analyzer

    NASA Technical Reports Server (NTRS)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs, for example for premature infants, for whom drawing blood for continuous tests can itself be life-threatening, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are provided, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system, including the chip, that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  10. Soft Decision Analyzer

    NASA Technical Reports Server (NTRS)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the content of the received data stream's relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different data techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant change and is capable of performing analysis even when the
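    The data-content technique, correlating the received stream against the transmitted reference at a range of offsets and taking the best-scoring offset as the slip, can be illustrated generically. The NumPy sketch below is illustrative only, not the SDA's internal algorithm, and the stream length and search window are assumptions.

    ```python
    # Sketch: locating a bit slip by correlating a received symbol stream
    # against the reference stream at small offsets (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    reference = rng.choice([-1.0, 1.0], size=512)   # transmitted symbols
    received = np.roll(reference, 2)                # stream with a 2-bit slip

    offsets = range(-4, 5)                          # slip search window (assumed)
    scores = [np.dot(received, np.roll(reference, k)) for k in offsets]
    slip = list(offsets)[int(np.argmax(scores))]
    print(f"detected slip: {slip:+d} symbols")      # -> +2
    ```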

  11. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  12. Crew Activity Analyzer

    NASA Technical Reports Server (NTRS)

    Murray, James; Kirillov, Alexander

    2008-01-01

    The crew activity analyzer (CAA) is a system of electronic hardware and software for automatically identifying patterns of group activity among crew members working together in an office, cockpit, workshop, laboratory, or other enclosed space. The CAA synchronously records multiple streams of data from digital video cameras, wireless microphones, and position sensors, then plays back and processes the data to identify activity patterns specified by human analysts. The processing greatly reduces the amount of time that the analysts must spend in examining large amounts of data, enabling the analysts to concentrate on subsets of data that represent activities of interest. The CAA has potential for use in a variety of governmental and commercial applications, including planning for crews for future long space flights, designing facilities wherein humans must work in proximity for long times, improving crew training and measuring crew performance in military settings, human-factors and safety assessment, development of team procedures, and behavioral and ethnographic research. The data-acquisition hardware of the CAA (see figure) includes two video cameras: an overhead one aimed upward at a paraboloidal mirror on the ceiling and one mounted on a wall aimed in a downward slant toward the crew area. As many as four wireless microphones can be worn by crew members. The audio signals received from the microphones are digitized, then compressed in preparation for storage. Approximate locations of as many as four crew members are measured by use of a Cricket indoor location system. [The Cricket indoor location system includes ultrasonic/radio beacon and listener units. A Cricket beacon (in this case, worn by a crew member) simultaneously transmits a pulse of ultrasound and a radio signal that contains identifying information. Each Cricket listener unit measures the difference between the times of reception of the ultrasound and radio signals from an identified beacon

  13. Whole blood coagulation analyzers.

    PubMed

    1997-08-01

    Whole blood coagulation analyzers (WBCAs) are widely used point-of-care (POC) testing devices found primarily in cardiothoracic surgical suites and cardiac catheterization laboratories. Most of these devices can perform a number of coagulation tests that provide information about a patient's blood clotting status. Clinicians use the results of the WBCA tests, which are available minutes after applying a blood sample, primarily to monitor the effectiveness of heparin therapy--an anticoagulation therapy used during cardiopulmonary bypass (CPB) surgery, angioplasty, hemodialysis, and other clinical procedures. In this study we evaluated five WBCAs from four suppliers. Our testing focused on the applications for which WBCAs are primarily used: Monitoring moderate to high heparin levels, as would be required, for example, during CPB or angioplasty. For this function, WBCAs are typically used to perform an activated clotting time (ACT) test or, as one supplier refers to its test, a heparin management test (HMT). All models included in this study offered an ACT test or an HMT. Monitoring low heparin levels, as would be required, for example, during hemodialysis. For this function, WBCAs would normally be used to perform either a low-range ACT (LACT) test or a whole blood activated partial thromboplastin time (WBAPTT) test. Most of the evaluated units could perform at least one of these tests; one unit did not offer either test and was therefore not rated for this application. We rated and ranked each evaluated model separately for each of these two applications. In addition, we provided a combined rating and ranking that considers the units' appropriateness for performing both applications. We based our conclusions on a unit's performance and human factors design, as determined by our testing, and on its five-year life-cycle cost, as determined by our net present value (NPV) analysis. While we rated all evaluated units acceptable for each appropriate category, we did
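    The five-year life-cycle comparison mentioned above rests on a standard net-present-value calculation. A minimal sketch follows; the discount rate and cash flows are assumptions for illustration, not figures from the evaluation.

    ```python
    # Sketch of the net present value (NPV) calculation behind a five-year
    # life-cycle cost comparison. All figures are assumed for illustration.
    def npv(rate, cashflows):
        """Discount yearly cash flows (year 0 first) to present value."""
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

    purchase = -12000                  # year-0 purchase price (assumed)
    running = [-3000] * 5              # consumables/maintenance per year (assumed)
    print(f"five-year life-cycle cost: {npv(0.07, [purchase] + running):,.0f}")
    ```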

  14. Statistical considerations when analyzing biomarker data.

    PubMed

    Beam, Craig A

    2015-11-01

    Biomarkers have become, and will continue to become, increasingly important to clinical immunology research. Yet, biomarkers often present new problems and raise new statistical and study design issues to scientists working in clinical immunology. In this paper I discuss statistical considerations related to the important biomarker problems of: 1) The design and analysis of clinical studies which seek to determine whether changes from baseline in a biomarker are associated with changes in a metabolic outcome; 2) The conditions that are required for a biomarker to be considered a "surrogate"; 3) Considerations that arise when analyzing whether or not a predictive biomarker could act as a surrogate endpoint; 4) Biomarker timing relative to the clinical endpoint; 5) The problem of analyzing studies that measure many biomarkers from few subjects; and, 6) The use of statistical models when analyzing biomarker data arising from count data.

  15. Space complexity of estimation of distribution algorithms.

    PubMed

    Gao, Yong; Culberson, Joseph

    2005-01-01

    In this paper, we investigate the space complexity of the Estimation of Distribution Algorithms (EDAs), a class of sampling-based variants of the genetic algorithm. By analyzing the nature of EDAs, we identify criteria that characterize the space complexity of two typical implementation schemes of EDAs, the factorized distribution algorithm and Bayesian network-based algorithms. Using random additive functions as the prototype, we prove that the space complexity of the factorized distribution algorithm and Bayesian network-based algorithms is exponential in the problem size even if the optimization problem has a very sparse interaction structure.
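    For readers unfamiliar with the algorithm class, the sketch below shows a minimal univariate EDA (UMDA-style) on OneMax, whose model is a single probability vector and therefore needs only O(n) space; the paper's point concerns the richer factorized and Bayesian-network models, whose stored distributions can grow exponentially. Population sizes here are assumptions.

    ```python
    # Minimal univariate EDA (UMDA-style) on OneMax, illustrating the class of
    # sampling-based algorithms analyzed above. Parameters are assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n, pop, elite = 40, 100, 20          # problem size, population, selected set
    p = np.full(n, 0.5)                  # model: one Bernoulli parameter per bit

    for _ in range(60):
        X = (rng.random((pop, n)) < p).astype(int)     # sample the model
        best = X[np.argsort(X.sum(axis=1))[-elite:]]   # select fittest individuals
        p = best.mean(axis=0).clip(1 / n, 1 - 1 / n)   # re-estimate distribution

    print("best OneMax value:", X.sum(axis=1).max(), "of", n)
    ```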

  16. An Artificial Intelligence Approach to Analyzing Student Errors in Statistics.

    ERIC Educational Resources Information Center

    Sebrechts, Marc M.; Schooler, Lael J.

    1987-01-01

    Describes the development of an artificial intelligence system called GIDE that analyzes student errors in statistics problems by inferring the students' intentions. Learning strategies involved in problem solving are discussed and the inclusion of goal structures is explained. (LRW)

  17. Analyzing machine noise for real time maintenance

    NASA Astrophysics Data System (ADS)

    Yamato, Yoji; Fukumoto, Yoshifumi; Kumazaki, Hiroki

    2017-02-01

    Recently, IoT technologies have progressed, and applications in the maintenance area are expected. However, IoT maintenance applications have not yet spread in Japan because of one-off sensing and analysis solutions for each case, the high cost of collecting sensing data, and insufficient maintenance automation. This paper proposes a maintenance platform which analyzes sound data at the edge, analyzes only anomaly data in the cloud, and orders maintenance automatically, to resolve these existing technology problems. We also implement a sample application and compare it with related work.
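    The edge-side screening step, forwarding only anomalous sound to the cloud, can be sketched with a simple spectral baseline: learn an average magnitude spectrum from normal operation and flag frames that deviate strongly. The sketch below is a generic illustration, not the paper's platform; all frame sizes and thresholds are assumptions.

    ```python
    # Sketch of edge-side anomaly screening on machine sound: flag frames whose
    # spectrum is far from a baseline learned on normal operation.
    import numpy as np

    def frame_spectra(signal, frame=1024):
        frames = signal[: len(signal) // frame * frame].reshape(-1, frame)
        return np.abs(np.fft.rfft(frames * np.hanning(frame), axis=1))

    rng = np.random.default_rng(2)
    normal = rng.normal(size=48000)          # stand-in for healthy machine noise
    baseline = frame_spectra(normal).mean(axis=0)

    def anomalous(signal, threshold=3.0):
        spectra = frame_spectra(signal)
        dist = np.linalg.norm(spectra - baseline, axis=1) / np.linalg.norm(baseline)
        return dist > threshold              # True marks frames to send upstream

    print("frames flagged on normal data:", int(anomalous(normal).sum()))
    ```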

  18. Developing Collaboration in Complex Events: A Model for Civil-Military Inter-Organizational Problem-Solving and Decision-Making

    DTIC Science & Technology

    2011-06-01

    2009, p.2). Given the wide adoption of principles and structures associated with the Incident Command System (ICS) in emergency management, it was...problem-solving and decision-making characteristics, distribution of authority, interaction and role patterns, and associated sectors. Table 1...inter-organizational problem-solving and decision-making processes associated with six recent extreme events in Canada, and three international events

  19. Review: oculomotor cranial nerve palsies: symptoms, problems and non-surgical preoperative management of the resultant complex incomitant strabismus and monocular and binocular vision disturbances.

    PubMed

    Khawam, Edward; Fahed, Daoud

    2012-01-01

    The purpose of this presentation is first to describe the symptoms and problems encountered in cranial nerve palsies (CNP). The purpose is also to describe the different means of treatment during the observational preoperative period and their positive or negative impact on each of the symptoms and problems. Finally, we will present our way of handling these patients in their preoperative period: practical, inexpensive, and unsophisticated means that keep the patient comfortable and prevent the secondary untoward effects that can take place.

  20. Modular thermal analyzer routine, volume 1

    NASA Technical Reports Server (NTRS)

    Oren, J. A.; Phillips, M. A.; Williams, D. R.

    1972-01-01

    The Modular Thermal Analyzer Routine (MOTAR) is a general thermal analysis routine with strong capabilities for performing thermal analysis of systems containing flowing fluids, fluid system controls (valves, heat exchangers, etc.), life support systems, and thermal radiation situations. Its modular organization permits the analysis of a very wide range of thermal problems, from simple problems containing a few conduction nodes to those requiring complicated flow and radiation analysis, with each problem type being analyzed with peak computational efficiency and maximum ease of use. The organization and programming methods applied to MOTAR achieved a high degree of computer utilization efficiency in terms of computer execution time and storage space required for a given problem. The computer time required to perform a given problem on MOTAR is approximately 40 to 50 percent of that required for the currently existing, widely used routines. The computer storage requirement for MOTAR is approximately 25 percent more than that of the most commonly used routines for the simplest problems, but the data storage techniques for the more complicated options should save a considerable amount of space.

  1. Problem Solving

    ERIC Educational Resources Information Center

    Kinsella, John J.

    1970-01-01

    Discussed are the nature of a mathematical problem, problem solving in the traditional and modern mathematics programs, problem solving and psychology, research related to problem solving, and teaching problem solving in algebra and geometry. (CT)

  2. Classroom Learning and Achievement: How the Complexity of Classroom Interaction Impacts Students' Learning

    ERIC Educational Resources Information Center

    Podschuweit, Sören; Bernholt, Sascha; Brückmann, Maja

    2016-01-01

    Background: Complexity models have provided a suitable framework in various domains to assess students' educational achievement. Complexity is often used as the analytical focus when regarding learning outcomes, i.e. when analyzing written tests or problem-centered interviews. Numerous studies reveal negative correlations between the complexity of…

  3. Cognitive Complexity of Mathematics Instructional Tasks in a Taiwanese Classroom: An Examination of Task Sources

    ERIC Educational Resources Information Center

    Hsu, Hui-Yu; Silver, Edward A.

    2014-01-01

    We examined geometric calculation with number tasks used within a unit of geometry instruction in a Taiwanese classroom, identifying the source of each task used in classroom instruction and analyzing the cognitive complexity of each task with respect to 2 distinct features: diagram complexity and problem-solving complexity. We found that…

  4. Problems of Indian Children.

    ERIC Educational Resources Information Center

    Linton, Marigold

    Previous approaches to the learning problems of American Indian children are viewed as inadequate. An alternative is suggested which emphasizes the problem solution strategies which these children bring to the school situation. Solutions were analyzed in terms of: (1) their probability; (2) their efficiency at permitting a present problem to be…

  5. Boundary force method for analyzing two-dimensional cracked bodies

    NASA Technical Reports Server (NTRS)

    Tan, P. W.; Raju, I. S.; Newman, J. C., Jr.

    1986-01-01

    The Boundary Force Method (BFM) was formulated for the two-dimensional stress analysis of complex crack configurations. In this method, only the boundaries of the region of interest are modeled. The boundaries are divided into a finite number of straight-line segments, and at the center of each segment, concentrated forces and a moment are applied. This set of unknown forces and moments is calculated to satisfy the prescribed boundary conditions of the problem. The elasticity solution for the stress distribution due to concentrated forces and a moment applied at an arbitrary point in a cracked infinite plate is used as the fundamental solution. Thus, the crack need not be modeled as part of the boundary. The formulation of the BFM is described, and the accuracy of the method is established by analyzing several crack configurations for which accepted stress-intensity factor solutions are known. The crack configurations investigated include mode I and mixed-mode (modes I and II) problems. The results obtained are, in general, within ±0.5 percent of accurate numerical solutions. The versatility of the method is demonstrated through the analysis of complex crack configurations for which limited or no solutions are known.

  6. A Framework for Modeling and Analyzing Complex Distributed Systems

    DTIC Science & Technology

    2005-08-15

    ...distributed algorithms and real-time control systems [56, 107, 32, 84]. Timed I/O Automata are interacting state machines. They are nondeterministic, which... Communication: group communication systems, broadcast and multicast systems with quality-of-service guarantees. (c) Coordination and control: traffic

  7. TECHNIQUES FOR ANALYZING COMPLEX MIXTURES OF DRINKING WATER DBPS

    EPA Science Inventory

    Although chlorine has been used to disinfect drinking water for approximately 100 years, there have been concerns raised over its use, due to the formation of potentially hazardous by-products. Trihalomethanes (THMs) were the first disinfection by-products (DBPs) identified and ...

  8. Analyzing the Localization of Retail Stores with Complex Systems Tools

    NASA Astrophysics Data System (ADS)

    Jensen, Pablo

    Measuring the spatial distribution of locations of many entities (trees, atoms, economic activities, ...), and, more precisely, the deviations from purely random configurations, is a powerful method to unravel their underlying interactions. I study here the spatial organization of retail commercial activities. From pure location data, network analysis leads to a community structure that closely follows the commercial classification of the US Department of Labor. The interaction network allows one to build a 'quality' index of optimal location niches for stores, which has been empirically tested.

  9. L.E.A.D.: A Framework for Evidence Gathering and Use for the Prevention of Obesity and Other Complex Public Health Problems

    ERIC Educational Resources Information Center

    Chatterji, Madhabi; Green, Lawrence W.; Kumanyika, Shiriki

    2014-01-01

    This article summarizes a comprehensive, systems-oriented framework designed to improve the use of a wide variety of evidence sources to address population-wide obesity problems. The L.E.A.D. framework (for "Locate" the evidence, "Evaluate" the evidence, "Assemble" the evidence, and inform "Decisions"),…

  10. Droplet actuator analyzer with cartridge

    NASA Technical Reports Server (NTRS)

    Smith, Gregory F. (Inventor); Sturmer, Ryan A. (Inventor); Paik, Philip Y. (Inventor); Srinivasan, Vijay (Inventor); Pollack, Michael G. (Inventor); Pamula, Vamsee K. (Inventor); Brafford, Keith R. (Inventor); West, Richard M. (Inventor)

    2011-01-01

    A droplet actuator with cartridge is provided. According to one embodiment, a sample analyzer is provided and includes an analyzer unit comprising electronic or optical receiving means and a cartridge comprising self-contained droplet handling capabilities, wherein the cartridge is coupled to the analyzer unit by a means which aligns electronic and/or optical outputs from the cartridge with electronic or optical receiving means on the analyzer unit. According to another embodiment, a sample analyzer is provided and includes a sample analyzer comprising a cartridge coupled thereto and a means of electrical interface and/or optical interface between the cartridge and the analyzer, whereby electrical signals and/or optical signals may be transmitted from the cartridge to the analyzer.

  11. Soft Decision Analyzer and Method

    NASA Technical Reports Server (NTRS)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2016-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  12. Soft Decision Analyzer and Method

    NASA Technical Reports Server (NTRS)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2015-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze the operation thereof to detect symbol wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  13. Tourette Syndrome: Overview and Classroom Interventions. A Complex Neurobehavioral Disorder Which May Involve Learning Problems, Attention Deficit Hyperactivity Disorder, Obsessive Compulsive Symptoms, and Stereotypical Behaviors.

    ERIC Educational Resources Information Center

    Fisher, Ramona A.; Collins, Edward C.

    Tourette Syndrome is conceptualized as a neurobehavioral disorder, with behavioral aspects that are sometimes difficult for teachers to understand and deal with. The disorder has five layers of complexity: (1) observable multiple motor, vocal, and cognitive tics and sensory involvement; (2) Attention Deficit Hyperactivity Disorder; (3)…

  14. Marine Corps Intelligence and All-Source Fused Analysis Support to Marine and Joint Operating Forces: Complexities, Problems, and Challenges for the Future

    DTIC Science & Technology

    1993-06-01

    each other. Recording media generally consist of enemy situation maps at various... Although a purist might seek to differentiate technically between the... a consolidated, closely knit group of intelligence officers and specialists will provide the intelligence necessary to support contingency planning... of analysis including personal values, perceptions, prejudices, biases, and other social and cultural considerations; problem solving skills; reasoning

  15. Balance Problems

    MedlinePlus

    ... often, it could be a sign of a balance problem. Balance problems can make you feel unsteady or as ... fall-related injuries, such as hip fracture. Some balance problems are due to problems in the inner ...

  16. When 'solutions of yesterday become problems of today': crisis-ridden decision making in a complex adaptive system (CAS)--the Additional Duty Hours Allowance in Ghana.

    PubMed

    Agyepong, Irene Akua; Kodua, Augustina; Adjei, Sam; Adam, Taghreed

    2012-10-01

    Implementation of policies (decisions) in the health sector is sometimes defeated by the system's response to the policy itself. This can lead to counter-intuitive, unanticipated, or more modest effects than expected by those who designed the policy. The health sector fits the characteristics of complex adaptive systems (CAS) and complexity is at the heart of this phenomenon. Anticipating both positive and negative effects of policy decisions, understanding the interests, power and interaction between multiple actors, and planning for the delayed and distal impact of policy decisions are essential for effective decision making in CAS. Failure to appreciate these elements often leads to a series of reductionist interventions or 'fixes'. This in turn can initiate a series of negative feedback loops that further complicates the situation over time. In this paper we use a case study of the Additional Duty Hours Allowance (ADHA) policy in Ghana to illustrate these points. Using causal loop diagrams, we unpack the intended and unintended effects of the policy and how these effects evolved over time. The overall goal is to advance our understanding of decision making in complex adaptive systems, and through this process identify some essential elements in formulating, updating and implementing health policy that can help to improve attainment of desired outcomes and minimize negative unintended effects.

  17. Interpolation Errors in Spectrum Analyzers

    NASA Technical Reports Server (NTRS)

    Martin, J. L.

    1996-01-01

    To obtain the proper measurement amplitude with a spectrum analyzer, the correct frequency-dependent transducer factor must be added to the voltage measured by the transducer. This report examines how entering transducer factors into a spectrum analyzer can cause significant errors in field amplitude due to the misunderstanding of the analyzer's interpolation methods. It also discusses how to reduce these errors to obtain a more accurate field amplitude reading.
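    The size of such an error is easy to demonstrate: the same two calibration points interpolated linearly against frequency versus linearly against log-frequency give different transducer factors at an intermediate frequency. The sketch below uses assumed calibration values purely for illustration.

    ```python
    # Sketch: interpolating a transducer factor linearly in frequency vs.
    # linearly in log-frequency. Calibration points are assumed values.
    import numpy as np

    f_cal = np.array([100e6, 1000e6])    # calibration frequencies, Hz (assumed)
    tf_db = np.array([10.0, 30.0])       # transducer factors, dB (assumed)
    f = 300e6                            # measurement frequency

    linear = np.interp(f, f_cal, tf_db)
    logf = np.interp(np.log10(f), np.log10(f_cal), tf_db)
    print(f"linear-f: {linear:.2f} dB, log-f: {logf:.2f} dB, "
          f"mismatch: {abs(linear - logf):.2f} dB")   # roughly 5 dB here
    ```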

  18. Microscopic solvation and femtochemistry of charge-transfer reactions: the problem of benzene(s)-iodine binary complexes and their solvent structures

    NASA Astrophysics Data System (ADS)

    Cheng, P. Y.; Zhong, D.; Zewail, A. H.

    1995-08-01

    Charge-transfer reactions are studied on the femtosecond and picosecond time scales and under controlled composition in a molecular beam. The system of interest is iodine in aromatic and non-aromatic solvents, which goes back to Hildebrand and Mulliken almost 50 years ago. For the first time, the isolated binary complex and its solvated structural dynamics are studied. The product iodine atoms are positively identified and the concept of harpoon mechanism introduced. The dynamics are related to the impact geometry of the transition state and the electronic structure. A global potential energy surface is described with molecular dynamics detailing the motion in the reaction coordinates.

  19. Problem-Based Learning

    ERIC Educational Resources Information Center

    Allen, Deborah E.; Donham, Richard S.; Bernhardt, Stephen A.

    2011-01-01

    In problem-based learning (PBL), students working in collaborative groups learn by resolving complex, realistic problems under the guidance of faculty. There is some evidence of PBL effectiveness in medical school settings where it began, and there are numerous accounts of PBL implementation in various undergraduate contexts, replete with…

  20. C2Analyzer: Co-target–Co-function Analyzer

    PubMed Central

    Aftabuddin, Md.; Mal, Chittabrata; Deb, Arindam; Kundu, Sudip

    2014-01-01

    MicroRNAs (miRNAs) interact with their target mRNAs and regulate biological processes at the post-transcriptional level. While one miRNA can target many mRNAs, a single mRNA can also be targeted by a set of miRNAs. The targeted mRNAs may be involved in different biological processes that are described by gene ontology (GO) terms. The major challenges involved in analyzing this multitude of regulations include identification of the combinatorial regulation of miRNAs as well as determination of the co-functionally enriched miRNA pairs. C2Analyzer (Co-target–Co-function Analyzer) is a Perl-based, versatile, and user-friendly web tool with online instructions. Based on hypergeometric analysis, this novel tool can determine whether given pairs of miRNAs are co-functionally enriched. For a given set of GO term(s), it can also identify the set of miRNAs whose targets are enriched in the given GO term(s). Moreover, C2Analyzer can also identify the co-targeting miRNA pairs, their targets, and the GO processes in which they are involved. The miRNA–miRNA co-functional relationship can also be saved as a .txt file, which can be used to further visualize the co-functional network using other software such as Cytoscape. C2Analyzer is freely available at www.bioinformatics.org/c2analyzer. PMID:24862384
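    The hypergeometric test at the core of this kind of analysis can be reproduced with standard statistics libraries. The sketch below asks whether the overlap between two miRNAs' target sets is larger than chance; the set sizes are assumptions, and this is a generic illustration rather than C2Analyzer's Perl code.

    ```python
    # Sketch of the hypergeometric enrichment test behind co-target analysis.
    # All set sizes are assumed for illustration.
    from scipy.stats import hypergeom

    M = 20000    # genes in the background
    n = 300      # targets of miRNA A
    N = 250      # targets of miRNA B
    k = 40       # shared targets observed

    # P(overlap >= k) when N genes are drawn at random from M with n "successes"
    p_value = hypergeom.sf(k - 1, M, n, N)
    print(f"co-targeting enrichment p-value: {p_value:.3e}")
    ```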

  1. Class III malocclusion with complex problems of lateral open bite and severe crowding successfully treated with miniscrew anchorage and lingual orthodontic brackets.

    PubMed

    Yanagita, Takeshi; Kuroda, Shingo; Takano-Yamamoto, Teruko; Yamashiro, Takashi

    2011-05-01

    In this article, we report the successful use of miniscrews in a patient with an Angle Class III malocclusion, lateral open bite, midline deviation, and severe crowding. Simultaneously resolving such problems with conventional Class III treatment is difficult. In this case, the treatment procedure was even more challenging because the patient preferred to have lingual brackets on the maxillary teeth. As a result, miniscrews were used to facilitate significant asymmetric tooth movement in the posterior and downward directions; this contributed to the camouflage of the skeletal mandibular protrusion together with complete resolution of the severe crowding and lateral open bite. Analysis of the jaw motion showed that irregularities in chewing movement were also resolved, and a stable occlusion was achieved. Improvements in the facial profile and dental arches remained stable at the 18-month follow-up.

  2. Problem-Based Learning Tools

    ERIC Educational Resources Information Center

    Chin, Christine; Chia, Li-Gek

    2008-01-01

    One way of implementing project-based science (PBS) is to use problem-based learning (PBL), in which students formulate their own problems. These problems are often ill-structured, mirroring complex real-life problems where data are often messy and inconclusive. In this article, the authors describe how they used PBL in a ninth-grade biology class in…

  3. Nuclear fuel microsphere gamma analyzer

    DOEpatents

    Valentine, Kenneth H.; Long, Jr., Ernest L.; Willey, Melvin G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample in one station; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or according to other properties.

  4. Note: The intermodulation lockin analyzer

    SciTech Connect

    Tholen, Erik A.; Hutter, Carsten; Platz, Daniel; Forchheimer, Daniel; Haviland, David B.; Schuler, Vivien; Tholen, Mats O.

    2011-02-15

    Nonlinear systems can be probed by driving them with two or more pure tones while measuring the intermodulation products of the drive tones in the response. We describe a digital lockin analyzer which is designed explicitly for this purpose. The analyzer is implemented on a field-programmable gate array, providing speed in analysis, real-time feedback, and stability in operation. The use of the analyzer is demonstrated for intermodulation atomic force microscopy. A generalization of the intermodulation spectral technique to arbitrary drive waveforms is discussed.
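
    A small self-contained sketch of the two-tone idea, in Python rather than on an FPGA: drive a toy cubic nonlinearity with tones f1 and f2 and read the third-order intermodulation products (2*f1 - f2 and 2*f2 - f1) out of the response spectrum. All values are illustrative.

      # Two-tone intermodulation demo with a toy nonlinear system.
      import numpy as np

      fs = 10_000            # sample rate (Hz); values here are illustrative
      t = np.arange(0, 1.0, 1 / fs)
      f1, f2 = 500.0, 520.0  # two pure drive tones

      drive = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
      response = drive + 0.1 * drive**3          # toy cubic nonlinearity

      spectrum = np.abs(np.fft.rfft(response)) / len(t)
      freqs = np.fft.rfftfreq(len(t), 1 / fs)

      for f in (2 * f1 - f2, 2 * f2 - f1):       # third-order products
          k = np.argmin(np.abs(freqs - f))
          print(f"{f:6.0f} Hz : amplitude {spectrum[k]:.4f}")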

  5. Note: The intermodulation lockin analyzer.

    PubMed

    Tholén, Erik A; Platz, Daniel; Forchheimer, Daniel; Schuler, Vivien; Tholén, Mats O; Hutter, Carsten; Haviland, David B

    2011-02-01

    Nonlinear systems can be probed by driving them with two or more pure tones while measuring the intermodulation products of the drive tones in the response. We describe a digital lockin analyzer which is designed explicitly for this purpose. The analyzer is implemented on a field-programmable gate array, providing speed in analysis, real-time feedback, and stability in operation. The use of the analyzer is demonstrated for intermodulation atomic force microscopy. A generalization of the intermodulation spectral technique to arbitrary drive waveforms is discussed.

  6. Market study: Whole blood analyzer

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A market survey was conducted to develop findings relative to the commercialization potential and key market factors of the whole blood analyzer which is being developed in conjunction with NASA's Space Shuttle Medical System.

  7. Balance Problems

    MedlinePlus

    Have you ever felt dizzy, lightheaded, or … dizziness problem during the past year. Why Good Balance is Important: Having good balance means being able …

  8. Implementation of a multidisciplinary approach to solve complex nano EHS problems by the UC Center for the Environmental Implications of Nanotechnology.

    PubMed

    Xia, Tian; Malasarn, Davin; Lin, Sijie; Ji, Zhaoxia; Zhang, Haiyuan; Miller, Robert J; Keller, Arturo A; Nisbet, Roger M; Harthorn, Barbara H; Godwin, Hilary A; Lenihan, Hunter S; Liu, Rong; Gardea-Torresdey, Jorge; Cohen, Yoram; Mädler, Lutz; Holden, Patricia A; Zink, Jeffrey I; Nel, Andre E

    2013-05-27

    UC CEIN was established with funding from the US National Science Foundation and the US Environmental Protection Agency in 2008 with the mission to study the impact of nanotechnology on the environment, including the identification of hazard and exposure scenarios that take into consideration the unique physicochemical properties of engineered nanomaterials (ENMs). Since its inception, the Center has made great progress in assembling a multidisciplinary team and developing the scientific underpinnings, research, knowledge acquisition, education, and outreach required to assess the safe implementation of nanotechnology in the environment. This essay outlines the development of the infrastructure, protocols, and decision-making tools required to integrate complementary scientific disciplines and to gather knowledge in a complex study area that goes beyond the traditional safety and risk-assessment protocols of the 20th century. UC CEIN's streamlined approach, premised on predictive hazard and exposure assessment methods, high-throughput discovery platforms, and environmental decision-making tools that consider a wide range of nano/bio interfaces in terrestrial and aquatic ecosystems, demonstrates a 21st-century approach to the safe implementation of nanotechnology in the environment.

  9. A Stochastic Employment Problem

    ERIC Educational Resources Information Center

    Wu, Teng

    2013-01-01

    The Stochastic Employment Problem (SEP) is a variation of the Stochastic Assignment Problem which analyzes the scenario that one assigns balls into boxes. Balls arrive sequentially with each one having a binary vector X = (X1, X2, ..., Xn) attached, with the interpretation being that if Xi = 1 the ball…

  10. Analyzing Media: Metaphors as Methodologies.

    ERIC Educational Resources Information Center

    Meyrowitz, Joshua

    Students have little intuitive insight into the process of thinking and structuring ideas. An image or metaphor for a phenomenon acts as a kind of methodology for studying that phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…

  11. Cosmic dust analyzer for Cassini

    NASA Astrophysics Data System (ADS)

    Bradley, James G.; Gruen, Eberhard; Srama, Ralf

    1996-10-01

    The cosmic dust analyzer (CDA) is designed to characterize the dust environment in interplanetary space and in the Jovian and Saturnian systems. The instrument consists of two major components, the dust analyzer (DA) and the high rate detector (HRD). The DA has a large aperture to provide a large cross section for detection in low flux environments. The DA has the capability of determining dust particle mass, velocity, flight direction, charge, and chemical composition. The chemical composition is determined by the chemical analyzer system based on a time-of-flight mass spectrometer. The DA is capable of making full measurements up to one impact/second. The HRD contains two smaller PVDF detectors and electronics designed to characterize dust particle masses at impact rates up to 10^4 impacts/second. These high impact rates are expected during Saturn ring plane crossings.

  12. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, N.E.

    1984-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  13. Rotor for centrifugal fast analyzers

    DOEpatents

    Lee, Norman E.

    1985-01-01

    The invention is an improved photometric analyzer of the rotary cuvette type, the analyzer incorporating a multicuvette rotor of novel design. The rotor (a) is leaktight, (b) permits operation in the 90° and 180° excitation modes, (c) is compatible with extensively used Centrifugal Fast Analyzers, and (d) can be used thousands of times. The rotor includes an assembly comprising a top plate, a bottom plate, and a central plate, the rim of the central plate being formed with circumferentially spaced indentations. A UV-transmitting ring is sealably affixed to the indented rim to define with the indentations an array of cuvettes. The ring serves both as a sealing means and an end window for the cuvettes.

  14. On-Demand Urine Analyzer

    NASA Technical Reports Server (NTRS)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to development of effective exercise programs and drug regimens that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  15. An update on chemistry analyzers.

    PubMed

    Vap, L M; Mitzner, B

    1996-09-01

    This update of six chemistry analyzers available to the clinician discusses several points that should be considered prior to the purchase of equipment. General topics include how to best match an instrument to clinic needs and the indirect costs associated with instrument operation. Quality assurance recommendations are discussed and common terms are defined. Specific instrument features, principles of operation, performance, and costs are presented. The information provided offers potential purchasers an objective approach to the evaluation of a chemistry analyzer for the veterinary clinic.

  16. Real time infrared aerosol analyzer

    DOEpatents

    Johnson, Stanley A.; Reedy, Gerald T.; Kumar, Romesh

    1990-01-01

    Apparatus for analyzing aerosols in essentially real time includes a virtual impactor which separates coarse particles from fine and ultrafine particles in an aerosol sample. The coarse and ultrafine particles are captured in PTFE filters, and the fine particles impact onto an internal light reflection element. The composition and quantity of the particles on the PTFE filter and on the internal reflection element are measured by alternately passing infrared light through the filter and the internal light reflection element, and analyzing the light through infrared spectrophotometry to identify the particles in the sample.

  17. Using SCR methods to analyze requirements documentation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. This formal and systematic approach yields insights into whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the Operations Concept (OC) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.

  18. Development of pulse neutron coal analyzer

    NASA Astrophysics Data System (ADS)

    Jing, Shi-wei; Gu, De-shan; Qiao, Shuang; Liu, Yu-ren; Liu, Lin-mao

    2005-04-01

    This article describes the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector, and a 4096-channel multichannel analyzer were used in the system. Multiple linear regression was employed to process the data, resolving the problem of interference among multiple elements. The prototype (model MZ-MKFY) has been applied at the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as low calorific value, total water, ash content, volatile content, and sulfur content, with precision acceptable to the coal industry, are presented.
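
    A hedged sketch of the multiple-linear-regression step, assuming numpy; the "peak areas" and coefficients are synthetic stand-ins for the analyzer's real calibration data.

      # Calibrate a coal parameter (e.g., ash content) against several gamma
      # peak intensities that interfere with one another. Data are synthetic.
      import numpy as np

      rng = np.random.default_rng(1)
      peaks = rng.uniform(0.5, 2.0, size=(30, 4))        # 30 samples, 4 peak areas
      true_coef = np.array([3.0, -1.2, 0.8, 0.5])
      ash = peaks @ true_coef + 2.0 + rng.normal(0, 0.05, 30)  # lab-assayed values

      X = np.column_stack([np.ones(len(peaks)), peaks])  # intercept + peak areas
      coef, *_ = np.linalg.lstsq(X, ash, rcond=None)     # least-squares fit

      new_peaks = np.array([1.0, 1.5, 0.7, 1.1])
      estimate = coef[0] + new_peaks @ coef[1:]
      print(f"estimated ash content: {estimate:.2f} (arbitrary units)")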

  19. Strategies for Analyzing Tone Languages

    ERIC Educational Resources Information Center

    Coupe, Alexander R.

    2014-01-01

    This paper outlines a method of auditory and acoustic analysis for determining the tonemes of a language starting from scratch, drawing on the author's experience of recording and analyzing tone languages of north-east India. The methodology is applied to a preliminary analysis of tone in the Thang dialect of Khiamniungan, a virtually undocumented…

  20. Pollution Analyzing and Monitoring Instruments.

    ERIC Educational Resources Information Center

    1972

    Compiled in this book is basic, technical information useful in a systems approach to pollution control. Descriptions and specifications are given of what is available in ready made, on-the-line commercial equipment for sampling, monitoring, measuring and continuously analyzing the multitudinous types of pollutants found in the air, water, soil,…

  1. Helping Students Analyze Business Documents.

    ERIC Educational Resources Information Center

    Devet, Bonnie

    2001-01-01

    Notes that student writers gain greater insight into the importance of audience by analyzing business documents. Discusses how business writing teachers can help students understand the rhetorical refinements of writing to an audience. Presents an assignment designed to lead writers systematically through an analysis of two advertisements. (SG)

  2. Therapy Talk: Analyzing Therapeutic Discourse

    ERIC Educational Resources Information Center

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  3. Analyzing Classroom Instruction in Reading.

    ERIC Educational Resources Information Center

    Rutherford, William L.

    A method for analyzing instructional techniques employed during reading group instruction is reported, and the characteristics of the effective reading teacher are discussed. Teaching effectiveness is divided into two categories: (1) how the teacher acts and interacts with children on a personal level and (2) how the teacher performs his…

  4. Structural qualia: a solution to the hard problem of consciousness

    PubMed Central

    Loorits, Kristjan

    2014-01-01

    The hard problem of consciousness has often been claimed to be unsolvable by the methods of traditional empirical sciences. It has been argued that all the objects of empirical sciences can be fully analyzed in structural terms but that consciousness is (or has) something over and above its structure. However, modern neuroscience has introduced a theoretical framework in which even the apparently non-structural aspects of consciousness, namely the so-called qualia or qualitative properties, can be analyzed in structural terms. That framework allows us to see qualia as something compositional, with internal structures that fully determine their qualitative nature. Moreover, those internal structures can be identified with certain neural patterns. Thus consciousness as a whole can be seen as a complex neural pattern that misperceives some of its own highly complex structural properties as monadic and qualitative. Such a neural pattern is analyzable in fully structural terms, and thereby the hard problem is solved. PMID:24672510

  5. Development of a computation platform for complex chemical equilibria and its adaptation to electrochemical and constrained-equilibrium problems

    NASA Astrophysics Data System (ADS)

    Neron, Alex

    With the environment becoming a global issue, the energy-efficiency sector is taking on an increasingly important role for companies, both economically and for the company's image. At the same time, energy technology is a research niche in which ongoing projects are multiplying. One problem that frequently arises in some industries is measuring the composition of materials under hard-to-access conditions; aluminum electrolysis, which takes place at very high temperatures, is one example. To work around this problem, mathematical models must be created and validated that compute the equilibrium composition and properties of the chemical system. The overall objective of this research project is thus to develop a tool for computing complex chemical equilibria (several reactions and several phases) and to adapt it to electrochemical and constrained-equilibrium problems. More specifically, the computation platform must account for the temperature change due to an energy gain or loss in the system. It must also handle equilibria limited by a reaction rate and, finally, solve electrochemical equilibrium problems. To achieve this, thermodynamic properties such as the Gibbs free energy, fugacity, and activity are first studied to better understand the molecular interactions that govern chemical equilibria. An energy balance is then built into the computation platform, making it possible to calculate the temperature at which the system is most stable given an initial temperature and a quantity of exchanged energy. Next, a kinetic constraint is added to the system in order to compute pseudo-stationary equilibria evolving over time. In addition, the…
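
    A minimal sketch of the kind of computation such a platform performs, assuming a toy H2/O2/H2O system, rough standard chemical potentials, and scipy's SLSQP optimizer; it is not the thesis platform itself.

      # Constrained Gibbs-energy minimization for an ideal-gas mixture:
      # minimize G(n) subject to element balance A @ n = b, n >= 0.
      import numpy as np
      from scipy.optimize import minimize

      R, T = 8.314, 1500.0                       # J/(mol K), K
      mu0 = np.array([0.0, 0.0, -164_000.0])     # H2, O2, H2O (rough values, J/mol)
      # element balance: rows = H, O ; columns = H2, O2, H2O
      A = np.array([[2.0, 0.0, 2.0],
                    [0.0, 2.0, 1.0]])
      b = A @ np.array([1.0, 0.5, 0.0])          # start from 1 mol H2 + 0.5 mol O2

      def gibbs(n):
          n = np.clip(n, 1e-12, None)            # keep logarithms finite
          return float(np.sum(n * (mu0 + R * T * np.log(n / n.sum()))))

      res = minimize(gibbs, x0=np.array([0.4, 0.2, 0.6]), method="SLSQP",
                     constraints=[{"type": "eq", "fun": lambda n: A @ n - b}],
                     bounds=[(1e-12, None)] * 3)
      print("equilibrium moles (H2, O2, H2O):", np.round(res.x, 4))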

  6. High reliability FBG interrogation analyzers

    NASA Astrophysics Data System (ADS)

    Yang, William; Zhang, Charlie; Bergles, Eric

    2009-05-01

    The invention of optical fiber and semiconductor lasers in the 1960s opened up a cornucopia of applications, notably as a medium for carrying light signals in communications and sensing. Optical fibers provide a fundamental improvement over traditional methods, offering lower loss, higher bandwidth, immunity to electromagnetic interference (EMI), lighter weight, lower cost, and lower maintenance. By applying a UV laser to "burn" or write a diffraction grating (a fiber Bragg grating, FBG) into the fiber, it became possible to reflect certain wavelengths of light; used together with an interrogation analyzer (spectral analyzer), this allows precise sensing measurements to be taken. Recent developments in optoelectronic components in the optical telecommunications field have dramatically enhanced the capabilities of many components, such as light sources, fibers, detectors, optical amplifiers, mux/demuxes, and switches. As a result, numerous applications are now available for monitoring strain, stress, and pressure in harsh environments. Examples of current and planned deployments will be presented.
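
    For scale, the Bragg condition lambda_B = 2 * n_eff * Lambda implies that strain stretching the grating period shifts the reflected wavelength; a commonly quoted textbook sensitivity near 1550 nm is about 1.2 pm per microstrain. The figure and conversion below are typical values, not numbers from this record.

      # Convert a measured Bragg-wavelength shift to strain using a typical
      # 1550 nm sensitivity; the 1.2 pm/microstrain figure is a textbook
      # rule of thumb, not a value quoted by the authors.
      def strain_from_shift(shift_pm, sensitivity_pm_per_ue=1.2):
          """Return microstrain for a wavelength shift given in picometres."""
          return shift_pm / sensitivity_pm_per_ue

      print(f"{strain_from_shift(60.0):.0f} microstrain")  # 60 pm -> ~50 ue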

  7. Introduction: why analyze single cells?

    PubMed

    Di Carlo, Dino; Tse, Henry Tat Kwong; Gossett, Daniel R

    2012-01-01

    Powerful methods in molecular biology are abundant; however, in many fields including hematology, stem cell biology, tissue engineering, and cancer biology, data from tools and assays that analyze the average signals from many cells may not yield the desired result because the cells of interest may be in the minority (their behavior masked by the majority) or because the dynamics of the populations of interest are offset in time. Accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. In this chapter, we discuss the rationale for performing analyses on individual cells in more depth, cover the fields of study in which single-cell behavior is yielding new insights into biological and clinical questions, and speculate on how single-cell analysis will be critical in the future.

  8. DEEP WATER ISOTOPIC CURRENT ANALYZER

    DOEpatents

    Johnston, W.H.

    1964-04-21

    A deepwater isotopic current analyzer, which employs radioactive isotopes for measurement of ocean currents at various levels beneath the sea, is described. The apparatus, which can determine the direction and velocity of liquid currents, comprises a shaft having a plurality of radiation detectors extending equidistant radially therefrom, means for releasing radioactive isotopes from the shaft, and means for determining the time required for the isotope to reach a particular detector. (AEC)
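
    The geometry reduces to simple arithmetic: the current's bearing is that of the first detector the tracer reaches, and its speed is the detector radius over the transit time. A hedged sketch with illustrative values:

      # Direction and speed from which radial detector fires first and when.
      # Detector count, radius, and timing below are illustrative only.
      def current_vector(first_detector, n_detectors, radius_m, transit_s):
          bearing_deg = 360.0 * first_detector / n_detectors  # detector bearing
          speed = radius_m / transit_s                        # m/s
          return bearing_deg, speed

      bearing, speed = current_vector(first_detector=3, n_detectors=12,
                                      radius_m=1.5, transit_s=30.0)
      print(f"bearing {bearing:.0f} deg, speed {speed:.2f} m/s")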

  9. Problem solving using soft systems methodology.

    PubMed

    Land, L

    This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.

  10. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    PubMed Central

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
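
    The paper demonstrates the procedure in R; the following is a hedged Python analogue with simulated data: split a large sample, estimate a correlation in each split, convert to Fisher's z, and pool with inverse-variance (fixed-effect) weights.

      # Split/analyze/meta-analyze on a simulated dataset.
      import numpy as np

      rng = np.random.default_rng(42)
      n_total, n_splits = 1_000_000, 10
      x = rng.normal(size=n_total)
      y = 0.3 * x + rng.normal(size=n_total)         # true correlation ~0.29

      zs, ws = [], []
      for chunk in np.array_split(np.arange(n_total), n_splits):
          r = np.corrcoef(x[chunk], y[chunk])[0, 1]  # analyze one split
          zs.append(np.arctanh(r))                   # Fisher's z transform
          ws.append(len(chunk) - 3)                  # inverse-variance weight

      z_pooled = np.average(zs, weights=ws)          # meta-analyze
      se = 1 / np.sqrt(np.sum(ws))
      print(f"pooled r = {np.tanh(z_pooled):.3f} "
            f"(95% CI {np.tanh(z_pooled - 1.96*se):.3f}"
            f"..{np.tanh(z_pooled + 1.96*se):.3f})")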

  11. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    PubMed

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists (and probably the most crucial one) is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  12. Walking Problems

    MedlinePlus

    Causes of walking problems include Parkinson's disease, diseases such as arthritis or multiple sclerosis, and vision or balance problems. Treatment of walking problems depends on the cause. Physical therapy, surgery, or mobility aids may help.

  13. ITK and ANALYZE: a synergistic integration

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Robb, Richard A.

    2004-05-01

    The Insight Toolkit (ITK) is a C++ open-source software toolkit developed under sponsorship of the National Library of Medicine. It provides advanced algorithms for performing image registration and segmentation, but does not provide support for visualization and analysis, nor does it offer any graphical user interface (GUI). The purpose of this integration project is to make ITK readily accessible to end-users with little or no programming skills, and provide interactive processing, visualization and measurement capabilities. This is achieved through the integration of ITK with ANALYZE, a multi-dimension image visualization/analysis application installed in over 300 institutions around the world, with a user-base in excess of 4000. This integration is carried out at both the software foundation and GUI levels. The foundation technology upon which ANALYZE is built is a comprehensive C-function library called AVW. A new set of AVW-ITK functions have been developed and integrated into the AVW library, and four new ITK modules have been added to the ANALYZE interface. Since ITK is a software developer's toolkit, the only way to access its intrinsic power is to write programs that incorporate it. Integrating ITK with ANALYZE opens the ITK algorithms to end-users who otherwise might never be able to take advantage of the toolkit's advanced functionality. In addition, this integration provides end-to-end interactive problem solving capabilities which allow all users, including programmers, an integrated system to readily display and quantitatively evaluate the results from the segmentation and registration routines in ITK, regardless of the type or format of input images, which are comprehensively supported in ANALYZE.
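
    The record concerns GUI-level integration, but for orientation, here is a hedged sketch of what calling an ITK segmentation routine looks like from Python via the SimpleITK wrapper. This is an illustrative stand-in, not the ANALYZE/AVW integration itself, and "scan.nii" is a placeholder file name.

      # Minimal ITK-style segmentation via the SimpleITK wrapper.
      import SimpleITK as sitk

      image = sitk.ReadImage("scan.nii")            # any format ITK supports
      mask = sitk.OtsuThreshold(image, 0, 1)        # automatic 2-class split
      closed = sitk.BinaryMorphologicalClosing(mask, [2, 2, 2])  # tidy mask
      sitk.WriteImage(closed, "scan_segmented.nii")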

  14. The Aqueduct Global Flood Analyzer

    NASA Astrophysics Data System (ADS)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  15. MULTICHANNEL PULSE-HEIGHT ANALYZER

    DOEpatents

    Russell, J.T.; Lefevre, H.W.

    1958-01-21

    This patent deals with electronic computing circuits, and more particularly with pulse-height analyzers used for classifying variable-amplitude pulses into groups of different amplitudes. The device accomplishes this pulse allocation by converting the pulses into frequencies corresponding to the amplitudes of the pulses, which frequencies are filtered in channels individually pretuned to a particular frequency and then detected and recorded in the responsive channel. This circuit substantially overcomes the disadvantages of prior analyzers incorporating discriminators pre-set to respond to certain voltage levels, since small variation in component values is not as critical to satisfactory circuit operation.
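
    The analog amplitude-to-frequency scheme of the patent corresponds, in modern digital terms, to histogramming pulse heights into channels. A hedged sketch with synthetic amplitudes:

      # Digital pulse-height analysis: bin pulse amplitudes into channels.
      # The two Gaussian "lines" below are synthetic, for illustration only.
      import numpy as np

      rng = np.random.default_rng(7)
      pulse_heights = np.concatenate([rng.normal(0.66, 0.02, 5000),
                                      rng.normal(1.17, 0.03, 3000)])

      counts, edges = np.histogram(pulse_heights, bins=256, range=(0.0, 2.0))
      peak = int(np.argmax(counts))
      print(f"busiest channel {peak}: {edges[peak]:.3f}-{edges[peak + 1]:.3f} V, "
            f"{counts[peak]} pulses")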

  16. Metabolic analyzer. [for Skylab mission

    NASA Technical Reports Server (NTRS)

    Perry, C. L.

    1973-01-01

    An apparatus is described for the measurement of metabolic rate and breathing dynamics in which inhaled and exhaled breath are sensed by sealed, piston-displacement type spirometers. These spirometers electrically measure the volume of inhaled and exhaled breath. A mass spectrometer analyzes simultaneously for oxygen, carbon dioxide, nitrogen, and water vapor. Circuits responsive to the outputs of the spirometers, mass spectrometer, temperature, pressure, and timing signals compute oxygen consumption, carbon dioxide production, minute volume, and respiratory exchange ratio. A selective indicator provides for readout of these data at predetermined cyclic intervals.
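
    A hedged sketch of the quantities the analyzer computes, using the standard open-circuit expressions; the volumes and gas fractions are made-up example values, not Skylab data.

      # Oxygen consumption, CO2 production, minute volume, and respiratory
      # exchange ratio from spirometer volumes and gas fractions.
      def metabolic_summary(v_insp, v_exp, fio2, feo2, fico2, feco2, minutes):
          vo2 = v_insp * fio2 - v_exp * feo2        # O2 consumed (L)
          vco2 = v_exp * feco2 - v_insp * fico2     # CO2 produced (L)
          return {
              "VO2 (L/min)": vo2 / minutes,
              "VCO2 (L/min)": vco2 / minutes,
              "minute volume (L/min)": v_exp / minutes,
              "RER": vco2 / vo2,                    # respiratory exchange ratio
          }

      print(metabolic_summary(v_insp=300.0, v_exp=298.0,
                              fio2=0.2093, feo2=0.17,
                              fico2=0.0004, feco2=0.032, minutes=30))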

  17. The OpenSHMEM Analyzer

    SciTech Connect

    Hernandez, Oscar

    2014-07-30

    The OpenSHMEM Analyzer is a compiler-based tool that can help users detect errors and provide useful analyses about their OpenSHMEM applications. The tool is built on top of the OpenUH compiler (a branch of Open64 compiler) and presents OpenSHMEM information as feedback to the user. Some of the analyses it provides include checks for correct usage of symmetric variables in OpenSHMEM calls, out-of-bounds checks for symmetric data, checks for the correct initialization of pointers to symmetric data, and symmetric data alias information.

  18. Trace Gas Analyzer (TGA) program

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The design, fabrication, and test of a breadboard trace gas analyzer (TGA) is documented. The TGA is a gas chromatograph/mass spectrometer system. The gas chromatograph subsystem employs a recirculating hydrogen carrier gas. The recirculation feature minimizes the requirement for transport and storage of large volumes of carrier gas during a mission. The silver-palladium hydrogen separator which permits the removal of the carrier gas and its reuse also decreases vacuum requirements for the mass spectrometer since the mass spectrometer vacuum system need handle only the very low sample pressure, not sample plus carrier. System performance was evaluated with a representative group of compounds.

  19. Charged particle mobility refrigerant analyzer

    DOEpatents

    Allman, S.L.; Chunghsuan Chen; Chen, F.C.

    1993-02-02

    A method for analyzing a gaseous electronegative species comprises the steps of providing an analysis chamber; providing an electric field of known potential within the analysis chamber; admitting into the analysis chamber a gaseous sample containing the gaseous electronegative species; providing a pulse of free electrons within the electric field so that the pulse of free electrons interacts with the gaseous electronegative species so that a swarm of electrically charged particles is produced within the electric field; and, measuring the mobility of the electrically charged particles within the electric field.
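
    The measurement reduces to mobility = drift velocity / field strength; a minimal sketch with illustrative geometry and timing:

      # Charged-particle mobility from time-of-flight in a known field.
      # Distance, time, and field values below are illustrative only.
      def mobility(drift_distance_m, drift_time_s, field_v_per_m):
          v = drift_distance_m / drift_time_s       # mean drift velocity (m/s)
          return v / field_v_per_m                  # mobility, m^2 / (V s)

      mu = mobility(drift_distance_m=0.05, drift_time_s=2.5e-2,
                    field_v_per_m=1.0e4)
      print(f"mobility = {mu:.2e} m^2/(V s)")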

  20. Method for analyzing microbial communities

    DOEpatents

    Zhou, Jizhong [Oak Ridge, TN; Wu, Liyou [Oak Ridge, TN

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.