Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendation are part of an effort to improve Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisition. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seem sound and has good near term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
ERIC Educational Resources Information Center
Soh, Kaycheng
2013-01-01
Recent research into university ranking methodologies uncovered several methodological problems among the systems currently in vogue. One of these is the discrepancy between the nominal and attained weights. The problem is the summation of unstandardized indicators for the total scores used in ranking. It is demonstrated that weight discrepancy…
A Methodological Critique of "Interventions for Boys with Conduct Problems"
ERIC Educational Resources Information Center
Kent, Ronald; And Others
1976-01-01
Kent criticizes Patterson's study on treating the behavior problems of boys, on several methodological bases concluding that more rigorous research is required in this field. Patterson answers Kent's criticisms arguing that they are not based on sound grounds. Patterson offers further evidence to support the efficacy of his treatment procedures.…
Helicopter-V/STOL dynamic wind and turbulence design methodology
NASA Technical Reports Server (NTRS)
Bailey, J. Earl
1987-01-01
Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for a aircraft wind and turbulence design began in the 1940's and 1950's. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst type phenomena) during takeoff and landing remain as major unsolved design problems from a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.
ERIC Educational Resources Information Center
Danov, Stacy E.; Tervo, Raymond; Meyers, Stephanie; Symons, Frank J.
2012-01-01
The atypical antipsychotic medication aripiprazole was evaluated using a randomized AB multiple baseline, double-blind, placebo-controlled design for the treatment of severe problem behavior with 4 children with intellectual and developmental disabilities. Functional analysis (FA) was conducted concurrent with the medication evaluation to…
NASA Technical Reports Server (NTRS)
Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate applicability of the proposed formulations to nonlinear dynamic structures. Different from the existing numerical methods such as direct time integrations or mode superposition techniques, the proposed methodology offers new perspectives and methodology of development, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) through a nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
NASA Astrophysics Data System (ADS)
Fulkerson, David E.
2010-02-01
This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new and simple methodology handles the problem with ease by simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
NASA Technical Reports Server (NTRS)
Nakajima, Yukio; Padovan, Joe
1987-01-01
In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structure composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving problems involving contact impact type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered. These include the rolling/sliding impact of tires with road obstructions.
Using the CPGI to Determine Problem Gambling Prevalence in Australia: Measurement Issues
ERIC Educational Resources Information Center
Jackson, Alun C.; Wynne, Harold; Dowling, Nicki A.; Tomnay, Jane E.; Thomas, Shane A.
2010-01-01
Most states and territories in Australia have adopted the Problem Gambling Severity Index (PGSI) of the Canadian Problem Gambling Index as the standard measure of problem gambling in their prevalence studies and research programs. However, notwithstanding this attempted standardisation, differences in sampling and recruitment methodologies and in…
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
[Problem-based learning in cardiopulmonary resuscitation: basic life support].
Sardo, Pedro Miguel Garcez; Dal Sasso, Grace Terezinha Marcon
2008-12-01
Descriptive and exploratory study, aimed to develop an educational practice of Problem-Based Learning in CPR/BLS with 24 students in the third stage of the Nursing Undergraduate Course in a University in the Southern region of Brazil. The study used the PBL methodology, focused on problem situations of cardiopulmonary arrest, and was approved by the CONEP. The methodological strategies for data collection, such as participative observation and questionnaires to evaluate the learning, the educational practices and their methodology, allowed for grouping the results in: students' expectations; group activities; individual activities; practical activities; evaluation of the meetings and their methodology. The study showed that PBL allows the educator to evaluate the academic learning process in several dimensions, functioning as a motivating factor for both the educator and the student, because it allows the theoretical-practical integration in an integrated learning process.
Case study of a problem-based learning course of physics in a telecommunications engineering degree
NASA Astrophysics Data System (ADS)
Macho-Stadler, Erica; Jesús Elejalde-García, Maria
2013-08-01
Active learning methods can be appropriate in engineering, as their methodology promotes meta-cognition, independent learning and problem-solving skills. Problem-based learning is the educational process by which problem-solving activities and instructor's guidance facilitate learning. Its key characteristic involves posing a 'concrete problem' to initiate the learning process, generally implemented by small groups of students. Many universities have developed and used active methodologies successfully in the teaching-learning process. During the past few years, the University of the Basque Country has promoted the use of active methodologies through several teacher training programmes. In this paper, we describe and analyse the results of the educational experience using the problem-based learning (PBL) method in a physics course for undergraduates enrolled in the technical telecommunications engineering degree programme. From an instructors' perspective, PBL strengths include better student attitude in class and increased instructor-student and student-student interactions. The students emphasised developing teamwork and communication skills in a good learning atmosphere as positive aspects.
NASA Technical Reports Server (NTRS)
Weissenberger, S. (Editor)
1973-01-01
A systems engineering approach is reported for the problem of reducing the number and severity of California's wildlife fires. Prevention methodologies are reviewed and cost benefit models are developed for making preignition decisions.
Application of Design Methodologies for Feedback Compensation Associated with Linear Systems
NASA Technical Reports Server (NTRS)
Smith, Monty J.
1996-01-01
The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well behaved closed loop system in terms of stability and robustness (internal signals remain bounded with a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H 2-H performance criteria with algorithm has been used on several examples including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.
ERIC Educational Resources Information Center
Jennings, Jerry L.; Apsche, Jack A.; Blossom, Paige; Bayles, Corliss
2013-01-01
Although mindfulness has become a mainstream methodology in mental health treatment, it is a relatively new approach with adolescents, and perhaps especially youth with sexual behavior problems. Nevertheless, clinical experience and several empirical studies are available to show the effectiveness of a systematic mindfulness- based methodology for…
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) and cascade control based on Smith predictor. The particleboard glue supplying and dosing system case study defines the problem and the solution using the methodology proposed in the paper. Status difference existing in the gluing dosing process of particleboard production usually causes gluing volume inaccurately. In order to solve the problem above, we applied the TRIZ technical contradiction and inventive principle to improve the key process of particleboard production. The improving method mapped inaccurate problem to TRIZ technical contradiction, the prior action proposed Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or reoccurring problems and also testify TRIZ is practicality and validity. Several suggestions are presented on how to approach this problem.
Inverse problems in quantum chemistry
NASA Astrophysics Data System (ADS)
Karwowski, Jacek
Inverse problems constitute a branch of applied mathematics with well-developed methodology and formalism. A broad family of tasks met in theoretical physics, in civil and mechanical engineering, as well as in various branches of medical and biological sciences has been formulated as specific implementations of the general theory of inverse problems. In this article, it is pointed out that a number of approaches met in quantum chemistry can (and should) be classified as inverse problems. Consequently, the methodology used in these approaches may be enriched by applying ideas and theorems developed within the general field of inverse problems. Several examples, including the RKR method for the construction of potential energy curves, determining parameter values in semiempirical methods, and finding external potentials for which the pertinent Schrödinger equation is exactly solvable, are discussed in detail.
Building an adaptive agent to monitor and repair the electrical power system of an orbital satellite
NASA Technical Reports Server (NTRS)
Tecuci, Gheorghe; Hieb, Michael R.; Dybala, Tomasz
1995-01-01
Over several years we have developed a multistrategy apprenticeship learning methodology for building knowledge-based systems. Recently we have developed and applied our methodology to building intelligent agents. This methodology allows a subject matter expert to build an agent in the same way in which the expert would teach a human apprentice. The expert will give the agent specific examples of problems and solutions, explanations of these solutions, or supervise the agent as it solves new problems. During such interactions, the agent learns general rules and concepts, continuously extending and improving its knowledge base. In this paper we present initial results on applying this methodology to build an intelligent adaptive agent for monitoring and repair of the electrical power system of an orbital satellite, stressing the interaction with the expert during apprenticeship learning.
Decomposition of timed automata for solving scheduling problems
NASA Astrophysics Data System (ADS)
Nishi, Tatsushi; Wakatake, Masato
2014-03-01
A decomposition algorithm for scheduling problems based on timed automata (TA) model is proposed. The problem is represented as an optimal state transition problem for TA. The model comprises of the parallel composition of submodels such as jobs and resources. The procedure of the proposed methodology can be divided into two steps. The first step is to decompose the TA model into several submodels by using decomposable condition. The second step is to combine individual solution of subproblems for the decomposed submodels by the penalty function method. A feasible solution for the entire model is derived through the iterated computation of solving the subproblem for each submodel. The proposed methodology is applied to solve flowshop and jobshop scheduling problems. Computational experiments demonstrate the effectiveness of the proposed algorithm compared with a conventional TA scheduling algorithm without decomposition.
An integrated methodology to assess the benefits of urban green space.
De Ridder, K; Adamec, V; Bañuelos, A; Bruse, M; Bürger, M; Damsgaard, O; Dufek, J; Hirsch, J; Lefebre, F; Pérez-Lacorzana, J M; Thierry, A; Weber, C
2004-12-01
The interrelated issues of urban sprawl, traffic congestion, noise, and air pollution are major socioeconomic problems faced by most European cities. A methodology is currently being developed for evaluating the role of green space and urban form in alleviating the adverse effects of urbanisation, mainly focusing on the environment but also accounting for socioeconomic aspects. The objectives and structure of the methodology are briefly outlined and illustrated with preliminary results obtained from case studies performed on several European cities.
"Sustainability on Earth" Webquests: Do They Qualify as Problem-Based Learning Activities?
ERIC Educational Resources Information Center
Leite, Laurinda; Dourado, Luís; Morgado, Sofia
2015-01-01
Information and communication technologies (ICT), namely the Internet, can play a valuable educational role in several school subjects, including science education. The same applies to problem-based learning (PBL), that is, a student-centered active learning methodology that can prepare students for lifelong learning. WebQuests (WQs) combine PBL…
Advancements in medicine from aerospace research
NASA Technical Reports Server (NTRS)
Wooten, F. T.
1971-01-01
A program designed to find ways of transferring space technology to non-space medicine is discussed. The methodology used to attack the problem and several illustrative examples of the results are given.
Brain Dynamics: Methodological Issues and Applications in Psychiatric and Neurologic Diseases
NASA Astrophysics Data System (ADS)
Pezard, Laurent
The human brain is a complex dynamical system generating the EEG signal. Numerical methods developed to study complex physical dynamics have been used to characterize EEG since the mid-eighties. This endeavor raised several issues related to the specificity of EEG. Firstly, theoretical and methodological studies should address the major differences between the dynamics of the human brain and physical systems. Secondly, this approach of EEG signal should prove to be relevant for dealing with physiological or clinical problems. A set of studies performed in our group is presented here within the context of these two problematic aspects. After the discussion of methodological drawbacks, we review numerical simulations related to the high dimension and spatial extension of brain dynamics. Experimental studies in neurologic and psychiatric disease are then presented. We conclude that if it is now clear that brain dynamics changes in relation with clinical situations, methodological problems remain largely unsolved.
Expert Systems Development Methodology
1989-07-28
application. Chapter 9, Design and Prototyping, discusses the problems of designing the user interface and other characteristics of the ES and the prototyping...severely in question as to whether they were computable. In order to work with this problem , Turing created what he called the universal machine. These...about the theory of computers and their relationship to problem solving. It was here at Princeton that he first began to experiment directly with
A Social-Medical Approach to Violence in Colombia
Franco, Saul
2003-01-01
Violence is the main public health problem in Colombia. Many theoretical and methodological approaches to solving this problem have been attempted from different disciplines. My past work has focused on homicide violence from the perspective of social medicine. In this article I present the main conceptual and methodological aspects and the chief findings of my research over the past 15 years. Findings include a quantitative description of the current situation and the introduction of the category of explanatory contexts as a contribution to the study of Colombian violence. The complexity and severity of this problem demand greater theoretical discussion, more plans for action and a faster transition between the two. Social medicine may make a growing contribution to this field. PMID:14652328
A social-medical approach to violence in Colombia.
Franco, Saul
2003-12-01
Violence is the main public health problem in Colombia. Many theoretical and methodological approaches to solving this problem have been attempted from different disciplines. My past work has focused on homicide violence from the perspective of social medicine. In this article I present the main conceptual and methodological aspects and the chief findings of my research over the past 15 years. Findings include a quantitative description of the current situation and the introduction of the category of explanatory contexts as a contribution to the study of Colombian violence. The complexity and severity of this problem demand greater theoretical discussion, more plans for action and a faster transition between the two. Social medicine may make a growing contribution to this field.
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large scale software projects includes synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provide for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
NASA Astrophysics Data System (ADS)
Neumann, Karl
1987-06-01
In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, the research has generally to start out from complex reciprocal social interactions instead of unambigious connections of causes. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages in and the necessity for using qualitative research tools.
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems
NASA Technical Reports Server (NTRS)
Song, Lixia; Kuchar, James K.
2003-01-01
Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.
NASA Technical Reports Server (NTRS)
Hyland, D. C.; Bernstein, D. S.
1987-01-01
The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.
A methodological proposal for the development of an HPC-based antenna array scheduler
NASA Astrophysics Data System (ADS)
Bonvallet, Roberto; Hoffstadt, Arturo; Herrera, Diego; López, Daniela; Gregorio, Rodrigo; Almuna, Manuel; Hiriart, Rafael; Solar, Mauricio
2010-07-01
As new astronomy projects choose interferometry to improve angular resolution and to minimize costs, preparing and optimizing schedules for an antenna array becomes an increasingly critical task. This problem shares similarities with the job-shop problem, which is known to be a NP-hard problem, making a complete approach infeasible. In the case of ALMA, 18000 projects per season are expected, and the best schedule must be found in the order of minutes. The problem imposes severe difficulties: the large domain of observation projects to be taken into account; a complex objective function, composed of several abstract, environmental, and hardware constraints; the number of restrictions imposed and the dynamic nature of the problem, as weather is an ever-changing variable. A solution can benefit from the use of High-Performance Computing for the final implementation to be deployed, but also for the development process. Our research group proposes the use of both metaheuristic search and statistical learning algorithms, in order to create schedules in a reasonable time. How these techniques will be applied is yet to be determined as part of the ongoing research. Several algorithms need to be implemented, tested and evaluated by the team. This work presents the methodology proposed to lead the development of the scheduler. The basic functionality is encapsulated into software components implemented on parallel architectures. These components expose a domain-level interface to the researchers, enabling then to develop early prototypes for evaluating and comparing their proposed techniques.
Tractenberg, Saulo G; Levandowski, Mateus L; de Azeredo, Lucas Araújo; Orso, Rodrigo; Roithmann, Laura G; Hoffmann, Emerson S; Brenhouse, Heather; Grassi-Oliveira, Rodrigo
2016-09-01
Early life stress (ELS) developmental effects have been widely studied by preclinical researchers. Despite the growing body of evidence from ELS models, such as the maternal separation paradigm, the reported results have marked inconsistencies. The maternal separation model has several methodological pitfalls that could influence the reliability of its results. Here, we critically review 94 mice studies that addressed the effects of maternal separation on behavioural outcomes. We also discuss methodological issues related to the heterogeneity of separation protocols and the quality of reporting methods. Our findings indicate a lack of consistency in maternal separation effects: major studies of behavioural and biological phenotypes failed to find significant deleterious effects. Furthermore, we identified several specific variations in separation methodological procedures. These methodological variations could contribute to the inconsistency of maternal separation effects by producing different degrees of stress exposure in maternal separation-reared pups. These methodological problems, together with insufficient reporting, might lead to inaccurate and unreliable effect estimates in maternal separation studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Elad, M; Feuer, A
1997-01-01
The three main tools in the single image restoration theory are the maximum likelihood (ML) estimator, the maximum a posteriori probability (MAP) estimator, and the set theoretic approach using projection onto convex sets (POCS). This paper utilizes the above known tools to propose a unified methodology toward the more complicated problem of superresolution restoration. In the superresolution restoration problem, an improved resolution image is restored from several geometrically warped, blurred, noisy and downsampled measured images. The superresolution restoration problem is modeled and analyzed from the ML, the MAP, and POCS points of view, yielding a generalization of the known superresolution restoration methods. The proposed restoration approach is general but assumes explicit knowledge of the linear space- and time-variant blur, the (additive Gaussian) noise, the different measured resolutions, and the (smooth) motion characteristics. A hybrid method combining the simplicity of the ML and the incorporation of nonellipsoid constraints is presented, giving improved restoration performance, compared with the ML and the POCS approaches. The hybrid method is shown to converge to the unique optimal solution of a new definition of the optimization problem. Superresolution restoration from motionless measurements is also discussed. Simulations demonstrate the power of the proposed methodology.
The kidney allocation score: methodological problems, moral concerns and unintended consequences.
Hippen, B
2009-07-01
The growing disparity between the demand for and supply of kidneys for transplantation has generated interest in alternative systems of allocating kidneys from deceased donors. This personal viewpoint focuses attention on the Kidney Allocation Score (KAS) proposal promulgated by the UNOS/OPTN Kidney Committee. I identify several methodological and moral flaws in the proposed system, concluding that any iteration of the KAS proposal should be met with more skepticism than sanguinity.
Software Risk Identification for Interplanetary Probes
NASA Technical Reports Server (NTRS)
Dougherty, Robert J.; Papadopoulos, Periklis E.
2005-01-01
The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that are using increasingly complex and critical software. Several probe failures are examined that suggest more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have lead to the development of a practical methodology to identify risks for interplanetary probes. The proposed methodology is based upon the tailoring of the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. The use of this methodology will ensure a more consistent and complete identification of software risks in these probes.
Decision-theoretic methodology for reliability and risk allocation in nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.
1985-01-01
This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferencesmore » could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided and several outstanding issues such as generic allocation and preference assessment are discussed.« less
Development of Methodology for Programming Autonomous Agents
NASA Technical Reports Server (NTRS)
Erol, Kutluhan; Levy, Renato; Lang, Lun
2004-01-01
A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to describe such interfaces unambivalently
ERIC Educational Resources Information Center
Perez, Marcel
This study presents a model for teaching a French conversation course on the college level. The research is based on French language classes in Quebec general education and professional colleges (CEGEP). The first part states the problem, examines several programs, describes the organization of the conversation classes, presents several language…
Taylor, J V; DiBennardo, R; Linares, G H; Goldman, A D; DeForest, P R
1984-07-01
A case study is presented to demonstrate the utility of the team approach to the identification of human remains, and to illustrate a methodological innovation developed by MFAT. Case 1 represents the first of several planned case studies, each designed to present new methodological solutions to standard problems in identification. The present case describes a test, by application, of race and sex assessment of the postcranial skeleton by discriminant function analysis.
Innovative Technologies for Multicultural Education Needs
ERIC Educational Resources Information Center
Ferdig, Richard E.; Coutts, Jade; DiPietro, Joseph; Lok, Benjamin; Davis, Niki
2007-01-01
Purpose: The purpose of this paper is to discuss several technology applications that are being used to address current problems or opportunities related to multicultural education. Design/methodology/approach: Five technology applications or technology-related projects are discussed, including a teacher education literacy tool, social networking…
McKay, J R; Weiss, R V
2001-04-01
This article is an initial report from a review of alcohol and drug treatment studies with follow-ups of 2 years or more. The goals of the review are to examine the stability of substance use outcomes and the factors that moderate or mediate these outcomes. Results from 12 studies that generated multiple research reports are presented, and methodological problems encountered in the review are discussed. Substance use outcomes at the group level were generally stable, although moderate within-subject variation in substance use status over time was observed. Of factors assessed at baseline, psychiatric severity was a significant predictor of outcome in the highest percentage of reports, although the nature of the relationship varied. Stronger motivation and coping at baseline also consistently predicted better drinking outcomes. Better progress while in treatment, and the performance of pro-recovery behaviors and low problem severity in associated areas following treatment, consistently predicted better substance use outcomes.
Fuzzy Multi-Objective Vendor Selection Problem with Modified S-CURVE Membership Function
NASA Astrophysics Data System (ADS)
Díaz-Madroñero, Manuel; Peidro, David; Vasant, Pandian
2010-06-01
In this paper, the S-Curve membership function methodology is used in a vendor selection (VS) problem. An interactive method for solving multi-objective VS problems with fuzzy goals is developed. The proposed method attempts simultaneously to minimize the total order costs, the number of rejected items and the number of late delivered items with reference to several constraints such as meeting buyers' demand, vendors' capacity, vendors' quota flexibility, vendors' allocated budget, etc. We compare in an industrial case the performance of S-curve membership functions, representing uncertainty goals and constraints in VS problems, with linear membership functions.
Guina, Jeffrey; Nahhas, Ramzi W.; Goldberg, Adam J.; Farnsworth, Seth
2016-01-01
Background: Trauma is commonly associated with substance-related problems, yet associations between specific substances and specific posttraumatic stress disorder symptoms (PTSSs) are understudied. We hypothesized that substance-related problems are associated with PTSS severities, interpersonal traumas, and benzodiazepine prescriptions. Methods: Using a cross-sectional survey methodology in a consecutive sample of adult outpatients with trauma histories (n = 472), we used logistic regression to examine substance-related problems in general (primary, confirmatory analysis), as well as alcohol, tobacco, and illicit drug problems specifically (secondary, exploratory analyses) in relation to demographics, trauma type, PTSSs, and benzodiazepine prescriptions. Results: After adjusting for multiple testing, several factors were significantly associated with substance-related problems, particularly benzodiazepines (AOR = 2.78; 1.99 for alcohol, 2.42 for tobacco, 8.02 for illicit drugs), DSM-5 PTSD diagnosis (AOR = 1.92; 2.38 for alcohol, 2.00 for tobacco, 2.14 for illicit drugs), most PTSSs (especially negative beliefs, recklessness, and avoidance), and interpersonal traumas (e.g., assaults and child abuse). Conclusion: In this clinical sample, there were consistent and strong associations between several trauma-related variables and substance-related problems, consistent with our hypotheses. We discuss possible explanations and implications of these findings, which we hope will stimulate further research, and improve screening and treatment. PMID:27517964
The inverse problem of brain energetics: ketone bodies as alternative substrates
NASA Astrophysics Data System (ADS)
Calvetti, D.; Occhipinti, R.; Somersalo, E.
2008-07-01
Little is known about brain energy metabolism under ketosis, although there is evidence that ketone bodies have a neuroprotective role in several neurological disorders. We investigate the inverse problem of estimating reaction fluxes and transport rates in the different cellular compartments of the brain, when the data amounts to a few measured arterial venous concentration differences. By using a recently developed methodology to perform Bayesian Flux Balance Analysis and a new five compartment model of the astrocyte-glutamatergic neuron cellular complex, we are able to identify the preferred biochemical pathways during shortage of glucose and in the presence of ketone bodies in the arterial blood. The analysis is performed in a minimally biased way, therefore revealing the potential of this methodology for hypothesis testing.
A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.
Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut
2017-08-01
Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Still today their quantification and taxonomic classification bear several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods, the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single cell survey called aliquot PCR (aPCR). All these methods have been tested either using aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in every single case. With the live-counting technique a detection of living cells up to morphospecies level is possible. Fixation of cells and staining methods are advantageous due to the possible long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools might complete the deficiency of LAM in terms of the missing detection of non-cultivable flagellates. In summary, we propose a combination of several investigation techniques reducing the gap between the different methodological problems. Copyright © 2017 Elsevier GmbH. All rights reserved.
Methodological difficulties of conducting agroecological studies from a statistical perspective
USDA-ARS?s Scientific Manuscript database
Statistical methods for analysing agroecological data might not be able to help agroecologists to solve all of the current problems concerning crop and animal husbandry, but such methods could well help agroecologists to assess, tackle, and resolve several agroecological issues in a more reliable an...
Challenging the Stereotypes of Mexican American Fathers
ERIC Educational Resources Information Center
Saracho, Olivia N.; Spodek, Bernard
2007-01-01
This critical review presents studies of Mexican American fathers in the United Sates to provide researchers with an understanding of contemporary fatherhood. It describes the myths that cause methodological and conceptual problems in interpreting the results of studies on Mexican American fathers. Several common challenges and limitations in…
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
Robinson, Sean M; Sobell, Linda Carter; Sobell, Mark B; Arcidiacono, Steven; Tzall, David
2014-01-01
Several methodological reviews of alcohol treatment outcome studies and one review of drug studies have been published over the past 40 years. Although past reviews demonstrated methodological improvements in alcohol studies, they also found continued deficiencies. The current review allows for an updated evaluation of the methodological rigor of alcohol and drug studies and, by utilizing inclusion criteria similar to previous reviews, it allows for a comparative review over time. In addition, this is the first review that compares the methodology of alcohol and drug treatment outcome studies published during the same time period. The methodology for 25 alcohol and 11 drug treatment outcome studies published from 2005 through 2010 that met the review's inclusion criteria was evaluated. The majority of variables evaluated were used in prior reviews. The current review found that more alcohol and drug treatment outcome studies are now using continuous substance use measures and assessing problem severity. Although there have been methodological improvements over time, the current reviews differed little from their most recent past counterpart. Despite this finding, some areas, particularly the continued low reporting of demographic data, needs strengthening. Improvement in the methodological rigor of alcohol and drug treatment outcome studies has occurred over time. The current review found few differences between alcohol and drug study methodologies as well as few differences between the current review and the most recent past alcohol and drug reviews. © 2013 Elsevier Ltd. All rights reserved.
Perspectives for Teachers of Latin American Culture.
ERIC Educational Resources Information Center
Seelye, H. Ned, Ed.
Articles treating various aspects of the teaching of Hispanic culture at the secondary and junior high school levels are intended to improve methodology and facilitate the development of teacher-made instructional materials. An overview of the field relating problems and procedures in several areas is developed. Selections cover: (1) bilinguality,…
Manual for the Comparative Politics Laboratory: Conditions for Effective Democracy.
ERIC Educational Resources Information Center
Fogelman, Edwin; Zingale, Nancy
This manual introduces undergraduate students in political science to major types of data and methods for cross-national quantitative analysis. The manual's topic, Conditions for Effective Democracy, was chosen because it incorporates several different kinds of data and illustrates various methodological problems. The data are cross-sectional…
False-Positive Tangible Outcomes of Functional Analyses
ERIC Educational Resources Information Center
Rooker, Griffin W.; Iwata, Brian A.; Harper, Jill M.; Fahmie, Tara A.; Camp, Erin M.
2011-01-01
Functional analysis (FA) methodology is the most precise method for identifying variables that maintain problem behavior. Occasionally, however, results of an FA may be influenced by idiosyncratic sensitivity to aspects of the assessment conditions. For example, data from several studies suggest that inclusion of a tangible condition during an FA…
Emerging Models of the New Paradigm.
ERIC Educational Resources Information Center
Howser, Lee; Schwinn, Carole
Working with the Philadelphia-based Institute of Interactive Management, several teams at Jackson Community College (JCC), in Michigan, set out in 1994 to learn and apply an interactive design methodology to selected college subsystems. Interactive design begins with understanding problems faced by the system as a whole, which in the case of JCC…
Composite Indices of Development and Poverty: An Application to MDGs
ERIC Educational Resources Information Center
De Muro, Pasquale; Mazziotta, Matteo; Pareto, Adriano
2011-01-01
The measurement of development or poverty as multidimensional phenomena is very difficult because there are several theoretical, methodological and empirical problems involved. The literature of composite indicators offers a wide variety of aggregation methods, all with their pros and cons. In this paper, we propose a new, alternative composite…
NASA Astrophysics Data System (ADS)
Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra
2004-08-01
In logistics and industrial production managers must deal with the impact of stochastic events to improve performances and reduce costs. In fact, production and logistics systems are generally designed considering some parameters as deterministically distributed. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, and especially for estimated parameters (i.e. Market Request). The proposed methodology can determine the impact of stochastic events in the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the condition under which chaos makes the system become uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.
Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site
NASA Astrophysics Data System (ADS)
Albarello, D.; Mucciarelli, M.
- A new approach is proposed to the seismic hazard estimate based on documentary data concerning local history of seismic effects. The adopted methodology allows for the use of ``poor'' data, such as the macroseismic ones, within a formally coherent approach that permits overcoming a number of problems connected to the forcing of available information in the frame of ``standard'' methodologies calibrated on the use of instrumental data. The use of the proposed methodology allows full exploitation of all the available information (that for many towns in Italy covers several centuries) making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.
Constructing space difference schemes which satisfy a cell entropy inequality
NASA Technical Reports Server (NTRS)
Merriam, Marshal L.
1989-01-01
A numerical methodology for solving convection problems is presented, using finite difference schemes which satisfy the second law of thermodynamics on a cell-by-cell basis in addition to the usual conservation laws. It is shown that satisfaction of a cell entropy inequality is sufficient, in some cases, to guarantee nonlinear stability. Some details are given for several one-dimensional problems, including the quasi-one-dimensional Euler equations applied to flow in a nozzle.
Accurate quantum chemical calculations
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.
1989-01-01
An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.
Environment, genes, and experience: lessons from behavior genetics.
Barsky, Philipp I
2010-11-01
The article reviews the theoretical analysis of the problems inherent in studying the environment within behavior genetics across several periods in the development of environmental studies in behavior genetics and proposes some possible alternatives to traditional approaches to studying the environment in behavior genetics. The first period (from the end of the 1920s to the end of the 1970s), when the environment was not actually studied, is called pre-environmental; during this time, the basic principles and theoretical models of understanding environmental effects in behavior genetics were developed. The second period is characterized by the development of studies on environmental influences within the traditional behavior genetics paradigm; several approaches to studying the environment emerged in behavior genetics during this period, from the beginning of the 1980s until today. At the present time, the field is undergoing paradigmatic changes, concerned with methodology, theory, and mathematical models of genotype-environment interplay; this might be the beginning of a third period of development of environmental studies in behavior genetics. In another part, the methodological problems related to environmental studies in behavior genetics are discussed. Although the methodology used in differential psychology is applicable for assessment of differences between individuals, it is insufficient to explain the sources of these differences. In addition, we stress that psychoanalytic studies of twins and their experiences, initiated in the 1930s and continued episodically until the 1980s, could bring an interesting methodology and contribute to the explanation of puzzling findings from environmental studies of behavior genetics. Finally, we will conclude with implications from the results of environmental studies in behavior genetics, including methodological issues. Copyright © 2010 Elsevier Ltd. All rights reserved.
Short-Term fo F2 Forecast: Present Day State of Art
NASA Astrophysics Data System (ADS)
Mikhailov, A. V.; Depuev, V. H.; Depueva, A. H.
An analysis of the F2-layer short-term forecast problem has been done. Both objective and methodological problems prevent us from a deliberate F2-layer forecast issuing at present. An empirical approach based on statistical methods may be recommended for practical use. A forecast method based on a new aeronomic index (a proxy) AI has been proposed and tested over selected 64 severe storm events. The method provides an acceptable prediction accuracy both for strongly disturbed and quiet conditions. The problems with the prediction of the F2-layer quiet-time disturbances as well as some other unsolved problems are discussed
Common Methodological Problems in Research on the Addictions.
ERIC Educational Resources Information Center
Nathan, Peter E.; Lansky, David
1978-01-01
Identifies common problems in research on the addictions and offers suggestions for remediating these methodological problems. The addictions considered include alcoholism and drug dependencies. Problems considered are those arising from inadequate, incomplete, or biased reviews of relevant literatures and methodological shortcomings of subject…
Common methodological flaws in economic evaluations.
Drummond, Michael; Sculpher, Mark
2005-07-01
Economic evaluations are increasingly being used by those bodies such as government agencies and managed care groups that make decisions about the reimbursement of health technologies. However, several reviews of economic evaluations point to numerous deficiencies in the methodology of studies or the failure to follow published methodological guidelines. This article, written for healthcare decision-makers and other users of economic evaluations, outlines the common methodological flaws in studies, focussing on those issues that are likely to be most important when deciding on the reimbursement, or guidance for use, of health technologies. The main flaws discussed are: (i) omission of important costs or benefits; (ii) inappropriate selection of alternatives for comparison; (iii) problems in making indirect comparisons; (iv) inadequate representation of the effectiveness data; (v) inappropriate extrapolation beyond the period observed in clinical studies; (vi) excessive use of assumptions rather than data; (vii) inadequate characterization of uncertainty; (viii) problems in aggregation of results; (ix) reporting of average cost-effectiveness ratios; (x) lack of consideration of generalizability issues; and (xi) selective reporting of findings. In each case examples are given from the literature and guidance is offered on how to detect flaws in economic evaluations.
Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.
Yalch, Matthew M
2016-03-01
Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicates the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved).
Danforth, Jeffrey S; Doerfler, Leonard A; Connor, Daniel F
2017-08-01
The goal was to examine whether anxiety modifies the risk for, or severity of, conduct problems in children with ADHD. Assessment included both categorical and dimensional measures of ADHD, anxiety, and conduct problems. Analyses compared conduct problems between children with ADHD features alone versus children with co-occurring ADHD and anxiety features. When assessed by dimensional rating scales, results showed that compared with children with ADHD alone, those children with ADHD co-occurring with anxiety are at risk for more intense conduct problems. When assessment included a Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV) diagnosis via the Schedule for Affective Disorders and Schizophrenia for School Age Children-Epidemiologic Version (K-SADS), results showed that compared with children with ADHD alone, those children with ADHD co-occurring with anxiety neither had more intense conduct problems nor were they more likely to be diagnosed with oppositional defiant disorder or conduct disorder. Different methodological measures of ADHD, anxiety, and conduct problem features influenced the outcome of the analyses.
Functional approximation using artificial neural networks in structural mechanics
NASA Technical Reports Server (NTRS)
Alam, Javed; Berke, Laszlo
1993-01-01
The artificial neural networks (ANN) methodology is an outgrowth of research in artificial intelligence. In this study, the feed-forward network model that was proposed by Rumelhart, Hinton, and Williams was applied to the mapping of functions that are encountered in structural mechanics problems. Several different network configurations were chosen to train the available data for problems in materials characterization and structural analysis of plates and shells. By using the recall process, the accuracy of these trained networks was assessed.
A Different Approach to the Generation of Patient Management Problems from a Knowledge-Based System
Barriga, Rosa Maria
1988-01-01
Several strategies are proposed to approach the generation of Patient Management Problems from a Knowledge Base and avoid inconsistencies in the results. These strategies are based on a different Knowledge Base structure and in the use of case introductions that describe the patient attributes which are not disease-dependent. This methodology has proven effective in a recent pilot test and it is on its way to implementation as part of an educational program at CWRU, School of Medicine.
ERIC Educational Resources Information Center
Ohlsson, Stellan; Cosejo, David G.
2014-01-01
The problem of how people process novel and unexpected information--"deep learning" (Ohlsson in "Deep learning: how the mind overrides experience." Cambridge University Press, New York, 2011)--is central to several fields of research, including creativity, belief revision, and conceptual change. Researchers have not converged…
Mixed Methods Research: What Are the Key Issues to Consider?
ERIC Educational Resources Information Center
Ghosh, Rajashi
2016-01-01
Mixed methods research (MMR) is increasingly becoming a popular methodological approach in several fields due to the promise it holds for comprehensive understanding of complex problems being researched. However, researchers interested in MMR often lack reference to a guide that can explain the key issues pertaining to the paradigm wars…
Training School Personnel on Implementation of Check-in--Check-out Behavioral Interventions
ERIC Educational Resources Information Center
Ruiz, Maria Isolina
2012-01-01
Over the last several years, educational policy has advocated for a preventive approach to keep problem behaviors from escalating and interfering with student achievement. Proactive methodologies such as school-wide positive behavior support (SWPBS) and response to intervention (RTI) have become common practice in school districts across the…
Assessing the Impact of Planned Social Change. Occasional Paper Series, #8.
ERIC Educational Resources Information Center
Campbell, Donald T.
Program impact methodology--usually referred to as evaluation research--is described as it is developing in the United States. Several problems face the field of evaluation research. First, those issues grouped as "meta-scientific" include: (1) the distinction between qualitative and quantitative studies; (2) the separation of implementation and…
Is Acculturation in Hispanic Health Research a Flawed Concept? JSRI Working Paper.
ERIC Educational Resources Information Center
Ponce, Carlos; Comer, Brendon
Some health researchers have used the concept of acculturation to explain health behaviors or illnesses prevalent among Hispanic people. This paper reviews studies in health, educational, and social science research among Hispanics and argues that acculturation studies are seriously limited by several basic conceptual and methodological problems.…
An Overview of Integrated Logistic Support in Medical Material Programs.
1980-12-01
OF MEDICAL INTEGRATED LOGISTIC SUPPORT ----------------- 7 B. PROBLEM DEFINITION AND OBJECTIVE ------------ 9 C. GENERAL APPROACH AND METHODOLOGY...SYSTEM ---------------------- 61 C. GENERAL CONCLUSIONS ------------------------- 63 D. RECOMMENDATIONS ----------------------------- 73 E. CONCLUSION...21 Technological advancement has caused major changes in medicine and dentistry in the last several decades. Inten- sive care units, computerized axial
Brain Oscillations Forever--Neurophysiology in Future Research of Child Psychiatric Problems
ERIC Educational Resources Information Center
Rothenberger, Aribert
2009-01-01
For decades neurophysiology has successfully contributed to research and clinical care in child psychiatry. Recently, methodological progress has led to a revival of interest in brain oscillations (i.e., a band of periodic neuronal frequencies with a wave-duration from milliseconds to several seconds which may code and decode information). These…
Flynn, Samantha; Vereenooghe, Leen; Hastings, Richard P; Adams, Dawn; Cooper, Sally-Ann; Gore, Nick; Hatton, Chris; Hood, Kerry; Jahoda, Andrew; Langdon, Peter E; McNamara, Rachel; Oliver, Chris; Roy, Ashok; Totsika, Vasiliki; Waite, Jane
2017-11-01
Mental health problems affect people with intellectual disabilities (ID) at rates similar to or in excess of the non-ID population. People with severe ID are likely to have persistent mental health problems. In this systematic review (PROSPERO 2015:CRD42015024469), we identify and evaluate the methodological quality of available measures of mental health problems or well-being in individuals with severe or profound ID. Electronic searches of ten databases identified relevant publications. Two reviewers independently reviewed titles and abstracts of retrieved records (n=41,232) and full-text articles (n=573). Data were extracted and the quality of included papers was appraised. Thirty-two papers reporting on 12 measures were included. Nine measures addressed a broad spectrum of mental health problems, and were largely observational. One physiological measure of well-being was included. The Aberrant Behavior Checklist, Diagnostic Assessment for the Severely Handicapped Scale-II and Mood, Interest and Pleasure Questionnaire are reliable measures in this population. However, the psychometric properties of six other measures were only considered within a single study - indicating a lack of research replication. Few mental health measures are available for people with severe or profound ID, particularly lacking are tools measuring well-being. Assessment methods that do not rely on proxy reports should be explored further. Copyright © 2017 Elsevier Ltd. All rights reserved.
Duncan, Niall W; Wiebking, Christine; Muñoz-Torres, Zeidy; Northoff, Georg
2014-01-15
There is an increasing interest in combining different imaging modalities to investigate the relationship between neural and biochemical activity. More specifically, imaging techniques like MRS and PET that allow for biochemical measurement are combined with techniques like fMRI and EEG that measure neural activity in different states. Such combination of neural and biochemical measures raises not only technical issues, such as merging the different data sets, but also several methodological issues. These methodological issues – ranging from hypothesis generation and hypothesis-guided use of technical facilities to target measures and experimental measures – are the focus of this paper. We discuss the various methodological problems and issues raised by the combination of different imaging methodologies in order to investigate neuro-biochemical relationships on a regional level in humans. For example, the choice of transmitter and scan type is discussed, along with approaches to allow the establishment of particular specificities (such as regional or biochemical) to in turn make results fully interpretable. An algorithm that can be used as a form of checklist for designing such multimodal studies is presented. The paper concludes that while several methodological and technical caveats needs to be overcome and addressed, multimodal imaging of the neuro-biochemical relationship provides an important tool to better understand the physiological mechanisms of the human brain.
Duncan, Niall W; Wiebking, Christine; Munoz-Torres, Zeidy; Northoff, Georg
2013-10-25
There is an increasing interest in combining different imaging modalities to investigate the relationship between neural and biochemical activity. More specifically, imaging techniques like MRS and PET that allow for biochemical measurement are combined with techniques like fMRI and EEG that measure neural activity in different states. Such combination of neural and biochemical measures raises not only technical issues, such as merging the different data sets, but also several methodological issues. These methodological issues - ranging from hypothesis generation and hypothesis-guided use of technical facilities to target measures and experimental measures - are the focus of this paper. We discuss the various methodological problems and issues raised by the combination of different imaging methodologies in order to investigate neuro-biochemical relationships on a regional level in humans. For example, the choice of transmitter and scan type is discussed, along with approaches to allow the establishment of particular specificities (such as regional or biochemical) to in turn make results fully interpretable. An algorithm that can be used as a form of checklist for designing such multimodal studies is presented. The paper concludes that while several methodological and technical caveats needs to be overcome and addressed, multimodal imaging of the neuro-biochemical relationship provides an important tool to better understand the physiological mechanisms of the human brain. Copyright © 2013. Published by Elsevier B.V.
Travel into a fairy land: a critique of modern qualitative and mixed methods psychologies.
Toomela, Aaro
2011-03-01
In this article modern qualitative and mixed methods approaches are criticized from the standpoint of structural-systemic epistemology. It is suggested that modern qualitative methodologies suffer from several fallacies: some of them are grounded on inherently contradictory epistemology, the others ask scientific questions after the methods have been chosen, conduct studies inductively so that not only answers but even questions are often supposed to be discovered, do not create artificial situations and constraints on study-situations, are adevelopmental by nature, study not the external things and phenomena but symbols and representations--often the object of studies turns out to be the researcher rather than researched, rely on ambiguous data interpretation methods based to a large degree on feelings and opinions, aim to understand unique which is theoretically impossible, or have theoretical problems with sampling. Any one of these fallacies would be sufficient to exclude any possibility to achieve structural-systemic understanding of the studied things and phenomena. It also turns out that modern qualitative methodologies share several fallacies with the quantitative methodology. Therefore mixed methods approaches are not able to overcome the fundamental difficulties that characterize mixed methods taken separately. It is proposed that structural-systemic methodology that dominated psychological thought in the pre-WWII continental Europe is philosophically and theoretically better grounded than the other methodologies that can be distinguished in psychology today. Future psychology should be based on structural-systemic methodology.
Method for the Direct Solve of the Many-Body Schrödinger Wave Equation
NASA Astrophysics Data System (ADS)
Jerke, Jonathan; Tymczak, C. J.; Poirier, Bill
We report on theoretical and computational developments towards a computationally efficient direct solve of the many-body Schrödinger wave equation for electronic systems. This methodology relies on two recent developments pioneered by the authors: 1) the development of a Cardinal Sine basis for electronic structure calculations; and 2) the development of a highly efficient and compact representation of multidimensional functions using the Canonical tensor rank representation developed by Belykin et. al. which we have adapted to electronic structure problems. We then show several relevant examples of the utility and accuracy of this methodology, scaling with system size, and relevant convergence issues of the methodology. Method for the Direct Solve of the Many-Body Schrödinger Wave Equation.
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
Conflict and the Common Good. Studies in Third World Societies, Publication Number Twenty-Four.
ERIC Educational Resources Information Center
Merrill, Robert S., Ed.; Willner, Dorothy, Ed.
The fundamental theme of these papers is what constitutes the common good and the issues and problems related to the understanding of that common good. Several anthropologists and a political scientist explore this theme in various geographic settings and from many theoretical and methodological perspectives. Among the countries and cultures…
The Use of Educational Platforms as Teaching Resource in Mathematics
ERIC Educational Resources Information Center
Gómez-Zermeño, Marcela; Franco-Gutiérrez, Héctor
2018-01-01
Dropping out from the school system at High School level has been a problem for several years; high levels of mathematics' failing have been a recurring situation. This paper discusses how academic virtual counseling might be a tool to help students in math class. The methodological approach is based in the non-experimental, longitudinal model…
Class Size and Student Performance at a Public Research University: A Cross-Classified Model
ERIC Educational Resources Information Center
Johnson, Iryna Y.
2010-01-01
This study addresses several methodological problems that have confronted prior research on the effect of class size on student achievement. Unlike previous studies, this analysis accounts for the hierarchical data structure of student achievement, where grades are nested within classes and students, and considers a wide range of class sizes…
Neurophenomenology revisited: second-person methods for the study of human consciousness
Olivares, Francisco A.; Vargas, Esteban; Fuentes, Claudio; Martínez-Pernía, David; Canales-Johnson, Andrés
2015-01-01
In the study of consciousness, neurophenomenology was originally established as a novel research program attempting to reconcile two apparently irreconcilable methodologies in psychology: qualitative and quantitative methods. Its potential relies on Francisco Varela’s idea of reciprocal constraints, in which first-person accounts and neurophysiological data mutually inform each other. However, since its first conceptualization, neurophenomenology has encountered methodological problems. These problems have emerged mainly because of the difficulty of obtaining and analyzing subjective reports in a systematic manner. However, more recently, several interview techniques for describing subjective accounts have been developed, collectively known as “second-person methods.” Second-person methods refer to interview techniques that solicit both verbal and non-verbal information from participants in order to obtain systematic and detailed subjective reports. Here, we examine the potential for employing second-person methodologies in the neurophenomenological study of consciousness and we propose three practical ideas for developing a second-person neurophenomenological method. Thus, we first describe second-person methodologies available in the literature for analyzing subjective reports, identifying specific constraints on the status of the first-, second- and third- person methods. Second, we analyze two experimental studies that explicitly incorporate second-person methods for traversing the “gap” between phenomenology and neuroscience. Third, we analyze the challenges that second-person accounts face in establishing an objective methodology for comparing results across different participants and interviewers: this is the “validation” problem. Finally, we synthesize the common aspects of the interview methods described above. In conclusion, our arguments emphasize that second-person methods represent a powerful approach for closing the gap between the experiential and the neurobiological levels of description in the study of human consciousness. PMID:26074839
Jose Cherackal, George; Thomas, Eapen; Prathap, Akhilesh
2013-01-01
For patients whose orthodontic problems are so severe that neither growth modification nor camouflage offers a solution, surgery to realign the jaws or reposition dentoalveolar segments is the only possible treatment. Surgery is not a substitute for orthodontics in these patients. Instead, it must be properly coordinated with orthodontics and other dental treatments to achieve good overall results. Dramatic progress in recent years has made it possible for combined surgical orthodontic treatment to be carried out successfully for patients with a severe dentofacial problem of any type. This case report provides an overview of the current treatment methodology in managing a combination of asymmetrical mandibular prognathism and vertical maxillary excess.
Problem solving using soft systems methodology.
Land, L
This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, James V.; Wellman, Gerald William; Emery, John M.
2011-09-01
Fracture or tearing of ductile metals is a pervasive engineering concern, yet accurate prediction of the critical conditions of fracture remains elusive. Sandia National Laboratories has been developing and implementing several new modeling methodologies to address problems in fracture, including both new physical models and new numerical schemes. The present study provides a double-blind quantitative assessment of several computational capabilities including tearing parameters embedded in a conventional finite element code, localization elements, extended finite elements (XFEM), and peridynamics. For this assessment, each of four teams reported blind predictions for three challenge problems spanning crack initiation and crack propagation. After predictionsmore » had been reported, the predictions were compared to experimentally observed behavior. The metal alloys for these three problems were aluminum alloy 2024-T3 and precipitation hardened stainless steel PH13-8Mo H950. The predictive accuracies of the various methods are demonstrated, and the potential sources of error are discussed.« less
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
Baines, Janis; Cunningham, Judy; Leemhuis, Christel; Hambridge, Tracy; Mackerras, Dorothy
2011-01-01
The approach used by food regulation agencies to examine the literature and forecast the impact of possible food regulations has many similar features to the approach used in nutritional epidemiological research. We outline the Risk Analysis Framework described by FAO/WHO, in which there is formal progression from identification of the nutrient or food chemical of interest, through to describing its effect on health and then assessing whether there is a risk to the population based on dietary exposure estimates. We then discuss some important considerations for the dietary modeling component of the Framework, including several methodological issues that also exist in research nutritional epidemiology. Finally, we give several case studies that illustrate how the different methodological components are used together to inform decisions about how to manage the regulatory problem. PMID:22254081
Recent archaeomagnetic studies in Slovakia: Comparison of methodological approaches
NASA Astrophysics Data System (ADS)
Kubišová, Lenka
2016-03-01
We review the recent archaeomagnetic studies carried out on the territory of Slovakia, focusing on the comparison of methodological approaches, discussing pros and cons of the individual applied methods from the perspective of our experience. The most widely used methods for the determination of intensity and direction of the archaeomegnetic field by demagnetisation of the sample material are the alternating field (AF) demagnetisation and the Thellier double heating method. These methods are used not only for archaeomagnetic studies but also help to solve some geological problems. The two methods were applied to samples collected recently at several sites of Slovakia, where archaeological prospection invoked by earthwork or reconstruction work of developing projects demanded archaeomagnetic dating. Then we discuss advantages and weaknesses of the investigated methods from different perspectives based on several examples and our recent experience.
Artistic image analysis using graph-based learning approaches.
Carneiro, Gustavo
2013-08-01
We introduce a new methodology for the problem of artistic image analysis, which among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing the similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation that is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to a more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.
Problem reporting and tracking system: a systems engineering challenge
NASA Astrophysics Data System (ADS)
Cortez, Vasco; Lopez, Bernhard; Whyborn, Nicholas; Price, Roberto; Hernandez, Octavio; Gairing, Stefan; Barrios, Emilio; Alarcon, Hector
2016-08-01
The problem reporting and tracking system (PRTS) is the ALMA system to register operational problems, track unplanned corrective operational maintenance activities and follow the investigations of all problems or possible issues arisen in operation activities. After the PRTS implementation appeared several issues that finally produced a lack in the management of the investigations, problems to produce KPIs, loss of information, among others. In order to improve PRTS, we carried out a process to review the status of system, define a set of modifications and implement a solution; all according to the stakeholder requirements. In this work, we shall present the methodology applied to define a set of concrete actions at the basis of understanding the complexity of the problem, which finally got to improve the interactions between different subsystems and enhance the communication at different levels.
Evaluating a Web-Based Interface for Internet Telemedicine
NASA Technical Reports Server (NTRS)
Lathan, Corinna E.; Newman, Dava J.; Sebrechts, Marc M.; Doarn, Charles R.
1997-01-01
The objective is to introduce the usability engineering methodology, heuristic evaluation, to the design and development of a web-based telemedicine system. Using a set of usability criteria, or heuristics, one evaluator examined the Spacebridge to Russia web-site for usability problems. Thirty-four usability problems were found in this preliminary study and all were assigned a severity rating. The value of heuristic analysis in the iterative design of a system is shown because the problems can be fixed before deployment of a system and the problems are of a different nature than those found by actual users of the system. It was therefore determined that there is potential value of heuristic evaluation paired with user testing as a strategy for optimal system performance design.
What we know and don't know about mental health problems among immigrants in Norway.
Abebe, Dawit Shawel; Lien, Lars; Hjelde, Karin Harsløf
2014-02-01
Mental health problems have been regarded as one of the main public health challenges of immigrants in several countries. Understanding and generating research-based knowledge on immigrant health problems is highly relevant for planning preventive interventions, as well as guiding social and policy actions. This review aims to map the available knowledge on immigrants' mental health status and its associated risk factors in Norway. The reviewed literature about mental health problems among immigrant populations in Norway was found through databases, such as PUBMED, EMBASE, PsychINFO and MEDLINE. About 41 peer-reviewed original articles published since 1990s were included. In the majority of the studies, the immigrant populations, specifically adult immigrants from low and middle income countries, have been found with a higher degree of mental health problems compared to Norwegians and the general population. Increased risk for mental illness is primarily linked to a higher risk for acculturative stress, poor social support, deprived socioeconomic conditions, multiple negative life events, experiences of discrimination and traumatic pre-migration experiences. However, research in this field has been confronted by a number of gaps and methodological challenges. The available knowledge indicates a need for preventive interventions. Correspondingly, it strongly recommends a comprehensive research program that addresses gaps and methodological challenges.
ERIC Educational Resources Information Center
Delta Project on Postsecondary Education Costs, Productivity and Accountability, 2009
2009-01-01
Most fiscal reporting focuses on revenues (whether or not they go to core purposes), tuition and fees, and financial aid. "How" the money is spent is something that remains shrouded in too much mystery. Several national efforts to address this problem have largely come to naught--probably because those common methodologies are simultaneously not…
Problem Solving Approaches in Mathematics Education as a Product of Japanese Lesson Study
ERIC Educational Resources Information Center
Isoda, Masami
2011-01-01
What is the product of Lesson Study? Lesson Study is a scientific activity for teachers based on the methodology introduced in the 1880s. In Japan, research topics for Lesson Study are usually shared through the regular revisions of the curriculum and the research movement of several societies. As a result of teachers' efforts to overcome…
Violanti, S; Fraschetta, M; Adda, S; Caputo, E
2009-12-01
Within the framework of Environmental Agencies system's activities, coordinated by ISPRA (superior institute for environmental protection and research), a comparison among measurements was designed and accomplished, in order to go into depth on the matter of measurement problems and to evaluate magnetic field at power frequencies. These measurements have been taken near medium voltage /low voltage transformer substation. This project was developed with the contribution of several experts who belong to different Regional Agencies. In three of these regions, substations having specific international standard characteristics were chosen; then a measurement and data analysis protocol was arranged. Data analysis showed a good level of coherence among results obtained by different laboratories. However, a range of problems emerged, either during the protocol predisposition and definition of the data analysis procedure or during the execution of measures and data reprocessing, because of the spatial and temporal variability of magnetic field. These problems represent elements of particular interest in determining a correct measurement methodology, whose purpose is the comparison with limits of exposure, attention values and quality targets.
A mini-review of TAT-MyoD fused proteins: state of the art and problems to solve.
Patruno, Marco; Melotti, Luca; Gomiero, Chiara; Sacchetto, Roberta; Topel, Ohad; Martinello, Tiziana
2017-12-05
The transcriptional activator TAT is a small peptide essential for viral replication and possesses the property of entering the cells from the extracellular milieu, acting as a membrane shuttle. In order to safely differentiate cells an innovative methodology, based on the fusion of transcription factors and the TAT sequence, is discussed in this short review. In several studies, it has been demonstrated that TAT protein can be observed in the cell nucleus after few hours from the inoculation although its way of action is not fully understood. However, further studies will be necessary to develop this methodology for clinical purposes.
Statistics and Informatics in Space Astrophysics
NASA Astrophysics Data System (ADS)
Feigelson, E.
2017-12-01
The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, discovery of exoplanets, and classifying multiwavelength surveys is too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has significantly grown in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet the research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.
Jose Cherackal, George; Thomas, Eapen; Prathap, Akhilesh
2013-01-01
For patients whose orthodontic problems are so severe that neither growth modification nor camouflage offers a solution, surgery to realign the jaws or reposition dentoalveolar segments is the only possible treatment. Surgery is not a substitute for orthodontics in these patients. Instead, it must be properly coordinated with orthodontics and other dental treatments to achieve good overall results. Dramatic progress in recent years has made it possible for combined surgical orthodontic treatment to be carried out successfully for patients with a severe dentofacial problem of any type. This case report provides an overview of the current treatment methodology in managing a combination of asymmetrical mandibular prognathism and vertical maxillary excess. PMID:24455321
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems ideally should spend the least possible amount of time in their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain and lack of linearity, as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks, ANN. The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. Method comparison was achieved using different number of calibration points, and several nonlinear levels of the input signal. This paper also shows that the proposed method turned out to have a better overall accuracy than the other two methods. Besides, experimentation results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit, MCU. In order to illustrate the method capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over the classic autocalibration methodologies, because it impacts on the design process of intelligent sensors, autocalibration methodologies and their associated factors, like time and cost.
The mathematical statement for the solving of the problem of N-version software system design
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Kovalev, D. I.; Zelenkov, P. V.; Voroshilova, A. A.
2015-10-01
The N-version programming, as a methodology of the fault-tolerant software systems design, allows successful solving of the mentioned tasks. The use of N-version programming approach turns out to be effective, since the system is constructed out of several parallel executed versions of some software module. Those versions are written to meet the same specification but by different programmers. The problem of developing an optimal structure of N-version software system presents a kind of very complex optimization problem. This causes the use of deterministic optimization methods inappropriate for solving the stated problem. In this view, exploiting heuristic strategies looks more rational. In the field of pseudo-Boolean optimization theory, the so called method of varied probabilities (MVP) has been developed to solve problems with a large dimensionality.
A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.
Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas
2018-02-23
We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results on some canonical dynamical models (stochastic processes with and without correlations, chaotic processes) complemented by extensive numerical simulations for a range of processes which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state of the art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics and a graph-theoretic extraction procedure allows to use these features for statistical learning purposes.
The Davey-Stewartson Equation on the Half-Plane
NASA Astrophysics Data System (ADS)
Fokas, A. S.
2009-08-01
The Davey-Stewartson (DS) equation is a nonlinear integrable evolution equation in two spatial dimensions. It provides a multidimensional generalisation of the celebrated nonlinear Schrödinger (NLS) equation and it appears in several physical situations. The implementation of the Inverse Scattering Transform (IST) to the solution of the initial-value problem of the NLS was presented in 1972, whereas the analogous problem for the DS equation was solved in 1983. These results are based on the formulation and solution of certain classical problems in complex analysis, namely of a Riemann Hilbert problem (RH) and of either a d-bar or a non-local RH problem respectively. A method for solving the mathematically more complicated but physically more relevant case of boundary-value problems for evolution equations in one spatial dimension, like the NLS, was finally presented in 1997, after interjecting several novel ideas to the panoply of the IST methodology. Here, this method is further extended so that it can be applied to evolution equations in two spatial dimensions, like the DS equation. This novel extension involves several new steps, including the formulation of a d-bar problem for a sectionally non-analytic function, i.e. for a function which has different non-analytic representations in different domains of the complex plane. This, in addition to the computation of a d-bar derivative, also requires the computation of the relevant jumps across the different domains. This latter step has certain similarities (but is more complicated) with the corresponding step for those initial-value problems in two dimensions which can be solved via a non-local RH problem, like KPI.
A Selective Review of Group Selection in High-Dimensional Models
Huang, Jian; Breheny, Patrick; Ma, Shuangge
2013-01-01
Grouping structures arise naturally in many statistical modeling problems. Several methods have been proposed for variable selection that respect grouping structure in variables. Examples include the group LASSO and several concave group selection methods. In this article, we give a selective review of group selection concerning methodological developments, theoretical properties and computational algorithms. We pay particular attention to group selection methods involving concave penalties. We address both group selection and bi-level selection methods. We describe several applications of these methods in nonparametric additive models, semiparametric regression, seemingly unrelated regressions, genomic data analysis and genome wide association studies. We also highlight some issues that require further study. PMID:24174707
Molecular engineering of colloidal liquid crystals using DNA origami
NASA Astrophysics Data System (ADS)
Siavashpouri, Mahsa; Wachauf, Christian; Zakhary, Mark; Praetorius, Florian; Dietz, Hendrik; Dogic, Zvonimir
Understanding the microscopic origin of cholesteric phase remains a foundational, yet unresolved problem in the field of liquid crystals. Lack of experimental model system that allows for the systematic control of the microscopic chiral structure makes it difficult to investigate this problem for several years. Here, using DNA origami technology, we systematically vary the chirality of the colloidal particles with molecular precision and establish a quantitative relationship between the microscopic structure of particles and the macroscopic cholesteric pitch. Our study presents a new methodology for predicting bulk behavior of diverse phases based on the microscopic architectures of the constituent molecules.
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
Clinical relevance in anesthesia journals.
Lauritsen, Jakob; Møller, Ann M
2006-04-01
The purpose of this review is to present the latest knowledge and research on the definition and distribution of clinically relevant articles in anesthesia journals. It will also discuss the importance of the chosen methodology and outcome of articles. In the last few years, more attention has been paid to evidence-based medicine in anesthesia. Several articles on the subject have focused on the need to base clinical decisions on sound research employing both methodological rigor and clinically relevant outcomes. The number of systematic reviews in anesthesia literature is increasing as well as the focus on diminishing the number of surrogate outcomes. It has been shown that the impact factor is not a valid measure of establishing the level of clinical relevance to a journal. This review presents definitions of clinically relevant anesthesia articles. A clinically relevant article employs both methodological rigor and a clinically relevant outcome. The terms methodological rigor and clinical outcomes are fully discussed in the review as well as problems with journal impact factors.
Conceptual design and multidisciplinary optimization of in-plane morphing wing structures
NASA Astrophysics Data System (ADS)
Inoyama, Daisaku; Sanders, Brian P.; Joo, James J.
2006-03-01
In this paper, the topology optimization methodology for the synthesis of distributed actuation system with specific applications to the morphing air vehicle is discussed. The main emphasis is placed on the topology optimization problem formulations and the development of computational modeling concepts. For demonstration purposes, the inplane morphing wing model is presented. The analysis model is developed to meet several important criteria: It must allow large rigid-body displacements, as well as variation in planform area, with minimum strain on structural members while retaining acceptable numerical stability for finite element analysis. Preliminary work has indicated that addressed modeling concept meets the criteria and may be suitable for the purpose. Topology optimization is performed on the ground structure based on this modeling concept with design variables that control the system configuration. In other words, states of each element in the model are design variables and they are to be determined through optimization process. In effect, the optimization process assigns morphing members as 'soft' elements, non-morphing load-bearing members as 'stiff' elements, and non-existent members as 'voids.' In addition, the optimization process determines the location and relative force intensities of distributed actuators, which is represented computationally as equal and opposite nodal forces with soft axial stiffness. Several different optimization problem formulations are investigated to understand their potential benefits in solution quality, as well as meaningfulness of formulation itself. Sample in-plane morphing problems are solved to demonstrate the potential capability of the methodology introduced in this paper.
General Methodology for Designing Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.
2012-01-01
A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
Philosophy of clinical psychopharmacology.
Aragona, Massimiliano
2013-03-01
The renewal of the philosophical debate in psychiatry is one exciting news of recent years. However, its use in psychopharmacology may be problematic, ranging from self-confinement into the realm of values (which leaves the evidence-based domain unchallenged) to complete rejection of scientific evidence. In this paper philosophy is conceived as a conceptual audit of clinical psychopharmacology. Its function is to criticise the epistemological and methodological problems of current neopositivist, ingenuously realist and evidence-servant psychiatry from within the scientific stance and with the aim of aiding psychopharmacologists in practicing a more self-aware, critical and possibly useful clinical practice. Three examples are discussed to suggest that psychopharmacological practice needs conceptual clarification. At the diagnostic level it is shown that the crisis of the current diagnostic system and the problem of comorbidity strongly influence psychopharmacological results, new conceptualizations more respondent to the psychopharmacological requirements being needed. Heterogeneity of research samples, lack of specificity of psychotropic drugs, difficult generalizability of results, need of a phenomenological study of drug-induced psychopathological changes are discussed herein. At the methodological level the merits and limits of evidence-based practice are considered, arguing that clinicians should know the best available evidence but that guidelines should not be constrictive (due to several methodological biases and rhetorical tricks of which the clinician should be aware, sometimes respondent to extra-scientific, economical requests). At the epistemological level it is shown that the clinical stance is shaped by implicit philosophical beliefs about the mind/body problem (reductionism, dualism, interactionism, pragmatism), and that philosophy can aid physicians to be more aware of their beliefs in order to choose the most useful view and to practice coherently. In conclusion, psychopharmacologists already use methodological audit (e.g. statistical audit); similarly, conceptual clarification is needed in both research planning/evaluation and everyday psychopharmacological practice.
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. All human-error accidents are not caused by faulty decision-problem resolutions, but it appears to be one of the major areas of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of accident into a scenario which may used for review or the development of a training simulation.
Topology synthesis and size optimization of morphing wing structures
NASA Astrophysics Data System (ADS)
Inoyama, Daisaku
This research demonstrates a novel topology and size optimization methodology for synthesis of distributed actuation systems with specific applications to morphing air vehicle structures. The main emphasis is placed on the topology and size optimization problem formulations and the development of computational modeling concepts. The analysis model is developed to meet several important criteria: It must allow a rigid-body displacement, as well as a variation in planform area, with minimum strain on structural members while retaining acceptable numerical stability for finite element analysis. Topology optimization is performed on a semi-ground structure with design variables that control the system configuration. In effect, the optimization process assigns morphing members as "soft" elements, non-morphing load-bearing members as "stiff' elements, and non-existent members as "voids." The optimization process also determines the optimum actuator placement, where each actuator is represented computationally by equal and opposite nodal forces with soft axial stiffness. In addition, the configuration of attachments that connect the morphing structure to a non-morphing structure is determined simultaneously. Several different optimization problem formulations are investigated to understand their potential benefits in solution quality, as well as meaningfulness of the formulations. Extensions and enhancements to the initial concept and problem formulations are made to accommodate multiple-configuration definitions. In addition, the principal issues on the external-load dependency and the reversibility of a design, as well as the appropriate selection of a reference configuration, are addressed in the research. The methodology to control actuator distributions and concentrations is also discussed. Finally, the strategy to transfer the topology solution to the sizing optimization is developed and cross-sectional areas of existent structural members are optimized under applied aerodynamic loads. That is, the optimization process is implemented in sequential order: The actuation system layout is first determined through multi-disciplinary topology optimization process, and then the thickness or cross-sectional area of each existent member is optimized under given constraints and boundary conditions. Sample problems are solved to demonstrate the potential capabilities of the presented methodology. The research demonstrates an innovative structural design procedure from a computational perspective and opens new insights into the potential design requirements and characteristics of morphing structures.
Mining data from hemodynamic simulations for generating prediction and explanation models.
Bosnić, Zoran; Vračar, Petar; Radović, Milos D; Devedžić, Goran; Filipović, Nenad D; Kononenko, Igor
2012-03-01
One of the most common causes of human death is stroke, which can be caused by carotid bifurcation stenosis. In our work, we aim at proposing a prototype of a medical expert system that could significantly aid medical experts to detect hemodynamic abnormalities (increased artery wall shear stress). Based on the acquired simulated data, we apply several methodologies for1) predicting magnitudes and locations of maximum wall shear stress in the artery, 2) estimating reliability of computed predictions, and 3) providing user-friendly explanation of the model's decision. The obtained results indicate that the evaluated methodologies can provide a useful tool for the given problem domain. © 2012 IEEE
Effects of Caffeine and Warrior Stress on Behavioral : An Animal Model
2016-03-14
contributes invaluably to ethical and humane research. A special thank you to Erin Barry for providing statistical expertise and methodological support...of behavioral health in rats. Several ethical and logistical issues prevent the use of humans in true controlled experiments that manipulate stress...play in the development or maintenance of behavioral problems. There are ethical issues associated with exposing humans to high caffeine doses and
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Program Evaluation and Methodology Div.
In response to a request by the United States Senate Committee on Labor and Human Resources, the General Accounting Office (GAO) examined the methodological soundness of current population estimates of the number of homeless chronically mentally ill persons, and proposed several options for estimating the size of this population. The GAO reviewed…
Choi, Ickwon; Kattan, Michael W; Wells, Brian J; Yu, Changhong
2012-01-01
In medical society, the prognostic models, which use clinicopathologic features and predict prognosis after a certain treatment, have been externally validated and used in practice. In recent years, most research has focused on high dimensional genomic data and small sample sizes. Since clinically similar but molecularly heterogeneous tumors may produce different clinical outcomes, the combination of clinical and genomic information, which may be complementary, is crucial to improve the quality of prognostic predictions. However, there is a lack of an integrating scheme for clinic-genomic models due to the P ≥ N problem, in particular, for a parsimonious model. We propose a methodology to build a reduced yet accurate integrative model using a hybrid approach based on the Cox regression model, which uses several dimension reduction techniques, L₂ penalized maximum likelihood estimation (PMLE), and resampling methods to tackle the problem. The predictive accuracy of the modeling approach is assessed by several metrics via an independent and thorough scheme to compare competing methods. In breast cancer data studies on a metastasis and death event, we show that the proposed methodology can improve prediction accuracy and build a final model with a hybrid signature that is parsimonious when integrating both types of variables.
Applications of fuzzy theories to multi-objective system optimization
NASA Technical Reports Server (NTRS)
Rao, S. S.; Dhingra, A. K.
1991-01-01
Most of the computer aided design techniques developed so far deal with the optimization of a single objective function over the feasible design space. However, there often exist several engineering design problems which require a simultaneous consideration of several objective functions. This work presents several techniques of multiobjective optimization. In addition, a new formulation, based on fuzzy theories, is also introduced for the solution of multiobjective system optimization problems. The fuzzy formulation is useful in dealing with systems which are described imprecisely using fuzzy terms such as, 'sufficiently large', 'very strong', or 'satisfactory'. The proposed theory translates the imprecise linguistic statements and multiple objectives into equivalent crisp mathematical statements using fuzzy logic. The effectiveness of all the methodologies and theories presented is illustrated by formulating and solving two different engineering design problems. The first one involves the flight trajectory optimization and the main rotor design of helicopters. The second one is concerned with the integrated kinematic-dynamic synthesis of planar mechanisms. The use and effectiveness of nonlinear membership functions in fuzzy formulation is also demonstrated. The numerical results indicate that the fuzzy formulation could yield results which are qualitatively different from those provided by the crisp formulation. It is felt that the fuzzy formulation will handle real life design problems on a more rational basis.
Analysis of individual risk belief structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonn, B.E.; Travis, C.B.; Arrowood, L.
An interactive computer program developed at Oak Ridge National Laboratory is presented as a methodology to model individualized belief structures. The logic and general strategy of the model is presented for two risk topics: AIDs and toxic waste. Subjects identified desirable and undesirable consequences for each topic and formulated an associative rule linking topic and consequence in either a causal or correlational framework. Likelihood estimates, generated by subjects in several formats (probability, odds statements, etc.), constituted one outcome measure. Additionally, source of belief (personal experience, news media, etc.) and perceived personal and societal impact are reviewed. Briefly, subjects believe thatmore » AIDs causes significant emotional problems, and to a lesser degree, physical health problems whereas toxic waste causes significant environmental problems.« less
Monitoring the Impact of Solution Concepts within a Given Problematic
NASA Astrophysics Data System (ADS)
Cavallucci, Denis; Rousselot, François; Zanni, Cecilia
It is acknowledged that one of the most critical issues facing today’s organizations concerns the substantial leaps required to methodologically structure innovation. Among other published work, some suggest that a complete rethinking of current practices is required. In this article, we propose a methodology aiming at providing controlled R&D choices based on a monitoring of the impact Solution Concepts provoke on a problematic situation. Initially this problematic situation is modeled in a graph form, namely a Problem Graph. It has the objective to assists R&D managers when choosing which activities to support and bring them concrete arguments to defend their choices. We postulate that by improving the robustness of such approaches we help deciders to switch from intuitive decisions (mostly built upon their past experiences, fear regarding risks, and awareness of the company’s level of acceptance of novelties) to thoroughly constructed inventive problem solving strategies. Our approach will be discussed using a computer application that illustrates our hypothesis after being tested in several industrial applications.
Ergonomic initiatives at Inmetro: measuring occupational health and safety.
Drucker, L; Amaral, M; Carvalheira, C
2012-01-01
This work studies biomechanical hazards to which the workforce of Instituto Nacional de Metrologia, Qualidade e Tecnologia Industrial (Inmetro) is exposed. It suggests a model for ergonomic evaluation of work, based on the concepts of resilience engineering which take into consideration the institute's ability to manage risk and deal with its consequences. Methodology includes the stages of identification, inventory, analysis, and risk management. Diagnosis of the workplace uses as parameters the minimal criteria stated in Brazilian legislation. The approach has several prospectives and encompasses the points of view of public management, safety engineering, physical therapy and ergonomics-oriented design. The suggested solution integrates all aspects of the problem: biological, psychological, sociological and organizational. Results obtained from a pilot Project allow to build a significant sample of Inmetro's workforce, identifying problems and validating the methodology employed as a tool to be applied to the whole institution. Finally, this work intends to draw risk maps and support goals and methods based on resiliency engineering to assess environmental and ergonomic risk management.
A methodology model for quality management in a general hospital.
Stern, Z; Naveh, E
1997-01-01
A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.
Bezerra, Rui M F; Fraga, Irene; Dias, Albino A
2013-01-01
Enzyme kinetic parameters are usually determined from initial rates nevertheless, laboratory instruments only measure substrate or product concentration versus reaction time (progress curves). To overcome this problem we present a methodology which uses integrated models based on Michaelis-Menten equation. The most severe practical limitation of progress curve analysis occurs when the enzyme shows a loss of activity under the chosen assay conditions. To avoid this problem it is possible to work with the same experimental points utilized for initial rates determination. This methodology is illustrated by the use of integrated kinetic equations with the well-known reaction catalyzed by alkaline phosphatase enzyme. In this work nonlinear regression was performed with the Solver supplement (Microsoft Office Excel). It is easy to work with and track graphically the convergence of SSE (sum of square errors). The diagnosis of enzyme inhibition was performed according to Akaike information criterion. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Gauthier, Benoit; And Others
1997-01-01
Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)
Towards the unification of inference structures in medical diagnostic tasks.
Mira, J; Rives, J; Delgado, A E; Martínez, R
1998-01-01
The central purpose of artificial intelligence applied to medicine is to develop models for diagnosis and therapy planning at the knowledge level, in the Newell sense, and software environments to facilitate the reduction of these models to the symbol level. The usual methodology (KADS, Common-KADS, GAMES, HELIOS, Protégé, etc) has been to develop libraries of generic tasks and reusable problem-solving methods with explicit ontologies. The principal problem which clinicians have with these methodological developments concerns the diversity and complexity of new terms whose meaning is not sufficiently clear, precise, unambiguous and consensual for them to be accessible in the daily clinical environment. As a contribution to the solution of this problem, we develop in this article the conjecture that one inference structure is enough to describe the set of analysis tasks associated with medical diagnoses. To this end, we first propose a modification of the systematic diagnostic inference scheme to obtain an analysis generic task and then compare it with the monitoring and the heuristic classification task inference schemes using as comparison criteria the compatibility of domain roles (data structures), the similarity in the inferences, and the commonality in the set of assumptions which underlie the functionally equivalent models. The equivalences proposed are illustrated with several examples. Note that though our ongoing work aims to simplify the methodology and to increase the precision of the terms used, the proposal presented here should be viewed more in the nature of a conjecture.
C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component
NASA Astrophysics Data System (ADS)
Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.
2018-06-01
The failure of tractor components and their replacement has now become very common in India because of re-cycling, re-sale, and duplication. To over come the problem of failure we propose a design methodology for topological change co-simulating with software's. In the proposed Design methodology, the designer checks Paxial, Pcr, Pfailue, τ by hand calculations, from which refined topological changes of R.S.Arm are formed. We explained several techniques employed in the component for reduction, removal of rib material to change center of gravity and centroid point by using system C for mixed level simulation and faster topological changes. The design process in system C can be compiled and executed with software, TURBO C7. The modified component is developed in proE and analyzed in ANSYS. The topologically changed component with slot 120 × 4.75 × 32.5 mm at the center showed greater effectiveness than the original component.
Formal verification of human-automation interaction
NASA Technical Reports Server (NTRS)
Degani, Asaf; Heymann, Michael
2002-01-01
This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component
NASA Astrophysics Data System (ADS)
Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.
2018-02-01
The failure of tractor components and their replacement has now become very common in India because of re-cycling, re-sale, and duplication. To over come the problem of failure we propose a design methodology for topological change co-simulating with software's. In the proposed Design methodology, the designer checks Paxial, Pcr, Pfailue, τ by hand calculations, from which refined topological changes of R.S.Arm are formed. We explained several techniques employed in the component for reduction, removal of rib material to change center of gravity and centroid point by using system C for mixed level simulation and faster topological changes. The design process in system C can be compiled and executed with software, TURBO C7. The modified component is developed in proE and analyzed in ANSYS. The topologically changed component with slot 120 × 4.75 × 32.5 mm at the center showed greater effectiveness than the original component.
Multichannel myopic deconvolution in underwater acoustic channels via low-rank recovery
Tian, Ning; Byun, Sung-Hoon; Sabra, Karim; Romberg, Justin
2017-01-01
This paper presents a technique for solving the multichannel blind deconvolution problem. The authors observe the convolution of a single (unknown) source with K different (unknown) channel responses; from these channel outputs, the authors want to estimate both the source and the channel responses. The authors show how this classical signal processing problem can be viewed as solving a system of bilinear equations, and in turn can be recast as recovering a rank-1 matrix from a set of linear observations. Results of prior studies in the area of low-rank matrix recovery have identified effective convex relaxations for problems of this type and efficient, scalable heuristic solvers that enable these techniques to work with thousands of unknown variables. The authors show how a priori information about the channels can be used to build a linear model for the channels, which in turn makes solving these systems of equations well-posed. This study demonstrates the robustness of this methodology to measurement noises and parametrization errors of the channel impulse responses with several stylized and shallow water acoustic channel simulations. The performance of this methodology is also verified experimentally using shipping noise recorded on short bottom-mounted vertical line arrays. PMID:28599565
White, Andrew A; Wright, Seth W; Blanco, Roberto; Lemonds, Brent; Sisco, Janice; Bledsoe, Sandy; Irwin, Cindy; Isenhour, Jennifer; Pichert, James W
2004-10-01
Identifying the etiologies of adverse outcomes is an important first step in improving patient safety and reducing malpractice risks. However, relatively little is known about the causes of emergency department-related adverse outcomes. The objective was to describe a method for identification of common causes of adverse outcomes in an emergency department. This methodology potentially can suggest ways to improve care and might provide a model for identification of factors associated with adverse outcomes. This was a retrospective analysis of 74 consecutive files opened by a malpractice insurer between 1995 and 2000. Each risk-management file was analyzed to identify potential causes of adverse outcomes. The main outcomes were rater-assigned codes for alleged problems with care (e.g., failures of communication or problems related to diagnosis). About 50% of cases were related to injuries or abdominal complaints. A contributing cause was found in 92% of cases, and most had more than one contributing cause. The most frequent contributing categories included failure to diagnose (45%), supervision problems (31%), communication problems (30%), patient behavior (24%), administrative problems (20%), and documentation (20%). Specific relating factors within these categories, such as lack of timely resident supervision and failure to follow policies and procedures, were identified. This project documented that an aggregate analysis of risk-management files has the potential to identify shared causes related to real or perceived adverse outcomes. Several potentially correctable systems problems were identified using this methodology. These simple, descriptive management tools may be useful in identifying issues for problem solving and can be easily learned by physicians and managers.
Soft Computing Methods for Disulfide Connectivity Prediction.
Márquez-Chamorro, Alfonso E; Aguilar-Ruiz, Jesús S
2015-01-01
The problem of protein structure prediction (PSP) is one of the main challenges in structural bioinformatics. To tackle this problem, PSP can be divided into several subproblems. One of these subproblems is the prediction of disulfide bonds. The disulfide connectivity prediction problem consists in identifying which nonadjacent cysteines would be cross-linked from all possible candidates. Determining the disulfide bond connectivity between the cysteines of a protein is desirable as a previous step of the 3D PSP, as the protein conformational search space is highly reduced. The most representative soft computing approaches for the disulfide bonds connectivity prediction problem of the last decade are summarized in this paper. Certain aspects, such as the different methodologies based on soft computing approaches (artificial neural network or support vector machine) or features of the algorithms, are used for the classification of these methods.
Davis, Barbara J; Schmidt, Jonathan; Bowman, Lynn G; Boelter, Eric W
2012-01-01
Current research provides few suggestions for modifications to functional analysis procedures to accommodate low rate, high intensity problem behavior. This study examined the results of the extended duration functional analysis procedures of Kahng, Abt, and Schonbachler (2001) with six children admitted to an inpatient hospital for the treatment of severe problem behavior. Results of initial functional analyses (Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994) were inconclusive for all children because of low levels of responding. The altered functional analyses, which changed multiple variables including the duration of the functional analysis (i.e., 6 or 7 hrs), yielded clear behavioral functions for all six participants. These results add additional support for the utility of an altered analysis of low rate, high intensity problem behavior when standard functional analyses do not yield differentiated results. PMID:23326628
Davis, Barbara J; Kahng, Sungwoo; Schmidt, Jonathan; Bowman, Lynn G; Boelter, Eric W
2012-01-01
Current research provides few suggestions for modifications to functional analysis procedures to accommodate low rate, high intensity problem behavior. This study examined the results of the extended duration functional analysis procedures of Kahng, Abt, and Schonbachler (2001) with six children admitted to an inpatient hospital for the treatment of severe problem behavior. Results of initial functional analyses (Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994) were inconclusive for all children because of low levels of responding. The altered functional analyses, which changed multiple variables including the duration of the functional analysis (i.e., 6 or 7 hrs), yielded clear behavioral functions for all six participants. These results add additional support for the utility of an altered analysis of low rate, high intensity problem behavior when standard functional analyses do not yield differentiated results.
Anderson, Malcolm I; Parmenter, Trevor R; Mok, Magdalena
2002-09-01
This study used a modern theory of stress as a framework to strengthen the understanding of the relationship between neurobehavioural problems of TBI, family functioning and psychological distress in spouse/caregivers. The research was an ex post facto design utilising a cross-sectional methodology. Path analysis was used to determine the structural effect of neurobehavioural problems on family functioning and psychological distress. Forty-seven female and 17 male spouse/caregivers of partners with severe TBI were recruited. Spouse/caregivers who reported partners with TBI as having high levels of behavioural and cognitive problems experienced high levels of unhealthy family functioning. High levels of unhealthy family functioning were related to high levels of distress in spouse/caregivers, as family functioning had a moderate influence on psychological distress. Furthermore, indirect effects of behavioural and cognitive problems operating through family functioning intensified the level of psychological distress experienced by spouse/caregivers. Additionally, spouse/caregivers who reported high levels of behavioural, communication and social problems in their partners also experienced high levels of psychological distress. This study was significant because the impact of TBI on the spouse/caregiver from a multidimensional perspective is an important and under-researched area in the brain injury and disability field.
An adaptive response surface method for crashworthiness optimization
NASA Astrophysics Data System (ADS)
Shi, Lei; Yang, Ren-Jye; Zhu, Ping
2013-11-01
Response surface-based design optimization has been commonly used for optimizing large-scale design problems in the automotive industry. However, most response surface models are built by a limited number of design points without considering data uncertainty. In addition, the selection of a response surface in the literature is often arbitrary. This article uses a Bayesian metric to systematically select the best available response surface among several candidates in a library while considering data uncertainty. An adaptive, efficient response surface strategy, which minimizes the number of computationally intensive simulations, was developed for design optimization of large-scale complex problems. This methodology was demonstrated by a crashworthiness optimization example.
Hypovitaminosis A: international programmatic issues.
Underwood, B A
1994-08-01
The virtual elimination of vitamin A deficiency and all its consequences is high on the political agenda as a solvable public health problem by the end of the decade. Five to six times more children in the developing world are likely to be subclinically than clinically deficient. Subclinical deficiency can be detected by newer methodological approaches. Subclinically deficient children are at increased risk of severe and fatal infections. The problem at a population level is avoidable by the appropriate selection and application of a mix of available interventions. Countries are challenged to assess, analyze and take actions to incorporate nutrition concerns into development planning to attain end-of-decade goals.
Eco-analytical Methodology in Environmental Problems Monitoring
NASA Astrophysics Data System (ADS)
Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.
2017-01-01
Among the problems common to all mankind, which solutions influence the prospects of civilization, the problem of ecological situation monitoring takes very important place. Solution of this problem requires specific methodology based on eco-analytical comprehension of global issues. Eco-analytical methodology should help searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that Governments, corporations, scientists and nations focus on the production and consumption of material goods cause great damage to environment. As a result, the activity of environmentalists is developing quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by the environmental problems for the science is the formation of geo-analytical reasoning and the monitoring of global problems common for the whole humanity. So it is expected to find the optimal trajectory of industrial development to prevent irreversible problems in the biosphere that could stop progress of civilization.
A quality evaluation methodology of health web-pages for non-professionals.
Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro
2004-06-01
The proposal of an evaluation methodology for determining the quality of healthcare web sites for the dissemination of medical information to non-professionals. Three (macro) factors are considered for the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature the problem of whether or not to introduce a weighting function has been investigated. This methodology has been validated on a specialized information content, i.e., sore throats, due to the large interest such a topic enjoys with target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions, derived from the literature, allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score for medical web sites oriented to non-professionals.
New methodology for fast prediction of wheel wear evolution
NASA Astrophysics Data System (ADS)
Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.
2017-07-01
In railway applications wear prediction in the wheel-rail interface is a fundamental matter in order to study problems such as wheel lifespan and the evolution of vehicle dynamic characteristic with time. However, one of the principal drawbacks of the existing methodologies for calculating the wear evolution is the computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. This methodology is based on two main steps: the first one is the substitution of the calculations over the whole network by the calculation of the contact conditions in certain characteristic point from whose result the wheel wear evolution can be inferred. The second one is the substitution of the dynamic calculation (time integration calculations) by the quasi-static calculation (the solution of the quasi-static situation of a vehicle at a certain point which is the same that neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction of computational cost to be obtained while maintaining an acceptable level of accuracy (error order of 5-10%). Several case studies are analysed along the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow concluding that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
Osborn, A J; Mathias, J L; Fairweather-Schmidt, A K
2014-11-01
Depression is one of the most frequently reported psychological problems following TBI, however prevalence estimates vary widely. Methodological and sampling differences may explain some of this variability, but it is not known to what extent. Data from 99 studies examining the prevalence of clinically diagnosed depression (MDD/dysthymia) and self-reports of depression (clinically significant cases or depression scale scores) following adult, non-penetrating TBI were analysed, taking into consideration diagnostic criteria, measure, post-injury interval, and injury severity. Overall, 27% of people were diagnosed with MDD/dysthymia following TBI and 38% reported clinically significant levels of depression when assessed with self-report scales. Estimates of MDD/dysthymia varied according to diagnostic criteria (ICD-10: 14%; DSM-IV: 25%; DSM-III: 47%) and injury severity (mild: 16%; severe: 30%). When self-report measures were used, the prevalence of clinically significant cases of depression differed between scales (HADS: 32%; CES-D: 48%) method of administration (phone: 26%; mail 46%), post-injury interval (range: 33-42%), and injury severity (mild: 64%; severe: 39%). Depression is very common after TBI and has the potential to impact on recovery and quality of life. However, the diagnostic criteria, measure, time post-injury and injury severity, all impact on prevalence rates and must therefore be considered for benchmarking purposes. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
Meta-Analyses and Orthodontic Evidence-Based Clinical Practice in the 21st Century
Papadopoulos, Moschos A.
2010-01-01
Introduction: Aim of this systematic review was to assess the orthodontic related issues which currently provide the best evidence as documented by meta-analyses, by critically evaluating and discussing the methodology used in these studies. Material and Methods: Several electronic databases were searched and handsearching was also performed in order to identify the corresponding meta-analyses investigating orthodontic related subjects. In total, 197 studies were retrieved initially. After applying specific inclusion and exclusion criteria, 27 articles were identified as meta-analyses treating orthodontic-related subjects. Results: Many of these 27 papers presented sufficient quality and followed appropriate meta-analytic approaches to quantitatively synthesize data and presented adequately supported evidence. However, the methodology used in some of them presented weaknesses, limitations or deficiencies. Consequently, the topics in orthodontics which currently provide the best evidence, include some issues related to Class II or Class III treatment, treatment of transverse problems, external apical root resorption, dental anomalies, such as congenital missing teeth and tooth transposition, frequency of severe occlusal problems, nickel hypersensitivity, obstructive sleep apnea syndrome, and computer-assisted learning in orthodontic education. Conclusions: Only a few orthodontic related issues have been so far investigated by means of MAs. In addition, for some of these issues investigated in the corresponding MAs no definite conclusions could be drawn, due to significant methodological deficiencies of these studies. According to this investigation, it can be concluded that at the begin of the 21st century there is evidence for only a few orthodontic related issues as documented by meta-analyses, and more well-conducted high quality research studies are needed to produce strong evidence in order to support evidence-based clinical practice in orthodontics. PMID:21673839
Barriers to self-care in women of reproductive age with HIV/AIDS in Iran: a qualitative study.
Oskouie, Fatemeh; Kashefi, Farzaneh; Rafii, Forough; Gouya, Mohammad Mehdi
2017-01-01
Although increasing attention is paid to HIV/AIDS, patients with HIV still experience several barriers to self-care. These barriers have been previously identified in small quantitative studies on women with HIV, but qualitative studies are required to clarify barriers to self-care. We conducted our study using the grounded theory methodological approach. A total of 28 women with HIV and their family members, were interviewed. The data were analyzed with the Corbin and Strauss method (1998). The key barriers to self-care in women with HIV/AIDS included social stigma, addiction, psychological problems, medication side-effects and financial problems. Women with HIV/AIDS face several barriers to self-care. Therefore, when designing self-care models for these women, social and financial barriers should be identified. Mental health treatment should also be incorporated into such models and patients' access to health care services should be facilitated.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial...information. In the course of this research, we developed and studied efficient simulation-based methodologies for dynamic decision making under...uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those
Generalized Predictive and Neural Generalized Predictive Control of Aerospace Systems
NASA Technical Reports Server (NTRS)
Kelkar, Atul G.
2000-01-01
The research work presented in this thesis addresses the problem of robust control of uncertain linear and nonlinear systems using Neural network-based Generalized Predictive Control (NGPC) methodology. A brief overview of predictive control and its comparison with Linear Quadratic (LQ) control is given to emphasize advantages and drawbacks of predictive control methods. It is shown that the Generalized Predictive Control (GPC) methodology overcomes the drawbacks associated with traditional LQ control as well as conventional predictive control methods. It is shown that in spite of the model-based nature of GPC it has good robustness properties being special case of receding horizon control. The conditions for choosing tuning parameters for GPC to ensure closed-loop stability are derived. A neural network-based GPC architecture is proposed for the control of linear and nonlinear uncertain systems. A methodology to account for parametric uncertainty in the system is proposed using on-line training capability of multi-layer neural network. Several simulation examples and results from real-time experiments are given to demonstrate the effectiveness of the proposed methodology.
Goede, Patricia A.; Lauman, Jason R.; Cochella, Christopher; Katzman, Gregory L.; Morton, David A.; Albertine, Kurt H.
2004-01-01
Use of digital medical images has become common over the last several years, coincident with the release of inexpensive, mega-pixel quality digital cameras and the transition to digital radiology operation by hospitals. One problem that clinicians, medical educators, and basic scientists encounter when handling images is the difficulty of using business and graphic arts commercial-off-the-shelf (COTS) software in multicontext authoring and interactive teaching environments. The authors investigated and developed software-supported methodologies to help clinicians, medical educators, and basic scientists become more efficient and effective in their digital imaging environments. The software that the authors developed provides the ability to annotate images based on a multispecialty methodology for annotation and visual knowledge representation. This annotation methodology is designed by consensus, with contributions from the authors and physicians, medical educators, and basic scientists in the Departments of Radiology, Neurobiology and Anatomy, Dermatology, and Ophthalmology at the University of Utah. The annotation methodology functions as a foundation for creating, using, reusing, and extending dynamic annotations in a context-appropriate, interactive digital environment. The annotation methodology supports the authoring process as well as output and presentation mechanisms. The annotation methodology is the foundation for a Windows implementation that allows annotated elements to be represented as structured eXtensible Markup Language and stored separate from the image(s). PMID:14527971
A hierarchical modeling methodology for the definition and selection of requirements
NASA Astrophysics Data System (ADS)
Dufresne, Stephane
This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is implemented to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work; (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow-down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated to a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the epistemic uncertainty. The proposed methodology is applied to the design of a hurricane tracker Unmanned Aerial Vehicles to demonstrate the origin and impact of requirements on the concept of operations and systems alternatives. This research demonstrates that the hierarchical modeling methodology provides a traceable flow-down of the requirements from the problem definition to the systems alternatives phases of conceptual design.
Review of auditory subliminal psychodynamic activation experiments.
Fudin, R; Benjamin, C
1991-12-01
Subliminal psychodynamic activation experiments using auditory stimuli have yielded only a modicum of support for the contention that such activation produces predictable behavioral changes. Problems in many auditory subliminal psychodynamic activation experiments indicate that those predictions have not been tested adequately. The auditory mode of presentation, however, has several methodological advantages over the visual one, the method used in the vast majority of subliminal psychodynamic activation experiments. Consequently, it should be considered in subsequent research in this area.
A geometric multigrid preconditioning strategy for DPG system matrices
Roberts, Nathan V.; Chan, Jesse
2017-08-23
Here, the discontinuous Petrov–Galerkin (DPG) methodology of Demkowicz and Gopalakrishnan (2010, 2011) guarantees the optimality of the solution in an energy norm, and provides several features facilitating adaptive schemes. A key question that has not yet been answered in general – though there are some results for Poisson, e.g.– is how best to precondition the DPG system matrix, so that iterative solvers may be used to allow solution of large-scale problems.
JPRS Report, Soviet Union, Sociological Studies, No. 5, September-October 1987
1988-06-10
of the Methodology and History of Sociology of the Institute of Sociological Research, USSR Academy of Sciences, author of the books "Tekhnika i...the problem is perpetuated. It is no secret that the history of several liberal arts institutes of the USSR Academy of Sciences in the last few...most important revolutionary process since Great October, is that it, for the first time in our country’s history , is not only creating the
Methodological issues in the study of violence against women
Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen
2007-01-01
The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113
New approaches to some methodological problems of meteor science
NASA Technical Reports Server (NTRS)
Meisel, David D.
1987-01-01
Several low cost approaches to continuous radioscatter monitoring of the incoming meteor flux are described. Preliminary experiments were attempted using standard time frequency stations WWVH and CHU (on frequencies near 15 MHz) during nighttime hours. Around-the-clock monitoring using the international standard aeronautical beacon frequency of 75 MHz was also attempted. The techniques are simple and can be managed routinely by amateur astronomers with relatively little technical expertise. Time series analysis can now be performed using relatively inexpensive microcomputers. Several algorithmic approaches to the analysis of meteor rates are discussed. Methods of obtaining optimal filter predictions of future meteor flux are also discussed.
Ultrasensitive low noise voltage amplifier for spectral analysis.
Giusi, G; Crupi, F; Pace, C
2008-08-01
Recently we have proposed several voltage noise measurement methods that allow, at least in principle, the complete elimination of the noise introduced by the measurement amplifier. The most severe drawback of these methods is that they require a multistep measurement procedure. Since environmental conditions may change in the different measurement steps, the final result could be affected by these changes. This problem is solved by the one-step voltage noise measurement methodology based on a novel amplifier topology proposed in this paper. Circuit implementations for the amplifier building blocks based on operational amplifiers are critically discussed. The proposed approach is validated through measurements performed on a prototype circuit.
Dealing With Shallow-Water Flow in the Deepwater Gulf of Mexico
NASA Astrophysics Data System (ADS)
Ostermeier, R.
2006-05-01
Some of the Shell experience in dealing with the shallow-water flow problem in the Deepwater Gulf of Mexico (GOM) will be presented. The nature of the problem, including areal extent and over-pressuring mechanisms, will be discussed. Methods for sand prediction and shallow sediment and flow characterization will be reviewed. These include seismic techniques, the use of geo-technical wells, regional trends, and various MWD methods. Some examples of flow incidents with pertinent drilling issues, including well failures and abandonment, will be described. To address the shallow-water flow problem, Shell created a multi-disciplinary team of specialists in geology, geophysics, petrophysics, drilling, and civil engineering. The team developed several methodologies to deal with various aspects of the problem. These include regional trends and data bases, shallow seismic interpretation and sand prediction, well site and casing point selection, geo-technical well design and data interpretation, logging program design and interpretation, cementing design and fluids formulation, methods for remediation and mitigation of lost circulation, and so on. Shell's extensive Deepwater GOM drilling experience has lead to new understanding of the problem. Examples include delineation of trends in shallow water flow occurrence and severity, trends and departures in PP/FG, rock properties pertaining to seismic identification of sands, and so on. New knowledge has also been acquired through the use of geo-technical wells. One example is the observed rapid onset and growth of over-pressures below the mudline. Total trouble costs due to shallow water flow for all GOM operators almost certainly runs into the several hundred million dollars. Though the problem remains a concern, advances in our knowledge and understanding make it a problem that is manageable and not the "show stopper" once feared.
Training effectiveness assessment: Methodological problems and issues
NASA Technical Reports Server (NTRS)
Cross, Kenneth D.
1992-01-01
The U.S. military uses a large number of simulators to train and sustain the flying skills of helicopter pilots. Despite the enormous resources required to purchase, maintain, and use those simulators, little effort has been expended in assessing their training effectiveness. One reason for this is the lack of an evaluation methodology that yields comprehensive and valid data at a practical cost. Some of these methodological problems and issues that arise in assessing simulator training effectiveness, as well as problems with the classical transfer-of-learning paradigm were discussed.
NASA Astrophysics Data System (ADS)
Huseyin Turan, Hasan; Kasap, Nihat; Savran, Huseyin
2014-03-01
Nowadays, every firm uses telecommunication networks in different amounts and ways in order to complete their daily operations. In this article, we investigate an optimisation problem that a firm faces when acquiring network capacity from a market in which there exist several network providers offering different pricing and quality of service (QoS) schemes. The QoS level guaranteed by network providers and the minimum quality level of service, which is needed for accomplishing the operations are denoted as fuzzy numbers in order to handle the non-deterministic nature of the telecommunication network environment. Interestingly, the mathematical formulation of the aforementioned problem leads to the special case of a well-known two-dimensional bin packing problem, which is famous for its computational complexity. We propose two different heuristic solution procedures that have the capability of solving the resulting nonlinear mixed integer programming model with fuzzy constraints. In conclusion, the efficiency of each algorithm is tested in several test instances to demonstrate the applicability of the methodology.
Electromagnetic Simulation of the Near-Field Distribution around a Wind Farm
Yang, Shang-Te; Ling, Hao
2013-01-01
An efficienmore » t approach to compute the near-field distribution around and within a wind farm under plane wave excitation is proposed. To make the problem computationally tractable, several simplifying assumptions are made based on the geometry problem. By comparing the approximations against full-wave simulations at 500 MHz, it is shown that the assumptions do not introduce significant errors into the resulting near-field distribution. The near fields around a 3 × 3 wind farm are computed using the developed methodology at 150 MHz, 500 MHz, and 3 GHz. Both the multipath interference patterns and the forward shadows are predicted by the proposed method.« less
Dynamic loading and stress life analysis of permanent space station modules
NASA Astrophysics Data System (ADS)
Anisimov, A. V.; Krokhin, I. A.; Likhoded, A. I.; Malinin, A. A.; Panichkin, N. G.; Sidorov, V. V.; Titov, V. A.
2016-11-01
Some methodological approaches to solving several key problems of dynamic loading and structural strength analysis of Permanent Space Station (PSS)modules developed on the basis of the working experience of Soviet and Russian PSS and the International Space station (ISS) are presented. The solutions of the direct and semi-inverse problems of PSS structure dynamics are mathematically stated. Special attention is paid to the use of the results of ground structural strength tests of space station modules and the data on the actual flight actions on the station and its dynamic responses in the orbital operation regime. The procedure of determining the dynamics and operation life parameters of elements of the PSS modules is described.
Ethical and Legal Implications of the Methodological Crisis in Neuroimaging.
Kellmeyer, Philipp
2017-10-01
Currently, many scientific fields such as psychology or biomedicine face a methodological crisis concerning the reproducibility, replicability, and validity of their research. In neuroimaging, similar methodological concerns have taken hold of the field, and researchers are working frantically toward finding solutions for the methodological problems specific to neuroimaging. This article examines some ethical and legal implications of this methodological crisis in neuroimaging. With respect to ethical challenges, the article discusses the impact of flawed methods in neuroimaging research in cognitive and clinical neuroscience, particularly with respect to faulty brain-based models of human cognition, behavior, and personality. Specifically examined is whether such faulty models, when they are applied to neurological or psychiatric diseases, could put patients at risk, and whether this places special obligations on researchers using neuroimaging. In the legal domain, the actual use of neuroimaging as evidence in United States courtrooms is surveyed, followed by an examination of ways that the methodological problems may create challenges for the criminal justice system. Finally, the article reviews and promotes some promising ideas and initiatives from within the neuroimaging community for addressing the methodological problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Nathan V.; Demkowiz, Leszek; Moser, Robert
2015-11-15
The discontinuous Petrov-Galerkin methodology with optimal test functions (DPG) of Demkowicz and Gopalakrishnan [18, 20] guarantees the optimality of the solution in an energy norm, and provides several features facilitating adaptive schemes. Whereas Bubnov-Galerkin methods use identical trial and test spaces, Petrov-Galerkin methods allow these function spaces to differ. In DPG, test functions are computed on the fly and are chosen to realize the supremum in the inf-sup condition; the method is equivalent to a minimum residual method. For well-posed problems with sufficiently regular solutions, DPG can be shown to converge at optimal rates—the inf-sup constants governing the convergence aremore » mesh-independent, and of the same order as those governing the continuous problem [48]. DPG also provides an accurate mechanism for measuring the error, and this can be used to drive adaptive mesh refinements. We employ DPG to solve the steady incompressible Navier-Stokes equations in two dimensions, building on previous work on the Stokes equations, and focusing particularly on the usefulness of the approach for automatic adaptivity starting from a coarse mesh. We apply our approach to a manufactured solution due to Kovasznay as well as the lid-driven cavity flow, backward-facing step, and flow past a cylinder problems.« less
Systematic Review of Empirically Evaluated School-Based Gambling Education Programs.
Keen, Brittany; Blaszczynski, Alex; Anjoul, Fadi
2017-03-01
Adolescent problem gambling prevalence rates are reportedly five times higher than in the adult population. Several school-based gambling education programs have been developed in an attempt to reduce problem gambling among adolescents; however few have been empirically evaluated. The aim of this review was to report the outcome of studies empirically evaluating gambling education programs across international jurisdictions. A systematic review following guidelines outlined in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement searching five academic databases: PubMed, Scopus, Medline, PsycINFO, and ERIC, was conducted. A total of 20 papers and 19 studies were included after screening and exclusion criteria were applied. All studies reported intervention effects on cognitive outcomes such as knowledge, perceptions, and beliefs. Only nine of the studies attempted to measure intervention effects on behavioural outcomes, and only five of those reported significant changes in gambling behaviour. Of these five, methodological inadequacies were commonly found including brief follow-up periods, lack of control comparison in post hoc analyses, and inconsistencies and misclassifications in the measurement of gambling behaviour, including problem gambling. Based on this review, recommendations are offered for the future development and evaluation of school-based gambling education programs relating to both methodological and content design and delivery considerations.
Solving multi-objective optimization problems in conservation with the reference point method
Dujardin, Yann; Chadès, Iadine
2018-01-01
Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies
El-Baz, Ayman; Beache, Garth M.; Gimel'farb, Georgy; Suzuki, Kenji; Okada, Kazunori; Elnakib, Ahmed; Soliman, Ahmed; Abdollahi, Behnoush
2013-01-01
This paper overviews one of the most important, interesting, and challenging problems in oncology, the problem of lung cancer diagnosis. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance and can increase the patient's chance of survival. For this reason, CAD systems for lung cancer have been investigated in a huge number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. This paper overviews the current state-of-the-art techniques that have been developed to implement each of these CAD processing steps. For each technique, various aspects of technical issues, implemented methodologies, training and testing databases, and validation methods, as well as achieved performances, are described. In addition, the paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems. PMID:23431282
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids
Chen, Bo; Chen, Chen; Wang, Jianhui; ...
2017-07-07
Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determinedmore » to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.« less
A standard methodology for the analysis, recording, and control of verbal behavior
Drash, Philip W.; Tudor, Roger M.
1991-01-01
Lack of a standard methodology has been one of the major obstacles preventing advancement of behavior analytic research in verbal behavior. This article presents a standard method for the analysis, recording, and control of verbal behavior that overcomes several major methodological problems that have hindered operant research in verbal behavior. The system divides all verbal behavior into four functional response classes, correct, error, no response, and inappropriate behavior, from which all vocal responses of a subject may be classified and consequated. The effects of contingencies of reinforcement on verbal operants within each category are made immediately visible to the researcher as changes in frequency of response. Incorporating frequency of response within each category as the unit of response allows both rate and probability of verbal response to be utilized as basic dependent variables. This method makes it possible to record and consequate verbal behavior in essentially the same way as any other operant response. It may also facilitate an experimental investigation of Skinner's verbal response categories. PMID:22477629
Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Bo; Chen, Chen; Wang, Jianhui
Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determinedmore » to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.« less
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
Modified teaching approach for an enhanced medical physics graduate education experience
Rutel, IB
2011-01-01
Lecture-based teaching promotes a passive interaction with students. Opportunities to modify this format are available to enhance the overall learning experience for both students and instructors. The description for a discussion-based learning format is presented as it applies to a graduate curriculum with technical (formal mathematical derivation) topics. The presented hybrid method involves several techniques, including problem-based learning, modeling, and online lectures, eliminating didactic lectures. The results from an end-of-course evaluation show that the students appear to prefer the modified format over the more traditional methodology of “lecture only” contact time. These results are motivation for further refinement and continued implementation of the described methodology in the current course and potentially other courses within the department graduate curriculum. PMID:22279505
Outcomes of planetary close encounters - A systematic comparison of methodologies
NASA Technical Reports Server (NTRS)
Greenberg, Richard; Carusi, Andrea; Valsecchi, G. B.
1988-01-01
Several methods for estimating the outcomes of close planetary encounters are compared on the basis of the numerical integration of a range of encounter types. An attempt is made to lay the foundation for the development of predictive rules concerning the encounter outcomes applicable to the refinement of the statistical mechanics that apply to planet-formation and similar problems concerning planetary swarms. Attention is given to Oepik's (1976) formulation of the two-body approximation, whose predicted motion differs from the correct three-body behavior.
GRAMM-X public web server for protein–protein docking
Tovchigrechko, Andrey; Vakser, Ilya A.
2006-01-01
Protein docking software GRAMM-X and its web interface () extend the original GRAMM Fast Fourier Transformation methodology by employing smoothed potentials, refinement stage, and knowledge-based scoring. The web server frees users from complex installation of database-dependent parallel software and maintaining large hardware resources needed for protein docking simulations. Docking problems submitted to GRAMM-X server are processed by a 320 processor Linux cluster. The server was extensively tested by benchmarking, several months of public use, and participation in the CAPRI server track. PMID:16845016
Seals Research at AlliedSignal
NASA Technical Reports Server (NTRS)
Ullah, M. Rifat
1996-01-01
A consortium has been formed to address seal problems in the Aerospace sector of Allied Signal, Inc. The consortium is represented by makers of Propulsion Engines, Auxiliary Power Units, Gas Turbine Starters, etc. The goal is to improve Face Seal reliability, since Face Seals have become reliability drivers in many of our product lines. Several research programs are being implemented simultaneously this year. They include: Face Seal Modeling and Analysis Methodology; Oil Cooling of Seals; Seal Tracking Dynamics; Coking Formation & Prevention; and Seal Reliability Methods.
Black youth suicide: literature review with a focus on prevention.
Baker, F. M.
1990-01-01
The national rates of completed suicide in the black population between 1950 and 1981 are presented, including age-adjusted rates. Specific studies of black suicide attempters and completed suicides by blacks in several cities are discussed. Methodological problems with existing studies and national suicide statistics are presented. Proposed theories of black suicide are reviewed. Based on a summary of the characteristics of black suicide attempters reported by the literature, preventive strategies--primary, secondary, and tertiary--are presented. PMID:2204709
Schroder, P
1997-01-01
The author critiques the way population density is represented in school atlases, focusing on those used in German-speaking countries. After a discussion of the methodological problems underlying such representations, he selects examples from several German atlases to illustrate the transmission of contradictory, misleading, or out-of-date information. He also suggests ways to improve this situation, including better teaching of underlying cartographical issues and the use of a dot system to illustrate population density.
[Methodological problems in the use of information technologies in physical education].
Martirosov, E G; Zaĭtseva, G A
2000-01-01
The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.
Polcin, Douglas L
Communities throughout the U.S. are struggling to find solutions for serious and persistent homelessness. Alcohol and drug problems can be causes and consequences of homelessness, as well as co-occurring problems that complicate efforts to succeed in finding stable housing. Two prominent service models exist, one known as "Housing First" takes a harm reduction approach and the other known as the "linear" model typically supports a goal of abstinence from alcohol and drugs. Despite their popularity, the research supporting these models suffers from methodological problems and inconsistent findings. One purpose of this paper is to describe systematic reviews of the homelessness services literature, which illustrate weaknesses in research designs and inconsistent conclusions about the effectiveness of current models. Problems among some of the seminal studies on homelessness include poorly defined inclusion and exclusion criteria, inadequate measures of alcohol and drug use, unspecified or poorly implemented comparison conditions, and lack of procedures documenting adherence to service models. Several recent papers have suggested broader based approaches for homeless services that integrate alternatives and respond better to consumer needs. Practical considerations for implementing a broader system of services are described and peer managed recovery homes are presented as examples of services that address some of the gaps in current approaches. Three issues are identified that need more attention from researchers: 1) improving upon the methodological limitations in current studies, 2) assessing the impact of broader based, integrated services on outcome, and 3) assessing approaches to the service needs of homeless persons involved in the criminal justice system.
Polcin, Douglas L.
2016-01-01
Abstract Communities throughout the U.S. are struggling to find solutions for serious and persistent homelessness. Alcohol and drug problems can be causes and consequences of homelessness, as well as co-occurring problems that complicate efforts to succeed in finding stable housing. Two prominent service models exist, one known as “Housing First” takes a harm reduction approach and the other known as the “linear” model typically supports a goal of abstinence from alcohol and drugs. Despite their popularity, the research supporting these models suffers from methodological problems and inconsistent findings. One purpose of this paper is to describe systematic reviews of the homelessness services literature, which illustrate weaknesses in research designs and inconsistent conclusions about the effectiveness of current models. Problems among some of the seminal studies on homelessness include poorly defined inclusion and exclusion criteria, inadequate measures of alcohol and drug use, unspecified or poorly implemented comparison conditions, and lack of procedures documenting adherence to service models. Several recent papers have suggested broader based approaches for homeless services that integrate alternatives and respond better to consumer needs. Practical considerations for implementing a broader system of services are described and peer-managed recovery homes are presented as examples of services that address some of the gaps in current approaches. Three issues are identified that need more attention from researchers: (1) improving upon the methodological limitations in current studies, (2) assessing the impact of broader based, integrated services on outcome, and (3) assessing approaches to the service needs of homeless persons involved in the criminal justice system. PMID:27092027
Service lifetime prediction for encapsulated photovoltaic cells/minimodules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czanderna, A.W.; Jorgensen, G.J.
The overall purposes of this paper are to elucidate the crucial importance of predicting the service lifetime (SLP) for photovoltaics (PV) modules and to present an outline for developing a SLP methodology for encapsulated PV cells and minimodules. The specific objectives are (a) to illustrate the generic nature of SLP for several types of solar energy conversion or conversion devices, (b) to summarize the major durability issues concerned with these devices, (c) to justify using SLP in the triad of cost, performance, and durability instead of only durability, (d) to define and explain the seven major elements that comprise amore » generic SLP methodology, (e) to provide background about implementing the SLP methodology for PV cells and minimodules including the complexity of the encapsulation problems, (f) to summarize briefly the past focus of our task for improving and/or replacing ethylene vinyl acetate (EVA) as a PV pottant, and (g) to provide an outline of our present and future studies using encapsulated PV cells and minimodules for improving the encapsulation of PV cells and predicting a service lifetime for them using the SLP methodology outlined in objective (d). By using this methodology, our major conclusion is that predicting the service lifetime of PV cells and minimodules is possible. {copyright} {ital 1997 American Institute of Physics.}« less
A novel approach based on preference-based index for interval bilevel linear programming problem.
Ren, Aihong; Wang, Yuping; Xue, Xingsi
2017-01-01
This paper proposes a new methodology for solving the interval bilevel linear programming problem in which all coefficients of both objective functions and constraints are considered as interval numbers. In order to keep as much uncertainty of the original constraint region as possible, the original problem is first converted into an interval bilevel programming problem with interval coefficients in both objective functions only through normal variation of interval number and chance-constrained programming. With the consideration of different preferences of different decision makers, the concept of the preference level that the interval objective function is preferred to a target interval is defined based on the preference-based index. Then a preference-based deterministic bilevel programming problem is constructed in terms of the preference level and the order relation [Formula: see text]. Furthermore, the concept of a preference δ -optimal solution is given. Subsequently, the constructed deterministic nonlinear bilevel problem is solved with the help of estimation of distribution algorithm. Finally, several numerical examples are provided to demonstrate the effectiveness of the proposed approach.
Conjugate gradient based projection - A new explicit methodology for frictional contact
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Li, Maocheng; Sha, Desong
1993-01-01
With special attention towards the applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementary formulations involving a conjugate gradient based projection methodology is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected to a feasible region determined by the non-negative constraint condition; this direction is then applied to the Fletcher-Reeves conjugate gradient method resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
Spinal Cord Injury-Induced Dysautonomia via Plasticity in Paravertebral Sympathetic Postganglionic
2017-10-01
their near anatomical inaccessibility. We have solved the accessibility problem with a strategic methodological advance. We will determine the extent...inaccessibility. We have solved the accessibility problem with a strategic methodological advance. We will determine the extent to which paravertebral
Human Prenatal Effects: Methodological Problems and Some Suggested Solutions
ERIC Educational Resources Information Center
Copans, Stuart A.
1974-01-01
Briefly reviews the relevant literature on human prenatal effects, describes some of the possible designs for such studies; and discusses some of the methodological problem areas: sample choice, measurement of prenatal variables, monitoring of labor and delivery, and neonatal assessment. (CS)
Overcoming an obstacle in expanding a UMLS semantic type extent.
Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James
2012-02-01
This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. Copyright © 2011 Elsevier Inc. All rights reserved.
Overcoming an Obstacle in Expanding a UMLS Semantic Type Extent
Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James
2011-01-01
This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. PMID:21925287
Slaug, Björn; Schilling, Oliver; Iwarsson, Susanne; Carlsson, Gunilla
2015-09-02
Making the built environment accessible for all regardless of functional capacity is an important goal for public health efforts. Considerable impediments to achieving this goal suggest the need for valid measurements of acccessibility and for greater attention to the complexity of person-environment fit issues. To address these needs, this study aimed to provide a methodological platform, useful for further research and instrument development within accessibility research. This was accomplished by the construction of a typology of problematic person-environment fit constellations, utilizing an existing methodology developed to assess and analyze accessibility problems in the built environment. By means of qualitative review and statistical methods we classified the person-environment fit components covered by an existing application which targets housing accessibility: the Housing Enabler (HE) instrument. The International Classification of Functioning, Disability and Health (ICF) was used as a conceptual framework. Qualitative classification principles were based on conceptual similarities and for quantitative analysis of similarities, Principal Component Analysis was carried out. We present a typology of problematic person-environment fit constellations classified along three dimensions: 1) accessibility problem range and severity 2) aspects of functioning 3) environmental context. As a result of the classification of the HE components, 48 typical person-environment fit constellations were recognised. The main contribution of this study is the proposed typology of person-environment fit constellations. The typology provides a methodological platform for the identification and quantification of problematic person-environment fit constellations. Its link to the globally accepted ICF classification system facilitates communication within the scientific and health care practice communities. The typology also highlights how relations between aspects of functioning and physical environmental barriers generate typical accessibility problems, and thereby furnishes a reference point for research oriented to how the built environment may be designed to be supportive for activity, participation and health.
NASA Astrophysics Data System (ADS)
Greca, Ileana M.
2016-03-01
Several international reports promote the use of the inquiry teaching methodology for improvements in science education at elementary school. Nevertheless, research indicates that pre-service elementary teachers have insufficient experience with this methodology and when they try to implement it, the theory they learnt in their university education clashes with the classroom practice they observe, a problem that has also been noted with other innovative methodologies. So, it appears essential for pre-service teachers to conduct supportive reflective practice during their education to integrate theory and practice, which various studies suggest is not usually done. Our study shows how opening up a third discursive space can assist this supportive reflective practice. The third discursive space appears when pre-service teachers are involved in specific activities that allow them to contrast the discourses of theoretical knowledge taught at university with practical knowledge arising from their ideas on science and science teaching and their observations during classroom practice. The case study of three pre-service teachers shows that this strategy was fundamental in helping them to integrate theory and practice, resulting in a better understanding of the inquiry methodology and its application in the classroom.
Sadowski, Lukasz
2013-01-01
In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick 750 mm × 750 mm reinforced concrete slab specimens were investigated. Potential E corr and concrete resistivity ρ in each point of the applied grid were measured. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
Statistical theory and methodology for remote sensing data analysis with special emphasis on LACIE
NASA Technical Reports Server (NTRS)
Odell, P. L.
1975-01-01
Crop proportion estimators for determining crop acreage through the use of remote sensing were evaluated. Several studies of these estimators were conducted, including an empirical comparison of the different estimators (using actual data) and an empirical study of the sensitivity (robustness) of the class of mixture estimators. The effect of missing data upon crop classification procedures is discussed in detail including a simulation of the missing data effect. The final problem addressed is that of taking yield data (bushels per acre) gathered at several yield stations and extrapolating these values over some specified large region. Computer programs developed in support of some of these activities are described.
Reiner-Benaim, Anat; Yekutieli, Daniel; Letwin, Noah E; Elmer, Gregory I; Lee, Norman H; Kafkafi, Neri; Benjamini, Yoav
2007-09-01
Gene expression and phenotypic functionality can best be associated when they are measured quantitatively within the same experiment. The analysis of such a complex experiment is presented, searching for associations between measures of exploratory behavior in mice and gene expression in brain regions. The analysis of such experiments raises several methodological problems. First and foremost, the size of the pool of potential discoveries being screened is enormous yet only few biologically relevant findings are expected, making the problem of multiple testing especially severe. We present solutions based on screening by testing related hypotheses, then testing the hypotheses of interest. In one variant the subset is selected directly, in the other one a tree of hypotheses is tested hierarchical; both variants control the False Discovery Rate (FDR). Other problems in such experiments are in the fact that the level of data aggregation may be different for the quantitative traits (one per animal) and gene expression measurements (pooled across animals); in that the association may not be linear; and in the resolution of interest only few replications exist. We offer solutions to these problems as well. The hierarchical FDR testing strategies presented here can serve beyond the structure of our motivating example study to any complex microarray study. Supplementary data are available at Bioinformatics online.
E-therapy for mental health problems: a systematic review.
Postel, Marloes G; de Haan, Hein A; De Jong, Cor A J
2008-09-01
The widespread availability of the Internet offers opportunities for improving access to therapy for people with mental health problems. There is a seemingly infinite supply of Internet-based interventions available on the World Wide Web. The aim of the present study is to systematically assess the methodological quality of randomized controlled trials (RCTs) concerning e-therapy for mental health problems. Two reviewers independently assessed the methodological quality of the RCTs, based on a list of criteria for the methodological quality assessment as recommended by the Cochrane Back Review Group. The search yielded 14 papers that reported RCTs concerning e-therapy for mental-health problems. The methodological quality of studies included in this review was generally low. It is concluded that e-therapy may turn out to be an appropriate therapeutic entity, but the evidence needs to be more convincing. Recommendations are made concerning the method of reporting RCTs and the need to add some content items to an e-therapy study.
Rapid space trajectory generation using a Fourier series shape-based approach
NASA Astrophysics Data System (ADS)
Taheri, Ehsan
With the insatiable curiosity of human beings to explore the universe and our solar system, it is essential to benefit from larger propulsion capabilities to execute efficient transfers and carry more scientific equipments. In the field of space trajectory optimization the fundamental advances in using low-thrust propulsion and exploiting the multi-body dynamics has played pivotal role in designing efficient space mission trajectories. The former provides larger cumulative momentum change in comparison with the conventional chemical propulsion whereas the latter results in almost ballistic trajectories with negligible amount of propellant. However, the problem of space trajectory design translates into an optimal control problem which is, in general, time-consuming and very difficult to solve. Therefore, the goal of the thesis is to address the above problem by developing a methodology to simplify and facilitate the process of finding initial low-thrust trajectories in both two-body and multi-body environments. This initial solution will not only provide mission designers with a better understanding of the problem and solution but also serves as a good initial guess for high-fidelity optimal control solvers and increases their convergence rate. Almost all of the high-fidelity solvers enjoy the existence of an initial guess that already satisfies the equations of motion and some of the most important constraints. Despite the nonlinear nature of the problem, it is sought to find a robust technique for a wide range of typical low-thrust transfers with reduced computational intensity. Another important aspect of our developed methodology is the representation of low-thrust trajectories by Fourier series with which the number of design variables reduces significantly. Emphasis is given on simplifying the equations of motion to the possible extent and avoid approximating the controls. These facts contribute to speeding up the solution finding procedure. Several example applications of two and three-dimensional two-body low-thrust transfers are considered. In addition, in the multi-body dynamic, and in particular the restricted-three-body dynamic, several Earth-to-Moon low-thrust transfers are investigated.
Fitting methods to paradigms: are ergonomics methods fit for systems thinking?
Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A
2017-02-01
The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.
Induced abortion and adolescent mental health.
Stotland, Nada L
2011-10-01
Induced abortion is widely believed - by the public, healthcare professionals, and policy-makers - to lead to adverse mental health sequelae. This belief is false, as it applies both to adult women and adolescents. However, it has been used to rationalize, and been quoted in, restrictive and intrusive legislation in several states and in proposed federal legislation. It is essential for gynecologists to have accurate information, as clinicians, for their patients, and, as key experts, for policy makers. New articles concluding that there are adverse psychological outcomes from induced abortion continue to be published. The methodological flaws in these articles are so serious as to invalidate those conclusions. Several recent scholarly analyses detail these flaws. Methodologically sound studies and reviews continue to demonstrate that psychosocial problems play a role in unwanted conception and the decision to abort unwanted pregnancies but are not the result of abortion. Clinicians may have to reassure patients making decisions about their pregnancies that abortion does not cause psychiatric illness. They can do so on the basis of recent analyses substantiating that finding. (C) 2011 Lippincott Williams & Wilkins, Inc.
Extending substructure based iterative solvers to multiple load and repeated analyses
NASA Technical Reports Server (NTRS)
Farhat, Charbel
1993-01-01
Direct solvers currently dominate commercial finite element structural software, but do not scale well in the fine granularity regime targeted by emerging parallel processors. Substructure based iterative solvers--often called also domain decomposition algorithms--lend themselves better to parallel processing, but must overcome several obstacles before earning their place in general purpose structural analysis programs. One such obstacle is the solution of systems with many or repeated right hand sides. Such systems arise, for example, in multiple load static analyses and in implicit linear dynamics computations. Direct solvers are well-suited for these problems because after the system matrix has been factored, the multiple or repeated solutions can be obtained through relatively inexpensive forward and backward substitutions. On the other hand, iterative solvers in general are ill-suited for these problems because they often must restart from scratch for every different right hand side. In this paper, we present a methodology for extending the range of applications of domain decomposition methods to problems with multiple or repeated right hand sides. Basically, we formulate the overall problem as a series of minimization problems over K-orthogonal and supplementary subspaces, and tailor the preconditioned conjugate gradient algorithm to solve them efficiently. The resulting solution method is scalable, whereas direct factorization schemes and forward and backward substitution algorithms are not. We illustrate the proposed methodology with the solution of static and dynamic structural problems, and highlight its potential to outperform forward and backward substitutions on parallel computers. As an example, we show that for a linear structural dynamics problem with 11640 degrees of freedom, every time-step beyond time-step 15 is solved in a single iteration and consumes 1.0 second on a 32 processor iPSC-860 system; for the same problem and the same parallel processor, a pair of forward/backward substitutions at each step consumes 15.0 seconds.
Family and school spillover in adolescents' daily lives.
Flook, Lisa; Fuligni, Andrew J
2008-01-01
This study examined spillover between daily family stressors and school problems among 589 ninth-grade students (mean age = 14.9 years) from Mexican, Chinese, and European backgrounds. Spillover was examined using a daily diary methodology in which adolescents reported on their school and family experiences each day for 2 weeks. Analyses using hierarchical linear modeling revealed reciprocal spillover effects between adolescents' daily functioning in the family and school domains that spanned several days. Longitudinal analyses indicated that spillover between family stressors and school problems also occurs across the high school years, from 9th to 12th grade, and that both are predictive of poorer academic performance in 12th grade. These findings have practical implications for adolescents' academic achievement trajectories and general well-being.
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.
Problem Solving in Biology: A Methodology
ERIC Educational Resources Information Center
Wisehart, Gary; Mandell, Mark
2008-01-01
A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…
SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY
Section I, refers to the possibility of using the theory and methodology of Project Outcomes to problems of strategic information. It is felt that...purposes of assessing present and future organizational effectiveness . Section IV, refers to the applications that our study may have for problems of
Cochran, Bryan N; Cauce, Ana Mari
2006-03-01
Previous research has suggested that lesbian, gay, bisexual, and transgender (LGBT) individuals enter treatment for substance abuse with more severe problems than heterosexual individuals. However, methodological difficulties, particularly the difficulty of obtaining a representative sample, have limited the ability to draw conclusions about LGBT individuals who receive services for substance abuse. This study took advantage of a unique opportunity to examine a representative sample of openly LGBT clients receiving publicly funded substance abuse treatment by using data gathered by treatment providers in Washington State. Baseline differences between openly LGBT and heterosexual clients were compared in a variety of domains. Results demonstrated that openly LGBT clients enter treatment with more severe substance abuse problems, greater psychopathology, and greater medical service utilization when compared with heterosexual clients. When the analyses were stratified based on sex, different patterns of substance use and associated psychosocial characteristics emerged for the LGBT clients. Implications for provision of appropriate services and recommendations to treatment agencies are discussed in this article.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Verhoest, Niko E.C; Lievens, Hans; Wagner, Wolfgang; Álvarez-Mozos, Jesús; Moran, M. Susan; Mattia, Francesco
2008-01-01
Synthetic Aperture Radar has shown its large potential for retrieving soil moisture maps at regional scales. However, since the backscattered signal is determined by several surface characteristics, the retrieval of soil moisture is an ill-posed problem when using single configuration imagery. Unless accurate surface roughness parameter values are available, retrieving soil moisture from radar backscatter usually provides inaccurate estimates. The characterization of soil roughness is not fully understood, and a large range of roughness parameter values can be obtained for the same surface when different measurement methodologies are used. In this paper, a literature review is made that summarizes the problems encountered when parameterizing soil roughness as well as the reported impact of the errors made on the retrieved soil moisture. A number of suggestions were made for resolving issues in roughness parameterization and studying the impact of these roughness problems on the soil moisture retrieval accuracy and scale. PMID:27879932
On Multifunctional Collaborative Methods in Engineering Science
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
2001-01-01
Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple method approach is advantageous when interfacing diverse disciplines in which each of the method's strengths are utilized.
Frequent methodological errors in clinical research.
Silva Aycaguer, L C
2018-03-07
Several errors that are frequently present in clinical research are listed, discussed and illustrated. A distinction is made between what can be considered an "error" arising from ignorance or neglect, from what stems from a lack of integrity of researchers, although it is recognized and documented that it is not easy to establish when we are in a case and when in another. The work does not intend to make an exhaustive inventory of such problems, but focuses on those that, while frequent, are usually less evident or less marked in the various lists that have been published with this type of problems. It has been a decision to develop in detail the examples that illustrate the problems identified, instead of making a list of errors accompanied by an epidermal description of their characteristics. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
Transient responses' optimization by means of set-based multi-objective evolution
NASA Astrophysics Data System (ADS)
Avigad, Gideon; Eisenstadt, Erella; Goldvard, Alex; Salomon, Shaul
2012-04-01
In this article, a novel solution to multi-objective problems involving the optimization of transient responses is suggested. It is claimed that the common approach of treating such problems by introducing auxiliary objectives overlooks tradeoffs that should be presented to the decision makers. This means that, if at some time during the responses, one of the responses is optimal, it should not be overlooked. An evolutionary multi-objective algorithm is suggested in order to search for these optimal solutions. For this purpose, state-wise domination is utilized with a new crowding measure for ordered sets being suggested. The approach is tested on both artificial as well as on real life problems in order to explain the methodology and demonstrate its applicability and importance. The results indicate that, from an engineering point of view, the approach possesses several advantages over existing approaches. Moreover, the applications highlight the importance of set-based evolution.
Methodological Problems on the Way to Integrative Human Neuroscience.
Kotchoubey, Boris; Tretter, Felix; Braun, Hans A; Buchheim, Thomas; Draguhn, Andreas; Fuchs, Thomas; Hasler, Felix; Hastedt, Heiner; Hinterberger, Thilo; Northoff, Georg; Rentschler, Ingo; Schleim, Stephan; Sellmaier, Stephan; Tebartz Van Elst, Ludger; Tschacher, Wolfgang
2016-01-01
Neuroscience is a multidisciplinary effort to understand the structures and functions of the brain and brain-mind relations. This effort results in an increasing amount of data, generated by sophisticated technologies. However, these data enhance our descriptive knowledge , rather than improve our understanding of brain functions. This is caused by methodological gaps both within and between subdisciplines constituting neuroscience, and the atomistic approach that limits the study of macro- and mesoscopic issues. Whole-brain measurement technologies do not resolve these issues, but rather aggravate them by the complexity problem. The present article is devoted to methodological and epistemic problems that obstruct the development of human neuroscience. We neither discuss ontological questions (e.g., the nature of the mind) nor review data, except when it is necessary to demonstrate a methodological issue. As regards intradisciplinary methodological problems, we concentrate on those within neurobiology (e.g., the gap between electrical and chemical approaches to neurophysiological processes) and psychology (missing theoretical concepts). As regards interdisciplinary problems, we suggest that core disciplines of neuroscience can be integrated using systemic concepts that also entail human-environment relations. We emphasize the necessity of a meta-discussion that should entail a closer cooperation with philosophy as a discipline of systematic reflection. The atomistic reduction should be complemented by the explicit consideration of the embodiedness of the brain and the embeddedness of humans. The discussion is aimed at the development of an explicit methodology of integrative human neuroscience , which will not only link different fields and levels, but also help in understanding clinical phenomena.
Methodological Problems on the Way to Integrative Human Neuroscience
Kotchoubey, Boris; Tretter, Felix; Braun, Hans A.; Buchheim, Thomas; Draguhn, Andreas; Fuchs, Thomas; Hasler, Felix; Hastedt, Heiner; Hinterberger, Thilo; Northoff, Georg; Rentschler, Ingo; Schleim, Stephan; Sellmaier, Stephan; Tebartz Van Elst, Ludger; Tschacher, Wolfgang
2016-01-01
Neuroscience is a multidisciplinary effort to understand the structures and functions of the brain and brain-mind relations. This effort results in an increasing amount of data, generated by sophisticated technologies. However, these data enhance our descriptive knowledge, rather than improve our understanding of brain functions. This is caused by methodological gaps both within and between subdisciplines constituting neuroscience, and the atomistic approach that limits the study of macro- and mesoscopic issues. Whole-brain measurement technologies do not resolve these issues, but rather aggravate them by the complexity problem. The present article is devoted to methodological and epistemic problems that obstruct the development of human neuroscience. We neither discuss ontological questions (e.g., the nature of the mind) nor review data, except when it is necessary to demonstrate a methodological issue. As regards intradisciplinary methodological problems, we concentrate on those within neurobiology (e.g., the gap between electrical and chemical approaches to neurophysiological processes) and psychology (missing theoretical concepts). As regards interdisciplinary problems, we suggest that core disciplines of neuroscience can be integrated using systemic concepts that also entail human-environment relations. We emphasize the necessity of a meta-discussion that should entail a closer cooperation with philosophy as a discipline of systematic reflection. The atomistic reduction should be complemented by the explicit consideration of the embodiedness of the brain and the embeddedness of humans. The discussion is aimed at the development of an explicit methodology of integrative human neuroscience, which will not only link different fields and levels, but also help in understanding clinical phenomena. PMID:27965548
Layer Stripping Solutions of Inverse Seismic Problems.
1985-03-21
problems--more so than has generally been recognized. The subject of this thesis is the theoretical development of the . layer-stripping methodology , and...medium varies sharply at each interface, which would be expected to cause difficulties for the algorithm, since it was designed for a smoothy varying... methodology was applied in a novel way. The inverse problem considered in this chapter was that of reconstructing a layered medium from measurement of its
Emergent Aerospace Designs Using Negotiating Autonomous Agents
NASA Technical Reports Server (NTRS)
Deshmukh, Abhijit; Middelkoop, Timothy; Krothapalli, Anjaneyulu; Smith, Charles
2000-01-01
This paper presents a distributed design methodology where designs emerge as a result of the negotiations between different stake holders in the process, such as cost, performance, reliability, etc. The proposed methodology uses autonomous agents to represent design decision makers. Each agent influences specific design parameters in order to maximize their utility. Since the design parameters depend on the aggregate demand of all the agents in the system, design agents need to negotiate with others in the market economy in order to reach an acceptable utility value. This paper addresses several interesting research issues related to distributed design architectures. First, we present a flexible framework which facilitates decomposition of the design problem. Second, we present overview of a market mechanism for generating acceptable design configurations. Finally, we integrate learning mechanisms in the design process to reduce the computational overhead.
Researching Street Children: Methodological and Ethical Issues.
ERIC Educational Resources Information Center
Hutz, Claudio S.; And Others
This paper describes the ethical and methodological problems associated with studying prosocial moral reasoning of street children and children of low and high SES living with their families, and problems associated with studying sexual attitudes and behavior of street children and their knowledge of sexually transmitted diseases, especially AIDS.…
Problem-Based Learning: Lessons for Administrators, Educators and Learners
ERIC Educational Resources Information Center
Yeo, Roland
2005-01-01
Purpose: The paper aims to explore the challenges of problem-based learning (PBL) as an unconventional teaching methodology experienced by a higher learning institute in Singapore. Design/methodology/approach: The exploratory study was conducted using focus group discussions and semi-structured interviews. Four groups of people were invited to…
Assessment of capillary suction time (CST) test methodologies.
Sawalha, O; Scholz, M
2007-12-01
The capillary suction time (CST) test is a commonly used method to measure the filterability and the easiness of removing moisture from slurry and sludge in numerous environmental and industrial applications. This study assessed several novel alterations of both the test methodology and the current standard capillary suction time (CST) apparatus. Twelve different papers including the standard Whatman No. 17 chromatographic paper were tested. The tests were run using four different types of sludge including a synthetic sludge, which was specifically developed for benchmarking purposes. The standard apparatus was altered by the introduction of a novel rectangular funnel instead of a standard circular one. A stirrer was also introduced to solve the problem of test inconsistency (e.g. high CST variability) particularly for heavy types of sludge. Results showed that several alternative papers, which are cheaper than the standard paper, can be used to estimate CST values accurately, and that the test repeatability can be improved in many cases and for different types of sludge. The introduction of the rectangular funnel demonstrated an obvious enhancement of test repeatability. The use of a stirrer to avoid sedimentation of heavy sludge did not have statistically significant impact on the CST values or the corresponding data variability. The application of synthetic sludge can support the testing of experimental methodologies and should be used for subsequent benchmarking purposes.
The Speaker Respoken: Material Rhetoric as Feminist Methodology.
ERIC Educational Resources Information Center
Collins, Vicki Tolar
1999-01-01
Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
Evidence-Based Psychosocial Treatments for Ethnic Minority Youth
Huey, Stanley J.; Polo, Antonio J.
2008-01-01
This article reviews research on evidence-based treatments (EBTs) for ethnic minority youth using criteria from Chambless et al. (1998), Chambless et al. (1996), and Chambless and Hollon (1998). Although no well-established treatments were identified, probably efficacious or possibly efficacious treatments were found for ethnic minority youth with anxiety-related problems, attention-deficit/hyperactivity disorder, depression, conduct problems, substance use problems, trauma-related syndromes, and other clinical problems. In addition, all studies met either Nathan and Gorman's (2002) Type 1 or Type 2 methodological criteria. A brief meta-analysis showed overall treatment effects of medium magnitude (d = .44). Effects were larger when EBTs were compared to no treatment (d = .58) or psychological placebos (d = .51) versus treatment as usual (d = .22). Youth ethnicity (African American, Latino, mixed/other minority), problem type, clinical severity, diagnostic status, and culture-responsive treatment status did not moderate treatment outcome. Most studies had low statistical power and poor representation of less acculturated youth. Few tests of cultural adaptation effects have been conducted in the literature and culturally validated outcome measures are mostly lacking. Recommendations for clinical practice and future research directions are provided. PMID:18444061
A hybrid neural networks-fuzzy logic-genetic algorithm for grade estimation
NASA Astrophysics Data System (ADS)
Tahmasebi, Pejman; Hezarkhani, Ardeshir
2012-05-01
The grade estimation is a quite important and money/time-consuming stage in a mine project, which is considered as a challenge for the geologists and mining engineers due to the structural complexities in mineral ore deposits. To overcome this problem, several artificial intelligence techniques such as Artificial Neural Networks (ANN) and Fuzzy Logic (FL) have recently been employed with various architectures and properties. However, due to the constraints of both methods, they yield the desired results only under the specific circumstances. As an example, one major problem in FL is the difficulty of constructing the membership functions (MFs).Other problems such as architecture and local minima could also be located in ANN designing. Therefore, a new methodology is presented in this paper for grade estimation. This method which is based on ANN and FL is called "Coactive Neuro-Fuzzy Inference System" (CANFIS) which combines two approaches, ANN and FL. The combination of these two artificial intelligence approaches is achieved via the verbal and numerical power of intelligent systems. To improve the performance of this system, a Genetic Algorithm (GA) - as a well-known technique to solve the complex optimization problems - is also employed to optimize the network parameters including learning rate, momentum of the network and the number of MFs for each input. A comparison of these techniques (ANN, Adaptive Neuro-Fuzzy Inference System or ANFIS) with this new method (CANFIS-GA) is also carried out through a case study in Sungun copper deposit, located in East-Azerbaijan, Iran. The results show that CANFIS-GA could be a faster and more accurate alternative to the existing time-consuming methodologies for ore grade estimation and that is, therefore, suggested to be applied for grade estimation in similar problems.
A hybrid neural networks-fuzzy logic-genetic algorithm for grade estimation
Tahmasebi, Pejman; Hezarkhani, Ardeshir
2012-01-01
The grade estimation is a quite important and money/time-consuming stage in a mine project, which is considered as a challenge for the geologists and mining engineers due to the structural complexities in mineral ore deposits. To overcome this problem, several artificial intelligence techniques such as Artificial Neural Networks (ANN) and Fuzzy Logic (FL) have recently been employed with various architectures and properties. However, due to the constraints of both methods, they yield the desired results only under the specific circumstances. As an example, one major problem in FL is the difficulty of constructing the membership functions (MFs).Other problems such as architecture and local minima could also be located in ANN designing. Therefore, a new methodology is presented in this paper for grade estimation. This method which is based on ANN and FL is called “Coactive Neuro-Fuzzy Inference System” (CANFIS) which combines two approaches, ANN and FL. The combination of these two artificial intelligence approaches is achieved via the verbal and numerical power of intelligent systems. To improve the performance of this system, a Genetic Algorithm (GA) – as a well-known technique to solve the complex optimization problems – is also employed to optimize the network parameters including learning rate, momentum of the network and the number of MFs for each input. A comparison of these techniques (ANN, Adaptive Neuro-Fuzzy Inference System or ANFIS) with this new method (CANFIS–GA) is also carried out through a case study in Sungun copper deposit, located in East-Azerbaijan, Iran. The results show that CANFIS–GA could be a faster and more accurate alternative to the existing time-consuming methodologies for ore grade estimation and that is, therefore, suggested to be applied for grade estimation in similar problems. PMID:25540468
[Women, health, and labor in Brazil: challenges for new action].
Aquino, E M; Menezes, G M; Marinho, L F
1995-01-01
Despite the remarkable rise in women's participation in the labor market in Brazil, its consequences on health are still virtually unknown. This study aims to identify theoretical and methodological problems in the relationship between labor and women's health from a gender perspective. Characteristics of women's occupational placement are described and analyzed as resulting from their role in social reproduction. The study examines the development of several conciliatory strategies between paid work and housework which are discussed as potential determinants of health problems and support the need for a critical reappraisal of theoretical and methodological strategies to reach a better understanding of the complexity and specificities of women's living and working conditions. The author also stresses the role of women's recent participation in the trade union movements in defense of health, body rights, and women's issues in the workplace, as well as the need for a new framework embodied in the women's social movement. The study thus points to the challenge to produce knowledge on this subject in order to unveil the uniqueness of the national scenario marked by unemployment, informal jobs, low salaries, weak trade unions and other civil organizations, and traditional domestic and marriage relationships.
Behavioral interventions for agitation in older adults with dementia: an evaluative review.
Spira, Adam P; Edelstein, Barry A
2006-06-01
Older adults with dementia commonly exhibit agitated behavior that puts them at risk of injury and institutionalization and is associated with caregiver stress. A range of theoretical approaches has produced numerous interventions to manage these behavior problems. This paper critically reviews the empirical literature on behavioral interventions to reduce agitation in older adults with dementia. A literature search yielded 23 articles that met inclusion criteria. These articles described interventions that targeted wandering, disruptive vocalization, physical aggression, other agitated behaviors and a combination of these behaviors. Studies are summarized individually and then evaluated. Behavioral interventions targeting agitated behavior exhibited by older adults with dementia show considerable promise. A number of methodological issues must be addressed to advance this research area. Problem areas include inconsistent use of functional assessment techniques, failure to report quantitative findings and inadequate demonstrations of experimental control. The reviewed studies collectively provide evidence that warrants optimism regarding the application of behavioral principles to the management of agitation among older adults with dementia. Although the results of some studies were mixed and several studies revealed methodological shortcomings, many of them offered innovations that can be used in future, more rigorously designed, intervention studies.
Parallel methodology to capture cyclic variability in motored engines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ameen, Muhsin M.; Yang, Xiaofeng; Kuo, Tang-Wei
2016-07-28
Numerical prediction of of cycle-to-cycle variability (CCV) in SI engines is extremely challenging for two key reasons: (i) high-fidelity methods such as large eddy simulation (LES) are require to accurately capture the in-cylinder turbulent flowfield, and (ii) CCV is experienced over long timescales and hence the simulations need to be performed for hundreds of consecutive cycles. In this study, a new methodology is proposed to dissociate this long time-scale problem into several shorter time-scale problems, which can considerably reduce the computational time without sacrificing the fidelity of the simulations. The strategy is to perform multiple single-cycle simulations in parallel bymore » effectively perturbing the simulation parameters such as the initial and boundary conditions. It is shown that by perturbing the initial velocity field effectively based on the intensity of the in-cylinder turbulence, the mean and variance of the in-cylinder flowfield is captured reasonably well. Adding perturbations in the initial pressure field and the boundary pressure improves the predictions. It is shown that this new approach is able to give accurate predictions of the flowfield statistics in less than one-tenth of time required for the conventional approach of simulating consecutive engine cycles.« less
Use of Six Sigma Methodology to Reduce Appointment Lead-Time in Obstetrics Outpatient Department.
Ortiz Barrios, Miguel A; Felizzola Jiménez, Heriberto
2016-10-01
This paper focuses on the issue of longer appointment lead-time in the obstetrics outpatient department of a maternal-child hospital in Colombia. Because of extended appointment lead-time, women with high-risk pregnancy could develop severe complications in their health status and put their babies at risk. This problem was detected through a project selection process explained in this article and to solve it, Six Sigma methodology has been used. First, the process was defined through a SIPOC diagram to identify its input and output variables. Second, six sigma performance indicators were calculated to establish the process baseline. Then, a fishbone diagram was used to determine the possible causes of the problem. These causes were validated with the aid of correlation analysis and other statistical tools. Later, improvement strategies were designed to reduce appointment lead-time in this department. Project results evidenced that average appointment lead-time reduced from 6,89 days to 4,08 days and the deviation standard dropped from 1,57 days to 1,24 days. In this way, the hospital will serve pregnant women faster, which represents a risk reduction of perinatal and maternal mortality.
A methodology to enhance electromagnetic compatibility in joint military operations
NASA Astrophysics Data System (ADS)
Buckellew, William R.
The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
Recovery of speed of information processing in closed-head-injury patients.
Zwaagstra, R; Schmidt, I; Vanier, M
1996-06-01
After severe traumatic brain injury, patients almost invariably demonstrate a slowing of reaction time, reflecting a slowing of central information processing. Methodological problems associated with the traditional method for the analysis of longitudinal data (MANOVA) severely complicate studies on cognitive recovery. It is argued that multilevel models are often better suited for the analysis of improvement over time in clinical settings. Multilevel models take into account individual differences in both overall performance level and recovery. These models enable individual predictions for the recovery of speed of information processing. Recovery is modelled in a group of closed-head-injury patients (N = 24). Recovery was predicted by age and severity of injury, as indicated by coma duration. Over a period up to 44 months post trauma, reaction times were found to decrease faster for patients with longer coma duration.
Gerridzen, Ineke J; Moerman-van den Brink, Wiltine G; Depla, Marja F; Verschuur, Els M L; Veenhuizen, Ruth B; van der Wouden, Johannes C; Hertogh, Cees M P M; Joling, Karlijn J
2017-03-01
Experiences from clinical practice suggest that behavioural symptoms in patients with Korsakoff syndrome (KS) are a frequent problem. Knowledge about behavioural symptoms is important in understanding and managing these symptoms. The aim of this study is to review the prevalence and severity of behavioural symptoms in KS. Relevant articles were identified by searching Medline (PubMed), PsycINFO, Embase and CINAHL up to 4 June 2014. Two reviewers independently selected the studies, extracted their baseline data and assessed methodological quality using a standardized checklist. Fifteen studies fulfilled the inclusion criteria. A diversity of diagnoses was used indicating that KS and other alcohol-related cognitive disorders and terms were used interchangeably. None of the studies were primarily designed to estimate the prevalence or severity of behavioural symptoms in patients with KS. Most studies had serious methodological limitations. The reported prevalence estimates of behavioural symptoms in the included studies varied strongly. Most prevalent were depressive symptoms and disorders (2-50%, median 27%) and agitation and aggression (10-54%, median 27%). None of the reported, mean severity estimates met pathological thresholds. The highest severity estimates were found for apathy. Good quality studies on behavioural symptoms in patients with KS are lacking. Observational research designed to provide reliable estimates of the prevalence and severity of behavioural symptoms in patients with KS is needed. This could improve understanding and managing these symptoms and help care staff to better support the needs of this specific patient group. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.
[The ethical reflection approach in decision-making processes in health institutes].
Gruat, Renaud
2015-12-01
Except in the specific case of end-of-life care, the law says nothing about the way in which health professionals must carry out ethical reflection regarding the treatment of their patients. A problem-solving methodology called the "ethical reflection approach" performed over several stages can be used. The decision-making process involves the whole team and draws on the ability of each caregiver to put forward a reasoned argument, in the interest of the patient. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Keuleers, Emmanuel; Balota, David A
2015-01-01
This paper introduces and summarizes the special issue on megastudies, crowdsourcing, and large datasets in psycholinguistics. We provide a brief historical overview and show how the papers in this issue have extended the field by compiling new databases and making important theoretical contributions. In addition, we discuss several studies that use text corpora to build distributional semantic models to tackle various interesting problems in psycholinguistics. Finally, as is the case across the papers, we highlight some methodological issues that are brought forth via the analyses of such datasets.
NASA Technical Reports Server (NTRS)
Beggs, John H.
2000-01-01
The upwind leapfrog or Linear Bicharacteristic Scheme (LBS) has previously been extended to treat lossy dielectric and magnetic materials. This paper examines different methodologies for treatment of the electric loss term in the Linear Bicharacteristic Scheme for computational electromagnetics. Several different treatments of the electric loss term using the LBS are explored and compared on one-dimensional model problems involving reflection from lossy dielectric materials on both uniform and nonuniform grids. Results using these LBS implementations are also compared with the FDTD method for convenience.
Hierarchical Parallelism in Finite Difference Analysis of Heat Conduction
NASA Technical Reports Server (NTRS)
Padovan, Joseph; Krishna, Lala; Gute, Douglas
1997-01-01
Based on the concept of hierarchical parallelism, this research effort resulted in highly efficient parallel solution strategies for very large scale heat conduction problems. Overall, the method of hierarchical parallelism involves the partitioning of thermal models into several substructured levels wherein an optimal balance into various associated bandwidths is achieved. The details are described in this report. Overall, the report is organized into two parts. Part 1 describes the parallel modelling methodology and associated multilevel direct, iterative and mixed solution schemes. Part 2 establishes both the formal and computational properties of the scheme.
Intelligent Systems: Shaping the Future of Aeronautics and Space Exploration
NASA Technical Reports Server (NTRS)
Krishnakumar, Kalmanje; Lohn, Jason; Kaneshige, John
2004-01-01
Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become important for NASA's future roles in Aeronautics and Space Exploration. Intelligent systems will enable safe, cost and mission-effective approaches to air& control, system design, spacecraft autonomy, robotic space exploration and human exploration of Moon, Mars, and beyond. In this talk, we will discuss intelligent system technologies and expand on the role of intelligent systems in NASA's missions. We will also present several examples of which some are highlighted m this extended abstract.
Parallelized modelling and solution scheme for hierarchically scaled simulations
NASA Technical Reports Server (NTRS)
Padovan, Joe
1995-01-01
This two-part paper presents the results of a benchmarked analytical-numerical investigation into the operational characteristics of a unified parallel processing strategy for implicit fluid mechanics formulations. This hierarchical poly tree (HPT) strategy is based on multilevel substructural decomposition. The Tree morphology is chosen to minimize memory, communications and computational effort. The methodology is general enough to apply to existing finite difference (FD), finite element (FEM), finite volume (FV) or spectral element (SE) based computer programs without an extensive rewrite of code. In addition to finding large reductions in memory, communications, and computational effort associated with a parallel computing environment, substantial reductions are generated in the sequential mode of application. Such improvements grow with increasing problem size. Along with a theoretical development of general 2-D and 3-D HPT, several techniques for expanding the problem size that the current generation of computers are capable of solving, are presented and discussed. Among these techniques are several interpolative reduction methods. It was found that by combining several of these techniques that a relatively small interpolative reduction resulted in substantial performance gains. Several other unique features/benefits are discussed in this paper. Along with Part 1's theoretical development, Part 2 presents a numerical approach to the HPT along with four prototype CFD applications. These demonstrate the potential of the HPT strategy.
Methodological Issues and Practical Problems in Conducting Research on Abused Children.
ERIC Educational Resources Information Center
Kinard, E. Milling
In order to inform policy and programs, research on child abuse must be not only methodologically rigorous, but also practically feasible. However, practical problems make child abuse research difficult to conduct. Definitions of abuse must be explicit and different types of abuse must be assessed separately. Study samples should be as…
Research Methodology in Second Language Studies: Trends, Concerns, and New Directions
ERIC Educational Resources Information Center
King, Kendall A.; Mackey, Alison
2016-01-01
The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…
Evaluating thermoregulation in reptiles: the fallacy of the inappropriately applied method.
Seebacher, Frank; Shine, Richard
2004-01-01
Given the importance of heat in most biological processes, studies on thermoregulation have played a major role in understanding the ecology of ectothermic vertebrates. It is, however, difficult to assess whether body temperature is actually regulated, and several techniques have been developed that allow an objective assessment of thermoregulation. Almost all recent studies on reptiles follow a single methodology that, when used correctly, facilitates comparisons between species, climates, and so on. However, the use of operative temperatures in this methodology assumes zero heat capacity of the study animals and is, therefore, appropriate for small animals only. Operative temperatures represent potentially available body temperatures accurately for small animals but can substantially overestimate the ranges of body temperature available to larger animals whose slower rates of heating and cooling mean that they cannot reach equilibrium if they encounter operative temperatures that change rapidly through either space or time. This error may lead to serious misinterpretations of field data. We derive correction factors specific for body mass and rate of movement that can be used to estimate body temperature null distributions of larger reptiles, thereby overcoming this methodological problem.
MacNamara, Annmarie; Phan, K Luan
2016-03-01
NIMH's Research Domain Criteria (RDoC) project seeks to advance the diagnosis, prevention, and treatment of mental disorders by promoting psychobiological research on dimensional constructs that might cut across traditional diagnostic boundaries (Kozak & Cuthbert, ). At the core of this approach is the notion that these dimensional constructs can be assessed across different units of analysis (e.g., genes, physiology, behavior), enriching the constructs and providing more complete explanations of clinical problems. While the conceptual aspects of RDoC have been discussed in several prior papers, its methodological aspects have received comparatively less attention. For example, how to integrate data from different units of analysis has been relatively unclear. Here, we discuss one means of psychobiologically operationalizing RDoC constructs across different units of analysis (the psychoneurometric approach; Yancey et al., ), highlighting ways in which this approach might be refined in future iterations. We conclude that there is much to be learned from this technique; however, greater attention to scale-development methods and to psychometrics will likely benefit this and other methodological approaches to combining measurements across multiple units of analysis. © 2016 Society for Psychophysiological Research.
Sharma, Prashant; Das, Reena
2016-03-26
Cation-exchange high-performance liquid chromatography (CE-HPLC) is a widely used laboratory test to detect variant hemoglobins as well as quantify hemoglobins F and A2 for the diagnosis of thalassemia syndromes. It's versatility, speed, reproducibility and convenience have made CE-HPLC the method of choice to initially screen for hemoglobin disorders. Despite its popularity, several methodological aspects of the technology remain obscure to pathologists and this may have consequences in specific situations. This paper discusses the basic principles of the technique, the initial quality control steps and the interpretation of various controls and variables that are available on the instrument output. Subsequent sections are devoted to methodological considerations that arise during reporting of cases. For instance, common problems of misidentified peaks, totals crossing 100%, causes of total area being above or below acceptable limits and the importance of pre-integration region peaks are dealt with. Ultimately, CE-HPLC remains an investigation, the reporting of which combines in-depth knowledge of the biological basics with more than a working knowledge of the technological aspects of the technique.
Cramer, Robert J.; Johnson, Shara M.; McLaughlin, Jennifer; Rausch, Emilie M.; Conroy, Mary Alice
2014-01-01
Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation. PMID:24672588
Cramer, Robert J; Johnson, Shara M; McLaughlin, Jennifer; Rausch, Emilie M; Conroy, Mary Alice
2013-02-01
Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation.
Berne, Rosalyn W; Raviv, Daniel
2004-04-01
This paper introduces the Eight Dimensional Methodology for Innovative Thinking (the Eight Dimensional Methodology), for innovative problem solving, as a unified approach to case analysis that builds on comprehensive problem solving knowledge from industry, business, marketing, math, science, engineering, technology, arts, and daily life. It is designed to stimulate innovation by quickly generating unique "out of the box" unexpected and high quality solutions. It gives new insights and thinking strategies to solve everyday problems faced in the workplace, by helping decision makers to see otherwise obscure alternatives and solutions. Daniel Raviv, the engineer who developed the Eight Dimensional Methodology, and paper co-author, technology ethicist Rosalyn Berne, suggest that this tool can be especially useful in identifying solutions and alternatives for particular problems of engineering, and for the ethical challenges which arise with them. First, the Eight Dimensional Methodology helps to elucidate how what may appear to be a basic engineering problem also has ethical dimensions. In addition, it offers to the engineer a methodology for penetrating and seeing new dimensions of those problems. To demonstrate the effectiveness of the Eight Dimensional Methodology as an analytical tool for thinking about ethical challenges to engineering, the paper presents the case of the construction of the Large Binocular Telescope (LBT) on Mount Graham in Arizona. Analysis of the case offers to decision makers the use of the Eight Dimensional Methodology in considering alternative solutions for how they can proceed in their goals of exploring space. It then follows that same process through the second stage of exploring the ethics of each of those different solutions. The LBT project pools resources from an international partnership of universities and research institutes for the construction and maintenance of a highly sophisticated, powerful new telescope. It will soon mark the erection of the world's largest and most powerful optical telescope, designed to see fine detail otherwise visible only from space. It also represents a controversial engineering project that is being undertaken on land considered to be sacred by the local, native Apache people. As presented, the case features the University of Virginia, and its challenges in consideration of whether and how to join the LBT project consortium.
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treat the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Sundaram, P.
2001-01-01
HUMS systems have been an area of increased research in the recent times due to two main reasons: (a) increase in the occurrences of accidents in the aerospace, and (b) stricter FAA regulations on aircrafts maintenance [2]. There are several problems associated with the maintenance of aircrafts that the HUMS systems can solve through the use of several monitoring technologies.This paper documents our methodology of employing scenarios in the specification and evaluation of architecture for HUMS. Section 2 investigates related works that use scenarios in software development. Section 3 describes how we use scenarios in our work, which is followed by a demonstration of our methods in the development of KUMS in section 4. Conclusion summarizes results.
Yanos, Philip T; West, Michelle L; Smith, Stephen M
2010-12-01
Most studies on coping among persons with severe mental illness have relied on retrospective self-report methods; a limitation of this methodology is susceptibility to recall bias. The purpose of the present investigation was to expand the current understanding of the impact of coping among persons with severe mental illness by examining coping strategies, mood, and social functioning (operationalized as productive time use) using a daily process design. Twenty-seven adults diagnosed with severe mental illness completed baseline clinical interviews and up to 20 days of nightly telephone interviews addressing coping and daily life. A total of 198 coping efforts were reported for 387 days. Mixed-effects regression analyses examined the association between type of daily coping strategy (problem-centered, neutral, or avoidant) and both daily proportion of time participants spent in productive activity and daily negative mood, controlling for demographic and clinical variables. The results indicated that productive time use was significantly lower on days when avoidant strategies were used, in contrast with days when problem-centered strategies and neutral strategies were used. There was no significant main effect of coping on negative mood, although there was a trend in the expected direction. Findings support the hypothesis that the types of coping strategies adults with severe mental illness use are related to better social functioning on a daily level. Copyright © 2010 Elsevier B.V. All rights reserved.
IMSF: Infinite Methodology Set Framework
NASA Astrophysics Data System (ADS)
Ota, Martin; Jelínek, Ivan
Software development is usually an integration task in enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The used methodology is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments its meta-model.
16S rRNA beacons for bacterial monitoring during human space missions.
Larios-Sanz, Maia; Kourentzi, Katerina D; Warmflash, David; Jones, Jeffrey; Pierson, Duane L; Willson, Richard C; Fox, George E
2007-04-01
Microorganisms are unavoidable in space environments and their presence has, at times, been a source of problems. Concerns about disease during human space missions are particularly important considering the significant changes the immune system incurs during spaceflight and the history of microbial contamination aboard the Mir space station. Additionally, these contaminants may have adverse effects on instrumentation and life-support systems. A sensitive, highly specific system to detect, characterize, and monitor these microbial populations is essential. Herein we describe a monitoring approach that uses 16S rRNA targeted molecular beacons to successfully detect several specific bacterial groupings. This methodology will greatly simplify in-flight monitoring by minimizing sample handling and processing. We also address and provide solutions to target accessibility problems encountered in hybridizations that target 16S rRNA.
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1990-01-01
A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.
NASA Astrophysics Data System (ADS)
Dehbozorgi, Mohammad Reza
2000-10-01
Improvements in power system reliability have always been of interest to both power companies and customers. Since there are no sizable electrical energy storage elements in electrical power systems, the generated power should match the load demand at any given time. Failure to meet this balance may cause severe system problems, including loss of generation and system blackouts. This thesis proposes a methodology which can respond to either loss of generation or loss of load. It is based on switching of electric water heaters using power system frequency as the controlling signal. The proposed methodology encounters, and the thesis has addressed, the following associated problems. The controller must be interfaced with the existing thermostat control. When necessary to switch on loads, the water in the tank should not be overheated. Rapid switching of blocks of load, or chattering, has been considered. The contributions of the thesis are: (A) A system has been proposed which makes a significant portion of the distributed loads connected to a power system to behave in a predetermined manner to improve the power system response during disturbances. (B) The action of the proposed system is transparent to the customers. (C) The thesis proposes a simple analysis for determining the amount of such loads which might be switched and relates this amount to the size of the disturbances which can occur in the utility. (D) The proposed system acts without any formal communication links, solely using the embedded information present system-wide. (E) The methodology of the thesis proposes switching of water heater loads based on a simple, localized frequency set-point controller. The thesis has identified the consequent problem of rapid switching of distributed loads, which is referred to as chattering. (F) Two approaches have been proposed to reduce chattering to tolerable levels. (G) A frequency controller has been designed and built according to the specifications required to switch electric water heater loads in response to power system disturbances. (H) A cost analysis for building and installing the distributed frequency controller has been carried out. (I) The proposed equipment and methodology has been implemented and tested successfully. (Abstract shortened by UMI.)
Aerodynamic shape optimization using preconditioned conjugate gradient methods
NASA Technical Reports Server (NTRS)
Burgreen, Greg W.; Baysal, Oktay
1993-01-01
In an effort to further improve upon the latest advancements made in aerodynamic shape optimization procedures, a systematic study is performed to examine several current solution methodologies as applied to various aspects of the optimization procedure. It is demonstrated that preconditioned conjugate gradient-like methodologies dramatically decrease the computational efforts required for such procedures. The design problem investigated is the shape optimization of the upper and lower surfaces of an initially symmetric (NACA-012) airfoil in inviscid transonic flow and at zero degree angle-of-attack. The complete surface shape is represented using a Bezier-Bernstein polynomial. The present optimization method then automatically obtains supercritical airfoil shapes over a variety of freestream Mach numbers. Furthermore, the best optimization strategy examined resulted in a factor of 8 decrease in computational time as well as a factor of 4 decrease in memory over the most efficient strategies in current use.
A dynamic multi-scale Markov model based methodology for remaining life prediction
NASA Astrophysics Data System (ADS)
Yan, Jihong; Guo, Chaozhong; Wang, Xing
2011-05-01
The ability to accurately predict the remaining life of partially degraded components is crucial in prognostics. In this paper, a performance degradation index is designed using multi-feature fusion techniques to represent deterioration severities of facilities. Based on this indicator, an improved Markov model is proposed for remaining life prediction. Fuzzy C-Means (FCM) algorithm is employed to perform state division for Markov model in order to avoid the uncertainty of state division caused by the hard division approach. Considering the influence of both historical and real time data, a dynamic prediction method is introduced into Markov model by a weighted coefficient. Multi-scale theory is employed to solve the state division problem of multi-sample prediction. Consequently, a dynamic multi-scale Markov model is constructed. An experiment is designed based on a Bently-RK4 rotor testbed to validate the dynamic multi-scale Markov model, experimental results illustrate the effectiveness of the methodology.
On regulators with a prescribed degree of stability. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ng, P. T. P.
1981-01-01
Several important aspects of the Regulator with a Prescribed Degree of Stability (RPDS) methodology and its applications are considered. The solution of the time varying RPDS problem as well as the characterization of RPDS closed loop eigenstructure properties are obtained. Based on the asymptotic behavior of RPDS root loci, a one step algorithm for designing Regulators with Prescribed Damping Ratio (RPDR) is developed. The robustness properties of RPDS are characterized in terms of the properties of the return difference and the inverse return difference matrices for the RPDS state feedback loop. This class of regulators is found to possess excellent multiloop margins with respect to stability and degree of stability properties. The ability of RPDS design to tolerate changing operating conditions and unmodelled dynamics are illustrated with a multiterminal dc/ac power system example. The output feedback realization of RPDS requires the use of Linear Quadratic Gaussian (LQG) methodology.
Feasibility and benefits of laminar flow control on supersonic cruise airplanes
NASA Technical Reports Server (NTRS)
Powell, A. G.; Agrawal, S.; Lacey, T. R.
1989-01-01
An evaluation was made of the applicability and benefits of laminar flow control (LFC) technology to supersonic cruise airplanes. Ancillary objectives were to identify the technical issues critical to supersonic LFC application, and to determine how those issues can be addressed through flight and wind-tunnel testing. Vehicle types studied include a Mach 2.2 supersonic transport configuration, a Mach 4.0 transport, and two Mach 2-class fighter concepts. Laminar flow control methodologies developed for subsonic and transonic wing laminarization were extended and applied. No intractible aerodynamic problems were found in applying LFC to airplanes of the Mach 2 class, even ones of large size. Improvements of 12 to 17 percent in lift-drag ratios were found. Several key technical issues, such as contamination avoidance and excresence criteria were identified. Recommendations are made for their resolution. A need for an inverse supersonic wing design methodology is indicated.
Louzao, Iria; Koch, Britta; Taresco, Vincenzo; Ruiz-Cantu, Laura; Irvine, Derek J; Roberts, Clive J; Tuck, Christopher; Alexander, Cameron; Hague, Richard; Wildman, Ricky; Alexander, Morgan R
2018-02-28
A robust methodology is presented to identify novel biomaterials suitable for three-dimensional (3D) printing. Currently, the application of additive manufacturing is limited by the availability of functional inks, especially in the area of biomaterials; this is the first time when this method is used to tackle this problem, allowing hundreds of formulations to be readily assessed. Several functional properties, including the release of an antidepressive drug (paroxetine), cytotoxicity, and printability, are screened for 253 new ink formulations in a high-throughput format as well as mechanical properties. The selected candidates with the desirable properties are successfully scaled up using 3D printing into a range of object architectures. A full drug release study and degradability and tensile modulus experiments are presented on a simple architecture to validating the suitability of this methodology to identify printable inks for 3D printing devices with bespoke properties.
Dating Violence Prevention Programming: Directions for Future Interventions
Shorey, Ryan C.; Zucosky, Heather; Brasfield, Hope; Febres, Jeniimarie; Cornelius, Tara L.; Sage, Chelsea; Stuart, Gregory L.
2012-01-01
Dating violence among college students is a widespread and destructive problem. The field of dating violence has seen a substantial rise in research over the past several years, which has improved our understanding of factors that increase risk for perpetration. Unfortunately, there has been less attention paid to dating violence prevention programming, and existing programs have been marred with methodological weaknesses and a lack of demonstrated effectiveness in reducing aggression. In hopes of sparking new research on dating violence prevention programs, the current review examines possible new avenues for dating violence prevention programming among college students. We discuss clinical interventions that have shown to be effective in reducing a number of problematic behaviors, including motivational interventions, dialectical behavior therapy, mindfulness, and bystander interventions, and how they could be applied to dating violence prevention. We also discuss methodological issues to consider when implementing dating violence prevention programs. PMID:22773916
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janjusic, Tommy; Kartsaklis, Christos
Memory scalability is an enduring problem and bottleneck that plagues many parallel codes. Parallel codes designed for High Performance Systems are typically designed over the span of several, and in some instances 10+, years. As a result, optimization practices which were appropriate for earlier systems may no longer be valid and thus require careful optimization consideration. Specifically, parallel codes whose memory footprint is a function of their scalability must be carefully considered for future exa-scale systems. In this paper we present a methodology and tool to study the memory scalability of parallel codes. Using our methodology we evaluate an applicationmore » s memory footprint as a function of scalability, which we coined memory efficiency, and describe our results. In particular, using our in-house tools we can pinpoint the specific application components which contribute to the application s overall memory foot-print (application data- structures, libraries, etc.).« less
[Critical of the additive model of the randomized controlled trial].
Boussageon, Rémy; Gueyffier, François; Bejan-Angoulvant, Theodora; Felden-Dominiak, Géraldine
2008-01-01
Randomized, double-blind, placebo-controlled clinical trials are currently the best way to demonstrate the clinical effectiveness of drugs. Its methodology relies on the method of difference (John Stuart Mill), through which the observed difference between two groups (drug vs placebo) can be attributed to the pharmacological effect of the drug being tested. However, this additive model can be questioned in the event of statistical interactions between the pharmacological and the placebo effects. Evidence in different domains has shown that the placebo effect can influence the effect of the active principle. This article evaluates the methodological, clinical and epistemological consequences of this phenomenon. Topics treated include extrapolating results, accounting for heterogeneous results, demonstrating the existence of several factors in the placebo effect, the necessity to take these factors into account for given symptoms or pathologies, as well as the problem of the "specific" effect.
NASA Astrophysics Data System (ADS)
Eliçabe, Guillermo E.
2013-09-01
In this work, an exact scattering model for a system of clusters of spherical particles, based on the Rayleigh-Gans approximation, has been parameterized in such a way that it can be solved in inverse form using Thikhonov Regularization to obtain the morphological parameters of the clusters. That is to say, the average number of particles per cluster, the size of the primary spherical units that form the cluster, and the Discrete Distance Distribution Function from which the z-average square radius of gyration of the system of clusters is obtained. The methodology is validated through a series of simulated and experimental examples of x-ray and light scattering that show that the proposed methodology works satisfactorily in unideal situations such as: presence of error in the measurements, presence of error in the model, and several types of unideallities present in the experimental cases.
Advanced Design Methodology for Robust Aircraft Sizing and Synthesis
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
1997-01-01
Contract efforts are focused on refining the Robust Design Methodology for Conceptual Aircraft Design. Robust Design Simulation (RDS) was developed earlier as a potential solution to the need to do rapid trade-offs while accounting for risk, conflict, and uncertainty. The core of the simulation revolved around Response Surface Equations as approximations of bounded design spaces. An ongoing investigation is concerned with the advantages of using Neural Networks in conceptual design. Thought was also given to the development of systematic way to choose or create a baseline configuration based on specific mission requirements. Expert system was developed, which selects aerodynamics, performance and weights model from several configurations based on the user's mission requirements for subsonic civil transport. The research has also resulted in a step-by-step illustration on how to use the AMV method for distribution generation and the search for robust design solutions to multivariate constrained problems.
The Beliefs of Teachers and Daycare Staff regarding Children of Divorce: A Q Methodological Study
ERIC Educational Resources Information Center
Overland, Klara; Thorsen, Arlene Arstad; Storksen, Ingunn
2012-01-01
This Q methodological study explores beliefs of daycare staff and teachers regarding young children's reactions related to divorce. The Q factor analysis resulted in two viewpoints. Participants on the viewpoint "Child problems" believe that children show various emotional and behavioral problems related to divorce, while those on the "Structure…
The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.
ERIC Educational Resources Information Center
Filinov, Nikolay B.; Ruchkina, Svetlana
2002-01-01
The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…
Integration of PBL Methodologies into Online Learning Courses and Programs
ERIC Educational Resources Information Center
van Oostveen, Roland; Childs, Elizabeth; Flynn, Kathleen; Clarkson, Jessica
2014-01-01
Problem-based learning (PBL) challenges traditional views of teaching and learning as the learner determines, to a large extent with support from a skilled facilitator, what topics will be explored, to what depth and which processes will be used. This paper presents the implementation of problem-based learning methodologies in an online Bachelor's…
A Novel Performance Evaluation Methodology for Single-Target Trackers.
Kristan, Matej; Matas, Jiri; Leonardis, Ales; Vojir, Tomas; Pflugfelder, Roman; Fernandez, Gustavo; Nebehay, Georg; Porikli, Fatih; Cehovin, Luka
2016-11-01
This paper addresses the problem of single-target tracker performance evaluation. We consider the performance measures, the dataset and the evaluation system to be the most important components of tracker evaluation and propose requirements for each of them. The requirements are the basis of a new evaluation methodology that aims at a simple and easily interpretable tracker comparison. The ranking-based methodology addresses tracker equivalence in terms of statistical significance and practical differences. A fully-annotated dataset with per-frame annotations with several visual attributes is introduced. The diversity of its visual properties is maximized in a novel way by clustering a large number of videos according to their visual attributes. This makes it the most sophistically constructed and annotated dataset to date. A multi-platform evaluation system allowing easy integration of third-party trackers is presented as well. The proposed evaluation methodology was tested on the VOT2014 challenge on the new dataset and 38 trackers, making it the largest benchmark to date. Most of the tested trackers are indeed state-of-the-art since they outperform the standard baselines, resulting in a highly-challenging benchmark. An exhaustive analysis of the dataset from the perspective of tracking difficulty is carried out. To facilitate tracker comparison a new performance visualization technique is proposed.
2013-01-01
In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick 750 mm × 750 mm reinforced concrete slab specimens were investigated. Potential E corr and concrete resistivity ρ in each point of the applied grid were measured. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures. PMID:23766706
Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C
2018-01-01
Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Introducing soft systems methodology plus (SSM+): why we need it and what it can contribute.
Braithwaite, Jeffrey; Hindle, Don; Iedema, Rick; Westbrook, Johanna I
2002-01-01
There are many complicated and seemingly intractable problems in the health care sector. Past ways to address them have involved political responses, economic restructuring, biomedical and scientific studies, and managerialist or business-oriented tools. Few methods have enabled us to develop a systematic response to problems. Our version of soft systems methodology, SSM+, seems to improve problem solving processes by providing an iterative, staged framework that emphasises collaborative learning and systems redesign involving both technical and cultural fixes.
Aramendía, Maite; Rello, Luis; Vanhaecke, Frank; Resano, Martín
2012-10-16
Collection of biological fluids on clinical filter papers shows important advantages from a logistic point of view, although analysis of these specimens is far from straightforward. Concerning urine analysis, and particularly when direct trace elemental analysis by laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) is aimed at, several problems arise, such as lack of sensitivity or different distribution of the analytes on the filter paper, rendering obtaining reliable quantitative results quite difficult. In this paper, a novel approach for urine collection is proposed, which circumvents many of these problems. This methodology consists on the use of precut filter paper discs where large amounts of sample can be retained upon a single deposition. This provides higher amounts of the target analytes and, thus, sufficient sensitivity, and allows addition of an adequate internal standard at the clinical lab prior to analysis, therefore making it suitable for a strategy based on unsupervised sample collection and ulterior analysis at referral centers. On the basis of this sampling methodology, an analytical method was developed for the direct determination of several elements in urine (Be, Bi, Cd, Co, Cu, Ni, Sb, Sn, Tl, Pb, and V) at the low μg L(-1) level by means of LA-ICPMS. The method developed provides good results in terms of accuracy and LODs (≤1 μg L(-1) for most of the analytes tested), with a precision in the range of 15%, fit-for-purpose for clinical control analysis.
NASA Astrophysics Data System (ADS)
Campanelli, Monica; Mascitelli, Alessandra; Sanò, Paolo; Diémoz, Henri; Estellés, Victor; Federico, Stefano; Iannarelli, Anna Maria; Fratarcangeli, Francesca; Mazzoni, Augusto; Realini, Eugenio; Crespi, Mattia; Bock, Olivier; Martínez-Lozano, Jose A.; Dietrich, Stefano
2018-01-01
The estimation of the precipitable water vapour content (W) with high temporal and spatial resolution is of great interest to both meteorological and climatological studies. Several methodologies based on remote sensing techniques have been recently developed in order to obtain accurate and frequent measurements of this atmospheric parameter. Among them, the relative low cost and easy deployment of sun-sky radiometers, or sun photometers, operating in several international networks, allowed the development of automatic estimations of W from these instruments with high temporal resolution. However, the great problem of this methodology is the estimation of the sun-photometric calibration parameters. The objective of this paper is to validate a new methodology based on the hypothesis that the calibration parameters characterizing the atmospheric transmittance at 940 nm are dependent on vertical profiles of temperature, air pressure and moisture typical of each measurement site. To obtain the calibration parameters some simultaneously seasonal measurements of W, from independent sources, taken over a large range of solar zenith angle and covering a wide range of W, are needed. In this work yearly GNSS/GPS datasets were used for obtaining a table of photometric calibration constants and the methodology was applied and validated in three European ESR-SKYNET network sites, characterized by different atmospheric and climatic conditions: Rome, Valencia and Aosta. Results were validated against the GNSS/GPS and AErosol RObotic NETwork (AERONET) W estimations. In both the validations the agreement was very high, with a percentage RMSD of about 6, 13 and 8 % in the case of GPS intercomparison at Rome, Aosta and Valencia, respectively, and of 8 % in the case of AERONET comparison in Valencia. Analysing the results by W classes, the present methodology was found to clearly improve W estimation at low W content when compared against AERONET in terms of % bias, bringing the agreement with the GPS (considered the reference one) from a % bias of 5.76 to 0.52.
Life after critical illness: an overview.
Rattray, Janice
2014-03-01
To illustrate the potential physical and psychological problems faced by patients after an episode of critical illness, highlight some of the interventions that have been tested and identify areas for future research. Recovery from critical illness is an international problem and as an issue is likely to increase. For some, recovery from critical illness is prolonged, subject to physical and psychological problems that may negatively impact upon health-related quality of life. The literature accessed for this review includes the work of a number of key researchers in the field of critical care research. These were identified from a number of sources include (1) personal knowledge of the research field accumulated over the last decade and (2) using the search engine 'The Knowledge Network Scotland'. Fatigue and weakness are significant problems for critical care survivors and are common in patients who have been in ICU for more than one week. Psychological problems include anxiety, depression, post-traumatic stress, delirium and cognitive impairment. Prevalence of these problems is difficult to establish for a number of methodological reasons that include the use of self-report questionnaires, the number of different questionnaires used and the variation in administration and timing. Certain subgroups of ICU survivors especially those at the more severe end of the illness severity spectrum are more at risk and this has been demonstrated for both physical and psychological problems. Findings from international studies of a range of potential interventions are presented. However, establishing effectiveness for most of these still has to be empirically demonstrated. What seems clear is the need for a co-ordinated, multidisciplinary, designated recovery and rehabilitation pathway that begins as soon as the patient is admitted into an intensive care unit. © 2013 John Wiley & Sons Ltd.
Agile methodology selection criteria: IT start-up case study
NASA Astrophysics Data System (ADS)
Micic, Lj
2017-05-01
Project management in modern IT companies is often based on agile methodologies which have several advantages compared to traditional methodologies such is waterfall. Having in mind that clients sometimes change project during development it is crucial for an IT company to choose carefully which methodology is going to implement and is it going to be mostly based on one or is it going got be combination of several. There are several modern and often used methodologies but among those Scrum, Kanban and XP programming are usually the most common. Sometimes companies use mostly tools and procedures from one but quite often they use some of the combination of those methodologies. Having in mind that those methodologies are just a framework they allow companies to adapt it for their specific projects as well as for other limitations. These methodologies are in limited usage Bosnia but more and more IT companies are starting to use agile methodologies because it is practice and common not just for their clients abroad but also starting to be the only option in order to deliver quality product on time. However it is always challenging which methodology or combination of several companies should implement and how to connect it to its own project, organizational framework and HR management. This paper presents one case study based on local IT start up and delivers solution based on theoretical framework and practical limitations that case company has.
Antecedents of narcotic use and addiction. A study of 898 Vietnam veterans.
Helzer, J E; Robins, L N; Davis, D H
1976-02-01
Previous studies of predictors of narcotic abuse have been retrospective and based on samples of long-term addicts obtained from legal or medical channels. There are several methodological problems in this approach. The present study is an attempt to test certain alleged predictors of narcotic use in a cohort of 898 Vietnam veterans. The design overcomes several of the methodological weaknesses of previous studies. Eight variables which have been reported as predictors of drug use or addiction in the drug literature were inquired about during a personal interview which included the premilitary life of each subject. The antecedent variables were socioeconomic background, inner city residence, psychiatric illness, broken home, race, employment history, education and antisocial history. Using information obtained from interviews and military records, we then tested the predictive value of each of these antecedents by comparing narcotic used and addiction in Vietman and use after Vietnam in men differing with respect to each antecedent. Results indicate that some of the variables were very poor, and others very good predictors of the various levels of narcotic involvement. The predictive value and overall importance of each of the variables we tested are discussed.
Pathophysiology of primary burning mouth syndrome with special focus on taste dysfunction: a review.
Kolkka-Palomaa, M; Jääskeläinen, S K; Laine, M A; Teerijoki-Oksa, T; Sandell, M; Forssell, H
2015-11-01
Primary burning mouth syndrome (BMS) is a chronic oral condition characterized by burning pain often accompanied with taste dysfunction and xerostomia. The most compelling evidence concerning BMS pathophysiology comes from studies on the somatosensory system using neurophysiologic or psychophysical methods such as blink reflex, thermal quantitative sensory testing, as well as functional brain imaging. They have provided convincing evidence for neuropathic involvement at several levels of the somatosensory system in BMS pain pathophysiology. The number of taste function studies trying to substantiate the subjective taste disturbances or studies on salivary factors in BMS is much more limited, and most of them suffer from definitional and methodological problems. This review aims to critically evaluate the existing literature on the pathophysiology of BMS, paying special attention to the correctness of case selection and the methodology used in published studies, and to summarize the current state of knowledge. Based on the recognition of several gaps in the current understanding of the pathophysiology of BMS especially as regards taste and pain system interactions, the review ends with future scenarios for research in this area. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Seraglia, Bruno; Gamberini, Luciano; Priftis, Konstantinos; Scatturin, Pietro; Martinelli, Massimiliano; Cutini, Simone
2011-01-01
For over two decades Virtual Reality (VR) has been used as a useful tool in several fields, from medical and psychological treatments, to industrial and military applications. Only in recent years researchers have begun to study the neural correlates that subtend VR experiences. Even if the functional Magnetic Resonance Imaging (fMRI) is the most common and used technique, it suffers several limitations and problems. Here we present a methodology that involves the use of a new and growing brain imaging technique, functional Near-infrared Spectroscopy (fNIRS), while participants experience immersive VR. In order to allow a proper fNIRS probe application, a custom-made VR helmet was created. To test the adapted helmet, a virtual version of the line bisection task was used. Participants could bisect the lines in a virtual peripersonal or extrapersonal space, through the manipulation of a Nintendo Wiimote ® controller in order for the participants to move a virtual laser pointer. Although no neural correlates of the dissociation between peripersonal and extrapersonal space were found, a significant hemodynamic activity with respect to the baseline was present in the right parietal and occipital areas. Both advantages and disadvantages of the presented methodology are discussed.
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
Park, Bongki; Noh, Hyeonseok; Choi, Dong-Jun
2018-06-01
Xerostomia (dry mouth) causes many clinical problems, including oral infections, speech difficulties, and impaired chewing and swallowing of food. Many cancer patients have complained of xerostomia induced by cancer therapy. The aim of this systematic review is to assess the efficacy of herbal medicine for the treatment of xerostomia in cancer patients. Randomized controlled trials investigating the use of herbal medicines to treat xerostomia in cancer patients were included. We searched the following 12 databases without restrictions on time or language. The risk of bias was assessed using the Cochrane Risk of Bias Tool. Twenty-five randomized controlled trials involving 1586 patients met the inclusion criteria. A total of 24 formulas were examined in the included trials. Most of the included trials were insufficiently reported in the methodology section. Five formulas were shown to significantly improve the salivary flow rate compared to comparators. Regarding the grade of xerostomia, all formulas with the exception of a Dark Plum gargle solution with normal saline were significantly effective in reducing the severity of dry mouth. Adverse events were reported in 4 trials, and adverse effects of herbal medicine were reported in 3 trials. We found herbal medicines had potential benefits for improving salivary function and reducing the severity of dry mouth in cancer patients. However, methodological limitations and a relatively small sample size reduced the strength of the evidence. More high-quality trials reporting sufficient methodological data are warranted to enforce the strength of evidence regarding the effectiveness of herbal medicines.
William H. Cooke; Dennis M. Jacobs
2002-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land useiland cover information....
Capturing security requirements for software systems.
El-Hadary, Hassan; El-Kassas, Sherif
2014-07-01
Security is often an afterthought during software development. Realizing security early, especially in the requirement phase, is important so that security problems can be tackled early enough before going further in the process and avoid rework. A more effective approach for security requirement engineering is needed to provide a more systematic way for eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers elicit adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on the problem frames, is constructed in order to help identifying security requirements with the aid of previous security knowledge. Abuse frames are used to model threats while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements concentrating on conflicts identification among requirements. We have shown that more complete security requirements can be elicited by such methodology in addition to the assistance offered to developers to elicit security requirements in a more systematic way.
Capturing security requirements for software systems
El-Hadary, Hassan; El-Kassas, Sherif
2014-01-01
Security is often an afterthought during software development. Realizing security early, especially in the requirement phase, is important so that security problems can be tackled early enough before going further in the process and avoid rework. A more effective approach for security requirement engineering is needed to provide a more systematic way for eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers elicit adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on the problem frames, is constructed in order to help identifying security requirements with the aid of previous security knowledge. Abuse frames are used to model threats while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements concentrating on conflicts identification among requirements. We have shown that more complete security requirements can be elicited by such methodology in addition to the assistance offered to developers to elicit security requirements in a more systematic way. PMID:25685514
Roca, Judith; Reguant, Mercedes; Canet, Olga
2016-11-01
Teaching strategies are essential in order to facilitate meaningful learning and the development of high-level thinking skills in students. To compare three teaching methodologies (problem-based learning, case-based teaching and traditional methods) in terms of the learning outcomes achieved by nursing students. This quasi-experimental research was carried out in the Nursing Degree programme in a group of 74 students who explored the subject of The Oncology Patient through the aforementioned strategies. A performance test was applied based on Bloom's Revised Taxonomy. A significant correlation was found between the intragroup theoretical and theoretical-practical dimensions. Likewise, intergroup differences were related to each teaching methodology. Hence, significant differences were estimated between the traditional methodology (x-=9.13), case-based teaching (x-=12.96) and problem-based learning (x-=14.84). Problem-based learning was shown to be the most successful learning method, followed by case-based teaching and the traditional methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.
Robust Design Optimization via Failure Domain Bounding
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2007-01-01
This paper extends and applies the strategies recently developed by the authors for handling constraints under uncertainty to robust design optimization. For the scope of this paper, robust optimization is a methodology aimed at problems for which some parameters are uncertain and are only known to belong to some uncertainty set. This set can be described by either a deterministic or a probabilistic model. In the methodology developed herein, optimization-based strategies are used to bound the constraint violation region using hyper-spheres and hyper-rectangles. By comparing the resulting bounding sets with any given uncertainty model, it can be determined whether the constraints are satisfied for all members of the uncertainty model (i.e., constraints are feasible) or not (i.e., constraints are infeasible). If constraints are infeasible and a probabilistic uncertainty model is available, upper bounds to the probability of constraint violation can be efficiently calculated. The tools developed enable approximating not only the set of designs that make the constraints feasible but also, when required, the set of designs for which the probability of constraint violation is below a prescribed admissible value. When constraint feasibility is possible, several design criteria can be used to shape the uncertainty model of performance metrics of interest. Worst-case, least-second-moment, and reliability-based design criteria are considered herein. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, these strategies are easily applicable to a broad range of engineering problems.
Ensemble-based data assimilation and optimal sensor placement for scalar source reconstruction
NASA Astrophysics Data System (ADS)
Mons, Vincent; Wang, Qi; Zaki, Tamer
2017-11-01
Reconstructing the characteristics of a scalar source from limited remote measurements in a turbulent flow is a problem of great interest for environmental monitoring, and is challenging due to several aspects. Firstly, the numerical estimation of the scalar dispersion in a turbulent flow requires significant computational resources. Secondly, in actual practice, only a limited number of observations are available, which generally makes the corresponding inverse problem ill-posed. Ensemble-based variational data assimilation techniques are adopted to solve the problem of scalar source localization in a turbulent channel flow at Reτ = 180 . This approach combines the components of variational data assimilation and ensemble Kalman filtering, and inherits the robustness from the former and the ease of implementation from the latter. An ensemble-based methodology for optimal sensor placement is also proposed in order to improve the condition of the inverse problem, which enhances the performances of the data assimilation scheme. This work has been partially funded by the Office of Naval Research (Grant N00014-16-1-2542) and by the National Science Foundation (Grant 1461870).
Mental Health Problems in Children with Prader-Willi Syndrome
Skokauskas, Norbert; Sweeny, Eileen; Meehan, Judith; Gallagher, Louise
2012-01-01
Background: Prader-Willi Syndrome (PWS) is a genetically determined neurodevelopmental disorder, which occurs in approximately one in 22000 births. Aims: This study aimed to investigate psychiatric characteristics of children diagnosed with PWS compared with an age-, gender- and IQ-matched control group. The parents of children with PWS were assessed for psychological distress in comparison to the parents of the control group. Methodological limitations identified in previous studies were addressed in the present study. Methods: Psychiatric problems were evaluated in a sample of children with genetically confirmed PWS and an age- and IQ-matched control group using the Child Behaviour Checklist 6–18. Parental psychological distress for both groups was evaluated with the Brief Symptom Inventory. Results: Children with PWS had more severe somatic, social, and thought problems, and were more withdrawn-depressed in comparison to controls. Borderline difficulties were detected for the affective, somatic, and attention deficit-hyperactivity CBCL DSM-orientated subscales in the PWS group. Parents of PWS children, in comparison to controls, had more somatization, phobic anxiety, obsessive-compulsive, and anxiety problems. Conclusions: PWS represents a complex psychological disorder with multiple areas of disturbances. PMID:22876265
NASA Technical Reports Server (NTRS)
Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun
1994-01-01
A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.
Problems related to the integration of fault tolerant aircraft electronic systems
NASA Technical Reports Server (NTRS)
Bannister, J. A.; Adlakha, V.; Triyedi, K.; Alspaugh, T. A., Jr.
1982-01-01
Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described. Specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of inegrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.
Meaning and Problems of Planning
ERIC Educational Resources Information Center
Brieve, Fred J.; Johnston, A. P.
1973-01-01
Examines the educational planning process. Discusses what planning is, how methodological planning can work in education, misunderstandings about planning, and difficulties in applying the planning methodology. (DN)
NASA Technical Reports Server (NTRS)
Farr, Rebecca A.; Chang, Chau-Lyan.; Jones, Jess H.; Dougherty, N. Sam
2015-01-01
The authors provide a brief overview of the classic tonal screech noise problem created by underexpanded supersonic jets, briefly describing the fluid dynamic-acoustics feedback mechanism that has been long established as the basis for this well-known aeroacoustics problem. This is followed by a description of the Long Penetration Mode (LPM) supersonic underexpanded counterflowing jet phenomenon which has been demonstrated in several wind tunnel tests and modeled in several computational fluid dynamics (CFD) simulations. The authors provide evidence from test and CFD analysis of LPM that indicates that acoustics feedback and fluid interaction seen in LPM are analogous to the aeroacoustics interactions seen in screech jets. Finally, the authors propose applying certain methodologies to LPM which have been developed and successfully demonstrated in the study of screech jets and mechanically induced excitation in fluid oscillators for decades. The authors conclude that the large body of work done on jet screech, other aeroacoustic phenomena, and fluid oscillators can have direct application to the study and applications of LPM counterflowing supersonic cold flow jets.
Design risk assessment for burst-prone mines: Application in a Canadian mine
NASA Astrophysics Data System (ADS)
Cheung, David J.
A proactive stance towards improving the effectiveness and consistency of risk assessments has been adopted recently by mining companies and industry. The next 10-20 years forecasts that ore deposits accessible using shallow mining techniques will diminish. The industry continues to strive for success in "deeper" mining projects in order to keep up with the continuing demand for raw materials. Although the returns are quite profitable, many projects have been sidelined due to high uncertainty and technical risk in the mining of the mineral deposit. Several hardrock mines have faced rockbursting and seismicity problems. Within those reported, mines in countries like South Africa, Australia and Canada have documented cases of severe rockburst conditions attributed to the mining depth. Severe rockburst conditions known as "burst-prone" can be effectively managed with design. Adopting a more robust design can ameliorate the exposure of workers and equipment to adverse conditions and minimize the economic consequences, which can hinder the bottom line of an operation. This thesis presents a methodology created for assessing the design risk in burst-prone mines. The methodology includes an evaluation of relative risk ratings for scenarios with options of risk reduction through several design principles. With rockbursts being a hazard of seismic events, the methodology is based on research in the area of mining seismicity factoring in rockmass failure mechanisms, which results from a combination of mining induced stress, geological structures, rockmass properties and mining influences. The methodology was applied to case studies at Craig Mine of Xstrata Nickel in Sudbury, Ontario, which is known to contain seismically active fault zones. A customized risk assessment was created and applied to rockburst case studies, evaluating the seismic vulnerability and consequence for each case. Application of the methodology to Craig Mine demonstrates that changes in the design can reduce both exposure risk (personnel and equipment), and economical risk (revenue and costs). Fatal and catastrophic consequences can be averted through robust planning and design. Two customized approaches were developed to conduct risk assessment of case studies at Craig Mine. Firstly, the Brownfield Approach utilizes the seismic database to determine the seismic hazard from a rating system that evaluates frequency-magnitude, event size, and event-blast relation. Secondly, the Greenfield Approach utilizes the seismic database, focusing on larger magnitude events, rocktype, and geological structure. The customized Greenfield Approach can also be applied in the evaluation of design risk in deep mines with the same setting and condition as Craig Mine. Other mines with different settings and conditions can apply the principles in the methodology to evaluate design alternatives and risk reduction strategies for burst-prone mines.
School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.
ERIC Educational Resources Information Center
Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others
1998-01-01
Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…
Bartolazzi, Armando; Bellotti, Carlo; Sciacchitano, Salvatore
2012-01-01
In the last decade, the β-galactosyl binding protein galectin-3 has been the object of extensive molecular, structural, and functional studies aimed to clarify its biological role in cancer. Multicenter studies also contributed to discover the potential clinical value of galectin-3 expression analysis in distinguishing, preoperatively, benign from malignant thyroid nodules. As a consequence galectin-3 is receiving significant attention as tumor marker for thyroid cancer diagnosis, but some conflicting results mostly owing to methodological problems have been published. The possibility to apply preoperatively a reliable galectin-3 test method on fine needle aspiration biopsy (FNA)-derived thyroid cells represents an important achievement. When correctly applied, the method reduces consistently the gray area of thyroid FNA cytology, contributing to avoid unnecessary thyroid surgery. Although the efficacy and reliability of the galectin-3 test method have been extensively proved in several studies, its translation in the clinical setting requires well-standardized reagents and procedures. After a decade of experimental work on galectin-3-related basic and translational research projects, the major methodological problems that may potentially impair the diagnostic performance of galectin-3 immunotargeting are highlighted and discussed in detail. A standardized protocol for a reliable galectin-3 expression analysis is finally provided. The aim of this contribution is to improve the clinical management of patients with thyroid nodules, promoting the preoperative use of a reliable galectin-3 test method as ancillary technique to conventional thyroid FNA cytology. The final goal is to decrease unnecessary thyroid surgery and its related social costs.
Use of Invariant Manifolds for Transfers Between Three-Body Systems
NASA Technical Reports Server (NTRS)
Beckman, Mark; Howell, Kathleen
2003-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits does not exist. This paper presents the initial approaches to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing 7-dimensional invariant manifold data are presented. Some particular solutions are presented for the transfer problem, though the emphasis is on developing methodology for solving the general problem.
Representations of Invariant Manifolds for Applications in Three-Body Systems
NASA Technical Reports Server (NTRS)
Howell, K.; Beckman, M.; Patterson, C.; Folta, D.
2004-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits is currently being studied. This paper presents an initial approach to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing the invariant manifold data are presented. Some particular solutions are presented for two types of transfer problems, though the emphasis is on developing the methodology for solving the general problem.
How to Select a Questionnaire with a Good Methodological Quality?
Paiva, Saul Martins; Perazzo, Matheus de França; Ortiz, Fernanda Ruffo; Pordeus, Isabela Almeida; Martins-Júnior, Paulo Antônio
2018-01-01
In the last decades, several instruments have been used to evaluate the impact of oral health problems on the oral health-related quality of life (OHRQoL) of individuals. However, some instruments lack thorough methodological validation or present conceptual differences that hinder comparisons with instruments. Thus, it can be difficult to clinicians and researchers to select a questionnaire that accurately reflect what are really meaningful to individuals. This short communication aimed to discuss the importance of use an appropriate checklist to select an instrument with a good methodological quality. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was developed to provide tools for evidence-based instrument selection. The COSMIN checklist comprises ten boxes that evaluate whether a study meets the standard for good methodological quality and two additional boxes to meet studies that use the Item Response Theory method and general requirements for results generalization, resulting in four steps to be followed. In this way, it is required at least some expertise in psychometrics or clinimetrics to a wide-ranging use of this checklist. The COSMIN applications include its use to ensure the standardization of cross-cultural adaptations and safer comparisons between measurement studies and evaluation of methodological quality of systematic reviews of measurement properties. Also, it can be used by students when training about measurement properties and by editors and reviewers when revising manuscripts on this topic. The popularization of COSMIN checklist is therefore necessary to improve the selection and evaluation of health measurement instruments.
Wrinkle-free design of thin membrane structures using stress-based topology optimization
NASA Astrophysics Data System (ADS)
Luo, Yangjun; Xing, Jian; Niu, Yanzhuang; Li, Ming; Kang, Zhan
2017-05-01
Thin membrane structures would experience wrinkling due to local buckling deformation when compressive stresses are induced in some regions. Using the stress criterion for membranes in wrinkled and taut states, this paper proposed a new stress-based topology optimization methodology to seek the optimal wrinkle-free design of macro-scale thin membrane structures under stretching. Based on the continuum model and linearly elastic assumption in the taut state, the optimization problem is defined as to maximize the structural stiffness under membrane area and principal stress constraints. In order to make the problem computationally tractable, the stress constraints are reformulated into equivalent ones and relaxed by a cosine-type relaxation scheme. The reformulated optimization problem is solved by a standard gradient-based algorithm with the adjoint-variable sensitivity analysis. Several examples with post-bulking simulations and experimental tests are given to demonstrate the effectiveness of the proposed optimization model for eliminating stress-related wrinkles in the novel design of thin membrane structures.
Sexual behavior and its correlates after traumatic brain injury.
Turner, Daniel; Schöttle, Daniel; Krueger, Richard; Briken, Peer
2015-03-01
Traumatic brain injury (TBI) is one of the leading causes of permanent disability in young adults and is frequently accompanied by changes in sexual behaviors. Satisfying sexuality is an important factor for overall quality of life in people with disabilities. The purpose of this article is to review the studies evaluating the assessment, correlates and management of sexuality following TBI. The Brain Injury Questionnaire of Sexuality is the first validated questionnaire specifically developed for adults with TBI. A considerable amount of individuals with TBI show inappropriate sexual behaviors and sexual dysfunctions. Whereas inappropriate sexual behaviors are related to younger age, less social participation and more severe injuries, sexual dysfunctions show an association with higher fatigue, higher depression scores, less self-esteem and female sex. Healthcare professionals have suggested that because of discomfort at the individual or institutional level, sexual problems are often not sufficiently addressed and have suggested that a specialist should treat sexual problems. Although some important correlates of sexual problems could be identified, methodological differences across studies limit their comparability. Furthermore, there is an absence of evidence-based treatment strategies for addressing sexual problems. Therapeutic efforts should take into account the identified correlates of sexual problems following TBI.
A Scientific Approach to the Investigation on Anomalous Atmospheric Light Phenomena
NASA Astrophysics Data System (ADS)
Teodorani, M.
2011-12-01
Anomalous atmospheric light phenomena tend to occur recurrently in several places of our planet. Statistical studies show that a phenomenon's real recurrence area can be identified only after pondering reported cases on the population number and on the diffusion of communication media. The main scientific results that have been obtained so far after explorative instrumented missions have been carried out are presented, including the empirical models that have been set up in order to describe the observed reality. Subsequently, a focused theorization is discussed in order to attack the physical problem concerning the structure and the dynamics of "light balls" and the enigma related to the central force that maintains them in spherical shape. Finally, several important issues are discussed regarding methodology, strategy, tactics and interdisciplinary approaches.
Application of the boundary element method to the micromechanical analysis of composite materials
NASA Technical Reports Server (NTRS)
Goldberg, R. K.; Hopkins, D. A.
1995-01-01
A new boundary element formulation for the micromechanical analysis of composite materials is presented in this study. A unique feature of the formulation is the use of circular shape functions to convert the two-dimensional integrations of the composite fibers to one-dimensional integrations. To demonstrate the applicability of the formulations, several example problems including elastic and thermal analysis of laminated composites and elastic analyses of woven composites are presented and the boundary element results compared to experimental observations and/or results obtained through alternate analytical procedures. While several issues remain to be addressed in order to make the methodology more robust, the formulations presented here show the potential in providing an alternative to traditional finite element methods, particularly for complex composite architectures.
Transport mechanisms in Schottky diodes realized on GaN
NASA Astrophysics Data System (ADS)
Amor, Sarrah; Ahaitouf, Ali; Ahaitouf, Abdelaziz; Salvestrini, Jean Paul; Ougazzaden, Abdellah
2017-03-01
This work is focused on the conducted transport mechanisms involved on devices based in gallium nitride GaN and its alloys. With considering all conduction mechanisms of current, its possible to understanded these transport phenomena. Thanks to this methodology the current-voltage characteristics of structures with unusual behaviour are further understood and explain. Actually, the barrier height (SBH) is a complex problem since it depends on several parameters like the quality of the metal-semiconductor interface. This study is particularly interesting as solar cells are made on this material and their qualification is closely linked to their transport properties.
NASA Technical Reports Server (NTRS)
Barr, B. G.; Martinko, E. A.
1976-01-01
Activities of the Kansas Applied Remote Sensing Program (KARS) designed to establish interactions on cooperative projects with decision makers in Kansas agencies in the development and application of remote sensing procedures are reported. Cooperative demonstration projects undertaken with several different agencies involved three principal areas of effort: Wildlife Habitat and Environmental Analysis; Urban and Regional Analysis; Agricultural and Rural Analysis. These projects were designed to concentrate remote sensing concepts and methodologies on existing agency problems to insure the continued relevancy of the program and maximize the possibility for immediate operational use. Completed projects are briefly discussed.
Studies on young child malnutrition in Iraq: problems and insights, 1990-1999.
Garfield, R
2000-09-01
Many reports on Iraq proclaimed a rise in rates of death and disease since the Gulf War of January/February 1991. Several of the studies on nutritional status are not readily accessible, and few have been compared to identify secular trends. Here, 27 studies examining nutrition among Iraqi children in the 1990s are reviewed. Only five studies were found to be of comparable methodologic quality. These are analyzed to identify major trends in child nutrition between August 1991 and June 1999. Limitations of existing studies and recommendations for future studies are discussed.
On the detection of pornographic digital images
NASA Astrophysics Data System (ADS)
Schettini, Raimondo; Brambilla, Carla; Cusano, Claudio; Ciocca, Gianluigi
2003-06-01
The paper addresses the problem of distinguishing between pornographic and non-pornographic photographs, for the design of semantic filters for the web. Both, decision forests of trees built according to CART (Classification And Regression Trees) methodology and Support Vectors Machines (SVM), have been used to perform the classification. The photographs are described by a set of low-level features, features that can be automatically computed simply on gray-level and color representation of the image. The database used in our experiments contained 1500 photographs, 750 of which labeled as pornographic on the basis of the independent judgement of several viewers.
[Methodological problems in the scientific research on HIV /AIDS in Bolivia].
Hita, Susana Ramírez
2013-05-01
This paper discusses the methodological problems in the scientific research on HIV/AIDS in Bolivia, both in the areas of epidemiology and social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, The Pan-American Health Organization, International Cooperation, Non-Governmental Organizations and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research, that was focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and looked at how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the epidemiological and social science studies which serve as the basis for the implementation of these health programs.
Risal, Ajay; Manandhar, Kedar; Steiner, Timothy J; Holen, Are; Koju, Rajendra; Linde, Mattias
2014-08-15
Headache, anxiety and depression are major disorders of the brain in terms of their prevalence and the burdens and costs they impose on society. Nationwide population-based studies of these disorders are necessary to inform health policy but, in research-naïve and resource-poor countries such as Nepal, a host of methodological problems are encountered: cultural, geographic, logistic and philosophical. Expert consensus was sought among researchers from different professional and cultural backgrounds in planning and conceptualizing an epidemiological study and adapting established methods to the special situation and circumstances of Nepal. The methodological problems were sorted into different themes: study design; climate; geography, access and transport; sociocultural issues; safety of interviewers. Each of these was dealt with separately, and their inter-relationships explored, in finding solutions that were sometimes pragmatic. A cross-sectional questionnaire-based study, with teams of interviewers visiting households across the three physiographic divisions (with extremes in altitude) in each of the five development regions of the country, would enable national sampling with sociocultural representativeness. However, the study instruments and interviews would be in Nepali only. Transport and access challenges were considerable, and their solutions combined travel by air, bus, river and foot, with allowances for rain-damaged roads, collapsed bridges and cancelled scheduled flights. The monsoon would render many routes impassable, and therefore set an absolute time limitation. Engaging participants willingly in the enquiry would be the key to success, and several tactics would be employed to enhance the success of this, most importantly enlisting the support of local community volunteers in each study site. Anticipating problems in advance of investing substantial resources in a large nationwide epidemiological study in Nepal was a sensible precaution. The difficulties could be resolved or circumvented without expected compromise in scientific quality. Expert consensus was an effective means of achieving this outcome.
Kent, Angelle; Kobagi, Nadia; Huynh, Kha Tu; Clarke, Alix; Yoon, Minn N.
2017-01-01
Background Poor oral health has been a persistent problem in nursing home residents for decades, with severe consequences for residents and the health care system. Two major barriers to providing appropriate oral care are residents’ responsive behaviors to oral care and residents’ lack of ability or motivation to perform oral care on their own. Objectives To evaluate the effectiveness of strategies that nursing home care providers can apply to either prevent/overcome residents’ responsive behaviors to oral care, or enable/motivate residents to perform their own oral care. Materials and methods We searched the databases Medline, EMBASE, Evidence Based Reviews–Cochrane Central Register of Controlled Trials, CINAHL, and Web of Science for intervention studies assessing the effectiveness of eligible strategies. Two reviewers independently (a) screened titles, abstracts and retrieved full-texts; (b) searched key journal contents, key author publications, and reference lists of all included studies; and (c) assessed methodological quality of included studies. Discrepancies at any stage were resolved by consensus. We conducted a narrative synthesis of study results. Results We included three one-group pre-test, post-test studies, and one cross-sectional study. Methodological quality was low (n = 3) and low moderate (n = 1). Two studies assessed strategies to enable/motivate nursing home residents to perform their own oral care, and to studies assessed strategies to prevent or overcome responsive behaviors to oral care. All studies reported improvements of at least some of the outcomes measured, but interpretation is limited due to methodological problems. Conclusions Potentially promising strategies are available that nursing home care providers can apply to prevent/overcome residents’ responsive behaviors to oral care or to enable/motivate residents to perform their own oral care. However, studies assessing these strategies have a high risk for bias. To overcome oral health problems in nursing homes, care providers will need practical strategies whose effectiveness was assessed in robust studies. PMID:28609476
2014-01-01
Background Headache, anxiety and depression are major disorders of the brain in terms of their prevalence and the burdens and costs they impose on society. Nationwide population-based studies of these disorders are necessary to inform health policy but, in research-naïve and resource-poor countries such as Nepal, a host of methodological problems are encountered: cultural, geographic, logistic and philosophical. Methods Expert consensus was sought among researchers from different professional and cultural backgrounds in planning and conceptualizing an epidemiological study and adapting established methods to the special situation and circumstances of Nepal. Results The methodological problems were sorted into different themes: study design; climate; geography, access and transport; sociocultural issues; safety of interviewers. Each of these was dealt with separately, and their inter-relationships explored, in finding solutions that were sometimes pragmatic. A cross-sectional questionnaire-based study, with teams of interviewers visiting households across the three physiographic divisions (with extremes in altitude) in each of the five development regions of the country, would enable national sampling with sociocultural representativeness. However, the study instruments and interviews would be in Nepali only. Transport and access challenges were considerable, and their solutions combined travel by air, bus, river and foot, with allowances for rain-damaged roads, collapsed bridges and cancelled scheduled flights. The monsoon would render many routes impassable, and therefore set an absolute time limitation. Engaging participants willingly in the enquiry would be the key to success, and several tactics would be employed to enhance the success of this, most importantly enlisting the support of local community volunteers in each study site. Conclusion Anticipating problems in advance of investing substantial resources in a large nationwide epidemiological study in Nepal was a sensible precaution. The difficulties could be resolved or circumvented without expected compromise in scientific quality. Expert consensus was an effective means of achieving this outcome. PMID:25127664
Rapid Design of Gravity Assist Trajectories
NASA Technical Reports Server (NTRS)
Carrico, J.; Hooper, H. L.; Roszman, L.; Gramling, C.
1991-01-01
Several International Solar Terrestrial Physics (ISTP) missions require the design of complex gravity assisted trajectories in order to investigate the interaction of the solar wind with the Earth's magnetic field. These trajectories present a formidable trajectory design and optimization problem. The philosophy and methodology that enable an analyst to design and analyse such trajectories are discussed. The so called 'floating end point' targeting, which allows the inherently nonlinear multiple body problem to be solved with simple linear techniques, is described. The combination of floating end point targeting with analytic approximations with a Newton method targeter to achieve trajectory design goals quickly, even for the very sensitive double lunar swingby trajectories used by the ISTP missions, is demonstrated. A multiconic orbit integration scheme allows fast and accurate orbit propagation. A prototype software tool, Swingby, built for trajectory design and launch window analysis, is described.
Usage-Centered Design Approach in Design of Malaysia Sexuality Education (MSE) Courseware
NASA Astrophysics Data System (ADS)
Chan, S. L.; Jaafar, A.
The problems amongst juveniles increased every year, especially rape case of minor. Therefore, the government of Malaysia has introduced the National Sexuality Education Guideline on 2005. An early study related to the perception of teachers and students toward the sexuality education curriculum taught in secondary schools currently was carried out in 2008. The study showed that there are big gaps between the perception of the teachers and the students towards several issues of Malaysia sexuality education today. The Malaysia Sexuality Education (MSE) courseware was designed based on few learning theories approach. Then MSE was executed through a comprehensive methodology which the model ADDIE integrated with Usage-Centered Design to achieve high usability courseware. In conclusion, the effort of developing the MSE is hopefully will be a solution to the current problem that happens in Malaysia sexuality education now.
NASA Astrophysics Data System (ADS)
Arias, E.; Florez, E.; Pérez-Torres, J. F.
2017-06-01
A new algorithm for the determination of equilibrium structures suitable for metal nanoclusters is proposed. The algorithm performs a stochastic search of the minima associated with the nuclear potential energy function restricted to a sphere (similar to the Thomson problem), in order to guess configurations of the nuclear positions. Subsequently, the guessed configurations are further optimized driven by the total energy function using the conventional gradient descent method. This methodology is equivalent to using the valence shell electron pair repulsion model in guessing initial configurations in the traditional molecular quantum chemistry. The framework is illustrated in several clusters of increasing complexity: Cu7, Cu9, and Cu11 as benchmark systems, and Cu38 and Ni9 as novel systems. New equilibrium structures for Cu9, Cu11, Cu38, and Ni9 are reported.
Mining the preferences of patients for ubiquitous clinic recommendation.
Chen, Tin-Chih Toly; Chiu, Min-Chi
2018-03-06
A challenge facing all ubiquitous clinic recommendation systems is that patients often have difficulty articulating their requirements. To overcome this problem, a ubiquitous clinic recommendation mechanism was designed in this study by mining the clinic preferences of patients. Their preferences were defined using the weights in the ubiquitous clinic recommendation mechanism. An integer nonlinear programming problem was solved to tune the values of the weights on a rolling basis. In addition, since it may take a long time to adjust the values of weights to their asymptotic values, the back propagation network (BPN)-response surface method (RSM) method is applied to estimate the asymptotic values of weights. The proposed methodology was tested in a regional study. Experimental results indicated that the ubiquitous clinic recommendation system outperformed several existing methods in improving the successful recommendation rate.
Arias, E; Florez, E; Pérez-Torres, J F
2017-06-28
A new algorithm for the determination of equilibrium structures suitable for metal nanoclusters is proposed. The algorithm performs a stochastic search of the minima associated with the nuclear potential energy function restricted to a sphere (similar to the Thomson problem), in order to guess configurations of the nuclear positions. Subsequently, the guessed configurations are further optimized driven by the total energy function using the conventional gradient descent method. This methodology is equivalent to using the valence shell electron pair repulsion model in guessing initial configurations in the traditional molecular quantum chemistry. The framework is illustrated in several clusters of increasing complexity: Cu 7 , Cu 9 , and Cu 11 as benchmark systems, and Cu 38 and Ni 9 as novel systems. New equilibrium structures for Cu 9 , Cu 11 , Cu 38 , and Ni 9 are reported.
Lees-Haley, Paul R; Greiffenstein, M Frank; Larrabee, Glenn J; Manning, Edward L
2004-08-01
Recently, Kaiser (2003) raised concerns over the increase in brain damage claims reportedly due to exposure to welding fumes. In the present article, we discuss methodological problems in conducting neuropsychological research on the effects of welding exposure, using a recent paper by Bowler et al. (2003) as an example to illustrate problems common in the neurotoxicity literature. Our analysis highlights difficulties in conducting such quasi-experimental investigations, including subject selection bias, litigation effects on symptom report and neuropsychological test performance, response bias, and scientifically inadequate casual reasoning.
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
STRUCTURE METHODOLOGY Design problem solving is a complex activity involving a number of subtasks. and a number of alternative methods potentially available...Conference on Artificial Intelligence. London: The British Computer Society, pp. 621-633. Friedland, P. (1979). Knowledge-based experimental design ...Computing Milieuxl: Management of Computing and Information Systems- -ty,*m man- agement General Terms: Design . Methodology Additional Key Words and Phrases
William H. Cooke; Dennis M. Jacobs
2005-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
A systematic review of health effects of electronic cigarettes.
Pisinger, Charlotta; Døssing, Martin
2014-12-01
To provide a systematic review of the existing literature on health consequences of vaporing of electronic cigarettes (ECs). Search in: PubMed, EMBASE and CINAHL. Original publications describing a health-related topic, published before 14 August 2014. PRISMA recommendations were followed. We identified 1101 studies; 271 relevant after screening; 94 eligible. We included 76 studies investigating content of fluid/vapor of ECs, reports on adverse events and human and animal experimental studies. Serious methodological problems were identified. In 34% of the articles the authors had a conflict of interest. Studies found fine/ultrafine particles, harmful metals, carcinogenic tobacco-specific nitrosamines, volatile organic compounds, carcinogenic carbonyls (some in high but most in low/trace concentrations), cytotoxicity and changed gene expression. Of special concern are compounds not found in conventional cigarettes, e.g. propylene glycol. Experimental studies found increased airway resistance after short-term exposure. Reports on short-term adverse events were often flawed by selection bias. Due to many methodological problems, severe conflicts of interest, the relatively few and often small studies, the inconsistencies and contradictions in results, and the lack of long-term follow-up no firm conclusions can be drawn on the safety of ECs. However, they can hardly be considered harmless. Copyright © 2014. Published by Elsevier Inc.
Psychiatric aspects of induced abortion.
Stotland, Nada L
2011-08-01
Approximately one third of the women in the United States have an abortion during their lives. In the year 2008, 1.21 million abortions were performed in the United States (Jones and Koolstra, Perspect Sex Reprod Health 43:41-50, 2011). The psychiatric outcomes of abortion are scientifically well established (Adler et al., Science 248:41-43, 1990). Despite assertions to the contrary, there is no evidence that abortion causes psychiatric problems (Dagg, Am J Psychiatry 148:578-585, 1991). Those studies that report psychiatric sequelae suffer from severe methodological defects (Lagakos, N Engl J Med 354:1667-1669, 2006). Methodologically sound studies have demonstrated that there is a very low incidence of frank psychiatric illness after an abortion; women experience a wide variety of feelings over time, including, for some, transient sadness and grieving. However, the circumstances that lead a woman to terminate a pregnancy, including previous and/or ongoing psychiatric illness, are independently stressful and increase the likelihood of psychiatric illness over the already high baseline incidence and prevalence of mood and anxiety disorders among women of childbearing age. For optimal psychological outcomes, women, including adolescents, need to make autonomous and supported decisions about problem pregnancies. Clinicians can help patients facing these decisions and those who are working through feelings about having had abortions in the past.
NASA Astrophysics Data System (ADS)
Reem, Daniel; De Pierro, Alvaro
2017-04-01
Many problems in science and engineering involve, as part of their solution process, the consideration of a separable function which is the sum of two convex functions, one of them possibly non-smooth. Recently a few works have discussed inexact versions of several accelerated proximal methods aiming at solving this minimization problem. This paper shows that inexact versions of a method of Beck and Teboulle (fast iterative shrinkable tresholding algorithm) preserve, in a Hilbert space setting, the same (non-asymptotic) rate of convergence under some assumptions on the decay rate of the error terms The notion of inexactness discussed here seems to be rather simple, but, interestingly, when comparing to related works, closely related decay rates of the errors terms yield closely related convergence rates. The derivation sheds some light on the somewhat mysterious origin of some parameters which appear in various accelerated methods. A consequence of the analysis is that the accelerated method is perturbation resilient, making it suitable, in principle, for the superiorization methodology. By taking this into account, we re-examine the superiorization methodology and significantly extend its scope. This work was supported by FAPESP 2013/19504-9. The second author was supported also by CNPq grant 306030/2014-4.
Improta, Giovanni; Balato, Giovanni; Romano, Maria; Carpentieri, Francesco; Bifulco, Paolo; Alessandro Russo, Mario; Rosa, Donato; Triassi, Maria; Cesarelli, Mario
2015-08-01
In 2012, health care spending in Italy reached €114.5 billion, accounting for 7.2% of the Gross Domestic Product (GDP) and 14.2% of total public spending. Therefore, reducing waste in health facilities could generate substantial cost savings. The objective of this study is to show that Lean Six Sigma represents an appropriate methodology for the development of a clinical pathway which allows to improve quality and to reduce costs in prosthetic hip replacement surgery. The methodology used for the development of a new clinical pathway was Lean Six Sigma. Problem solving in Lean Six Sigma is the DMAIC (Define, Measure, Analyse, Improve, Control) roadmap, characterized by five operational phases which make possible to reach fixed goals through a rigorous process of defining, measuring, analysing, improving and controlling business problems. The following project indicated several variables influencing the inappropriate prolongation of the length of stay for inpatient treatment and corrective actions were performed to improve the effectiveness and efficiency of the process of care. The average length of stay was reduced from 18.9 to 10.6 days (-44%). This article shows there is no trade-off between quality and costs: Lean Six Sigma improves quality and, at the same time, reduces costs. © 2015 John Wiley & Sons, Ltd.
Science and Television Commercials: Adding Relevance to the Research Methodology Course.
ERIC Educational Resources Information Center
Solomon, Paul R.
1979-01-01
Contends that research methodology courses can be relevant to issues outside of psychology and describes a method which relates the course to consumer problems. Students use experimental methodology to test claims made in television commercials advertising deodorant, bathroom tissues, and soft drinks. (KC)
NASA Technical Reports Server (NTRS)
Smalley, Kurt B.; Tinker, Michael L.; Fischer, Richard T.
2001-01-01
This paper is written for the purpose of providing an introduction and set of guidelines for the use of a methodology for NASTRAN eigenvalue modeling of thin film inflatable structures. It is hoped that this paper will spare the reader from the problems and headaches the authors were confronted with during their investigation by presenting here not only an introduction and verification of the methodology, but also a discussion of the problems that this methodology can ensue. Our goal in this investigation was to verify the basic methodology through the creation and correlation of a simple model. An overview of thin film structures, their history, and their applications is given. Previous modeling work is then briefly discussed. An introduction is then given for the method of modeling. The specific mechanics of the method are then discussed in parallel with a basic discussion of NASTRAN s implementation of these mechanics. The problems encountered with the method are then given along with suggestions for their work-a-rounds. The methodology is verified through the correlation between an analytical model and modal test results of a thin film strut. Recommendations are given for the needed advancement of our understanding of this method and ability to accurately model thin film structures. Finally, conclusions are drawn regarding the usefulness of the methodology.
Jammalamadaka, Rajanikanth
2009-01-01
This report consists of a dissertation submitted to the faculty of the Department of Electrical and Computer Engineering, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Graduate College, The University of Arizona, 2008. Spatio-temporal systems with heterogeneity in their structure and behavior have two major problems associated with them. The first one is that such complex real world systems extend over very large spatial and temporal domains and consume so many computational resources to simulate that they are infeasible to study with current computational platforms. The second one is that the data available for understanding such systems is limited because they are spread over space and time making it hard to obtain micro and macro measurements. This also makes it difficult to get the data for validation of their constituent processes while simultaneously considering their global behavior. For example, the valley fever fungus considered in this dissertation is spread over a large spatial grid in the arid Southwest and typically needs to be simulated over several decades of time to obtain useful information. It is also hard to get the temperature and moisture data (which are two critical factors on which the survival of the valley fever fungus depends) at every grid point of the spatial domain over the region of study. In order to address the first problem, we develop a method based on the discrete event system specification which exploits the heterogeneity in the activity of the spatio-temporal system and which has been shown to be effective in solving relatively simple partial differential equation systems. The benefit of addressing the first problem is that it now makes it feasible to address the second problem. We address the second problem by making use of a multilevel methodology based on modeling and simulation and systems theory. This methodology helps us in the construction of models with different resolutions (base and lumped models). This allows us to refine an initially constructed lumped model with detailed physics-based process models and assess whether they improve on the original lumped models. For that assessment, we use the concept of experimental frame to delimit where the improvement is needed. This allows us to work with the available data, improve the component models in their own experimental frame and then move them to the overall frame. In this dissertation, we develop a multilevel methodology and apply it to a valley fever model. Moreover, we study the model's behavior in a particular experimental frame of interest, namely the formation of new sporing sites.
NASA Technical Reports Server (NTRS)
Hermann, Robert
1997-01-01
The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; it has been extended to apply to ordinary differential. Systems of the type encountered in control in joint work with the PI and M. Oberguggenberger. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) The 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form: {dx/dt = f(x, u)} and (2) the discrete lattice dynamics involved of cellular automata. Most of the CA literature involves a discretization of a partial differential equation system of the type encountered in physics problems (e.g. fluid and gas problems). Both of these directions requires much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as 'discretization' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques to topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
Model-Based Thermal System Design Optimization for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.
2017-01-01
Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.
Scalable tuning of building models to hourly data
Garrett, Aaron; New, Joshua Ryan
2015-03-31
Energy models of existing buildings are unreliable unless calibrated so they correlate well with actual energy usage. Manual tuning requires a skilled professional, is prohibitively expensive for small projects, imperfect, non-repeatable, non-transferable, and not scalable to the dozens of sensor channels that smart meters, smart appliances, and cheap/ubiquitous sensors are beginning to make available today. A scalable, automated methodology is needed to quickly and intelligently calibrate building energy models to all available data, increase the usefulness of those models, and facilitate speed-and-scale penetration of simulation-based capabilities into the marketplace for actualized energy savings. The "Autotune'' project is a novel, model-agnosticmore » methodology which leverages supercomputing, large simulation ensembles, and big data mining with multiple machine learning algorithms to allow automatic calibration of simulations that match measured experimental data in a way that is deployable on commodity hardware. This paper shares several methodologies employed to reduce the combinatorial complexity to a computationally tractable search problem for hundreds of input parameters. Furthermore, accuracy metrics are provided which quantify model error to measured data for either monthly or hourly electrical usage from a highly-instrumented, emulated-occupancy research home.« less
Model-based thermal system design optimization for the James Webb Space Telescope
NASA Astrophysics Data System (ADS)
Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.
2017-10-01
Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.
Varney, Shawn; Hirshon, Jon Mark; Dischinger, Patricia; Mackenzie, Colin
2006-01-01
The Haddon Matrix offers a classic epidemiological model for studying injury prevention. This methodology places the public health concepts of agent, host, and environment within the three sequential phases of an injury-producing incident-pre-event, event, and postevent. This study uses this methodology to illustrate how it could be applied in systematically preparing for a mass casualty disaster such as an unconventional sarin attack in a major urban setting. Nineteen city, state, federal, and military agencies responded to the Haddon Matrix chemical terrorism preparedness exercise and offered feedback in the data review session. Four injury prevention strategies (education, engineering, enforcement, and economics) were applied to the individual factors and event phases of the Haddon Matrix. The majority of factors identified in all phases were modifiable, primarily through educational interventions focused on individual healthcare providers and first responders. The Haddon Matrix provides a viable means of studying an unconventional problem, allowing for the identification of modifiable factors to decrease the type and severity of injuries following a mass casualty disaster such as a sarin release. This strategy could be successfully incorporated into disaster planning for other weapons attacks that could potentially cause mass casualties.
Zhu, Qiaohao; Carriere, K C
2016-01-01
Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusion about a research question from independent studies. Most research on detection and correction for publication bias in meta-analysis focus mainly on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem, and propose new parametric solutions. We develop methodologies of estimating the underlying overall effect size and the severity of publication bias. We distinguish the two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators will be obtained using maximum likelihood estimation and method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology, and to compare with the non-parametric Trim and Fill method based on funnel plot. We find that our methods based on truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
Financial Support for the Humanities: A Special Methodological Report.
ERIC Educational Resources Information Center
Gomberg, Irene L.; Atelsek, Frank J.
Findings and methodological problems of a survey on financial support for humanities in higher education are discussed. Usable data were gathered from 351 of 671 Higher Education Panel member institutions. Two weighting methodologies were employed. The conventional method assumed that nonrespondents were similar to respondents, whereas a…
An automated methodology development. [software design for combat simulation
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
Technology transfer methodology
NASA Technical Reports Server (NTRS)
Labotz, Rich
1991-01-01
Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno
2018-05-28
Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load given that the imaged tissue is in a pre-stressed and -strained state. Neglect of this prestressed state into solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress) which in their turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, both having their inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single shot methodology to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, refrain a broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and porcine biventricular myocardium model. The computational power and efficiency of the methodology was shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
Varieties of second modernity: the cosmopolitan turn in social and political theory and research.
Beck, Ulrich; Grande, Edgar
2010-09-01
The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally,we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization.
A Bayesian approach to truncated data sets: An application to Malmquist bias in Supernova Cosmology
NASA Astrophysics Data System (ADS)
March, Marisa Cristina
2018-01-01
A problem commonly encountered in statistical analysis of data is that of truncated data sets. A truncated data set is one in which a number of data points are completely missing from a sample, this is in contrast to a censored sample in which partial information is missing from some data points. In astrophysics this problem is commonly seen in a magnitude limited survey such that the survey is incomplete at fainter magnitudes, that is, certain faint objects are simply not observed. The effect of this `missing data' is manifested as Malmquist bias and can result in biases in parameter inference if it is not accounted for. In Frequentist methodologies the Malmquist bias is often corrected for by analysing many simulations and computing the appropriate correction factors. One problem with this methodology is that the corrections are model dependent. In this poster we derive a Bayesian methodology for accounting for truncated data sets in problems of parameter inference and model selection. We first show the methodology for a simple Gaussian linear model and then go on to show the method for accounting for a truncated data set in the case for cosmological parameter inference with a magnitude limited supernova Ia survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaminsky, J.; Tschanz, J.F.
In order to adress barriers to community energy-conservation efforts, DOE has established the Comprehensive Community Energy Management (CCEM) program. The role of CCEM is to provide direction and technical support for energy-conservation efforts at the local level. The program to date has included project efforts to develop combinations and variations of community energy planning and management tools applicable to communities of diverse characteristics. This paper describes the salient features of some of the tools and relates them to the testing program soon to begin in several pilot-study communities. Two methodologies that arose within such an actual planning context are takenmore » from DOE-sponsored projects in Clarksburg, West Virginia and the proposed new capital city for Alaska. Energy management in smaller communities and/or communities with limited funding and manpower resources has received special attention. One project of this type developed in general methodology that emphasizes efficient ways for small communities to reach agreement on local energy problems and potential solutions; by this guidance, the community is led to understand where it should concentrate its efforts in subsequent management activities. Another project concerns rapid growth of either a new or an existing community that could easily outstrip the management resources available locally. This methodology strives to enable the community to seize the opportunity for energy conservation through integrating the design of its energy systems and its development pattern. The last methodology creates applicable tools for comprehensive community energy planning. (MCW)« less
NASA Astrophysics Data System (ADS)
Aseev, Nikita; Agoshkov, Valery
2015-04-01
The report is devoted to the one approach to the problem of oil spill risk control of protected areas in the Baltic Sea (Aseev et al., 2014). By the problem of risk control is meant a problem of determination of optimal resources quantity which are necessary for decreasing the risk to some acceptable value. It is supposed that only moment of accident is a random variable. Mass of oil slick is chosen as a function of control. For the realization of the random variable the quadratic 'functional of cost' is introduced. It comprises cleaning costs and deviation of damage of oil pollution from its acceptable value. The problem of minimization of this functional is solved based on the methods of optimal control and the theory of adjoint equations (Agoshkov, 2003, Agoshkov et al., 2012). The solution of this problem is explicitly found. In order to solve the realistic problem of oil spill risk control in the Baltic Sea the 2d model of oil spill propagation on the sea surface based on the Seatrack Web model (Liungman, Mattson, 2011) is developed. The model takes into account such processes as oil transportation by sea currents and wind, turbulent diffusion, spreading, evaporation from sea surface, dispersion and formation of emulsion 'water-in-oil'. The model allows to calculate basic oil slick parameters: localization, mass, volume, thickness, density of oil, water content and viscosity of emulsion. The results of several numerical experiments in the Baltic Sea using the model and the methodology of oil spill risk control are presented. Along with moment of accident other parameters of oil spill and environment could be chosen as a random variables. The methodology of solution of oil spill risk control problem will remain the same but the computational complexity will increase. Conversion of the function of control to quantity of resources with a glance to methods of pollution removal should be processed. As a result, the developed 2d model of oil spill propagation combined with the methodology of solution of oil spill risk control problem could provide the basis for oil spill simulation systems, systems of evaluation and control of oil spill risk and damage in seas or decision support systems. References V.I. Agoshkov. The methods of optimal control and adjoint equations in problems of mathematical physics. // Moscow: INM RAS, 2003, 256 p. (in Russian). V.I. Agoshkov, N.A. Aseev, I.S. Novikov. The methods of investigation and solution of the problems of local sources and local or integral observations. // Moscow: INM RAS, 2012. 151 p. (in Russian). N.A. Aseev, V.I. Agoshkov, V.B. Zalesny, R. Aps, P. Kujala, and J. Rytkonen. The problem of control of oil pollution risk in the Baltic Sea // Russ. J. Numer. Analysis and Math. Modelling, 2014, V 29, No. 2, 93-105. O. Liungman, J. Mattson. Scientific documentation of Seatrack Web; physical processes, algorithms and references, 2011. // https://stw-helcom.smhi.se/
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
Methodological Problems of Soviet Pedagogy
ERIC Educational Resources Information Center
Noah, Harold J., Ed.; Beach, Beatrice S., Ed.
1974-01-01
Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)
Montecinos, P; Rodewald, A M
1994-06-01
The aim this work was to assess and compare the achievements of medical students, subjected to problem based learning methodology. The information and comprehension categories of Bloom were tested in 17 medical students in four different occasions during the physiopathology course, using a multiple choice knowledge test. There was a significant improvement in the number of correct answers towards the end of the course. It is concluded that these medical students obtained adequate learning achievements in the information subcategory of Bloom using problem based learning methodology, during the physiopathology course.
Borodulin, V I; Gliantsev, S P
2017-07-01
The article considers particular key methodological aspects of problem of scientific clinical school in national medicine. These aspects have to do with notion of school, its profile, issues of pedagogues, teachings and followers, subsidiary schools and issue of ethical component of scientific school. The article is a polemic one hence one will find no definite answers to specified questions. The reader is proposed to ponder over answers independently adducing examples of pro and contra. The conclusion is made about necessity of studying scientific schools in other areas of medicine and further elaboration of problem.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background : Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods : Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results : The GA methodology triggered the emergence of ' common g round ' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusion s : Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ' common ground' among diverse stakeholders about health data and their implications.
NASA Astrophysics Data System (ADS)
Weatherwax Scott, Caroline; Tsareff, Christopher R.
1990-06-01
One of the main goals of process engineering in the semiconductor industry is to improve wafer fabrication productivity and throughput. Engineers must work continuously toward this goal in addition to performing sustaining and development tasks. To accomplish these objectives, managers must make efficient use of engineering resources. One of the tools being used to improve efficiency is the diagnostic expert system. Expert systems are knowledge based computer programs designed to lead the user through the analysis and solution of a problem. Several photolithography diagnostic expert systems have been implemented at the Hughes Technology Center to provide a systematic approach to process problem solving. This systematic approach was achieved by documenting cause and effect analyses for a wide variety of processing problems. This knowledge was organized in the form of IF-THEN rules, a common structure for knowledge representation in expert system technology. These rules form the knowledge base of the expert system which is stored in the computer. The systems also include the problem solving methodology used by the expert when addressing a problem in his area of expertise. Operators now use the expert systems to solve many process problems without engineering assistance. The systems also facilitate the collection of appropriate data to assist engineering in solving unanticipated problems. Currently, several expert systems have been implemented to cover all aspects of the photolithography process. The systems, which have been in use for over a year, include wafer surface preparation (HMDS), photoresist coat and softbake, align and expose on a wafer stepper, and develop inspection. These systems are part of a plan to implement an expert system diagnostic environment throughout the wafer fabrication facility. In this paper, the systems' construction is described, including knowledge acquisition, rule construction, knowledge refinement, testing, and evaluation. The roles played by the process engineering expert and the knowledge engineer are discussed. The features of the systems are shown, particularly the interactive quality of the consultations and the ease of system use.
ERIC Educational Resources Information Center
Riazi, A. Mehdi
2016-01-01
Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…
Active Methodologies in a Queueing Systems Course for Telecommunication Engineering Studies
ERIC Educational Resources Information Center
Garcia, J.; Hernandez, A.
2010-01-01
This paper presents the results of a one-year experiment in incorporating active methodologies in a Queueing Systems course as part of the Telecommunication Engineering degree at the University of Zaragoza, Spain, during the period of adaptation to the European Higher Education Area. A problem-based learning methodology has been introduced, and…
Miskowiak, Kamilla W; Carvalho, André F; Vieta, Eduard; Kessing, Lars V
2016-10-01
Cognitive dysfunction is an emerging treatment target in bipolar disorder (BD). Several trials have assessed the efficacy of novel pharmacological and psychological treatments on cognition in BD but the findings are contradictory and unclear. A systematic search following the PRISMA guidelines was conducted on PubMed and PsychInfo. Eligible articles reported randomized, controlled or open-label trials investigating pharmacological or psychological treatments targeting cognitive dysfunction in BD. The quality of the identified randomized controlled trials (RCTs) was evaluated with the Cochrane Collaboration's Risk of Bias tool. We identified 19 eligible studies of which 13 were RCTs and six were open-label or non-randomized studies. The findings regarding efficacy on cognition were overall disappointing or preliminary, possibly due to several methodological challenges. For the RCTs, the risk of bias was high in nine cases, unclear in one case and low in three cases. Key reasons for the high risk of bias were lack of details on the randomization process, suboptimal handling of missing data and lack of a priori priority between cognition outcomes. Other challenges were the lack of consensus on whether and how to screen for cognitive impairment and on how to assess efficacy on cognition. In conclusion, methodological problems are likely to impede the success rates of cognition trials in BD. We recommend adherence to the CONSORT guidelines for RCTs, screening for cognitive impairment before inclusion of trial participants and selection of one primary cognition outcome. Future implementation of a 'neurocircuitry-based' biomarker model to evaluate neural target engagement is warranted. Copyright © 2016 Elsevier B.V. and ECNP. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Xiangqi; Zhang, Yingchen
This paper presents an optimal voltage control methodology with coordination among different voltage-regulating resources, including controllable loads, distributed energy resources such as energy storage and photovoltaics (PV), and utility voltage-regulating devices such as voltage regulators and capacitors. The proposed methodology could effectively tackle the overvoltage and voltage regulation device distortion problems brought by high penetrations of PV to improve grid operation reliability. A voltage-load sensitivity matrix and voltage-regulator sensitivity matrix are used to deploy the resources along the feeder to achieve the control objectives. Mixed-integer nonlinear programming is used to solve the formulated optimization control problem. The methodology has beenmore » tested on the IEEE 123-feeder test system, and the results demonstrate that the proposed approach could actively tackle the voltage problem brought about by high penetrations of PV and improve the reliability of distribution system operation.« less
Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness
NASA Astrophysics Data System (ADS)
Kaushik, Anshul; Ramani, Anand
2014-04-01
Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.
Artificial intelligence and design: Opportunities, research problems and directions
NASA Technical Reports Server (NTRS)
Amarel, Saul
1990-01-01
The issues of industrial productivity and economic competitiveness are of major significance in the U.S. at present. By advancing the science of design, and by creating a broad computer-based methodology for automating the design of artifacts and of industrial processes, we can attain dramatic improvements in productivity. It is our thesis that developments in computer science, especially in Artificial Intelligence (AI) and in related areas of advanced computing, provide us with a unique opportunity to push beyond the present level of computer aided automation technology and to attain substantial advances in the understanding and mechanization of design processes. To attain these goals, we need to build on top of the present state of AI, and to accelerate research and development in areas that are especially relevant to design problems of realistic complexity. We propose an approach to the special challenges in this area, which combines 'core work' in AI with the development of systems for handling significant design tasks. We discuss the general nature of design problems, the scientific issues involved in studying them with the help of AI approaches, and the methodological/technical issues that one must face in developing AI systems for handling advanced design tasks. Looking at basic work in AI from the perspective of design automation, we identify a number of research problems that need special attention. These include finding solution methods for handling multiple interacting goals, formation problems, problem decompositions, and redesign problems; choosing representations for design problems with emphasis on the concept of a design record; and developing approaches for the acquisition and structuring of domain knowledge with emphasis on finding useful approximations to domain theories. Progress in handling these research problems will have major impact both on our understanding of design processes and their automation, and also on several fundamental questions that are of intrinsic concern to AI. We present examples of current AI work on specific design tasks, and discuss new directions of research, both as extensions of current work and in the context of new design tasks where domain knowledge is either intractable or incomplete. The domains discussed include Digital Circuit Design, Mechanical Design of Rotational Transmissions, Design of Computer Architectures, Marine Design, Aircraft Design, and Design of Chemical Processes and Materials. Work in these domains is significant on technical grounds, and it is also important for economic and policy reasons.
Prokop, Anna; Pilc, Andrzej
2015-01-01
The problem of drug shortages has been reported worldwide, gaining prominence in multiple domains and several countries in recent years. The aim of the study was to analyze, characterise and assess this problem in Belgium and France, while also adopting a wider perspective from the European Union. A qualitative methodological approach was employed, including semi-structured interviews with the representatives of respective national health authorities, pharmaceutical companies and wholesalers, as well as hospital and community pharmacists. The research was conducted in early 2014. Four themes, which were identified through the interviews, were addressed in the paper, i.e. a) defining drug shortages, b) their dynamics and perception, c) their determinants, d) the role of the European and national institutions in coping with the problem. Three groups of determinants of drug shortages were identified throughout this study: manufacturing problems, distribution and supply problems, and problems related to economic aspects. Currently, the Member States of the European Union are striving to resolve the problem very much on their own, although a far more focused and dedicated collaboration may well prove instrumental in coping with drug shortages throughout Europe more effectively. To the best of the authors’ knowledge, this is the first qualitative study to investigate the characteristics, key determinants, and the problem drivers of drug shortages, focusing on this particular group of countries, while also adopting the European Union’s perspective. PMID:25942432
Health effects of exposure to waste incinerator emissions:a review of epidemiological studies.
Franchini, Michela; Rial, Michela; Buiatti, Eva; Bianchi, Fabrizio
2004-01-01
This review evaluates the epidemiological literature on health effects in relation to incineration facilities. Several adverse health effects have been reported. Significant exposure-disease associations are reported by two thirds of the papers focusing on cancer (lung and larynx cancer, non-Hodgkin's lymphoma). Positive associations were found for congenital malformations and residence near incinerators. Exposure to PCB and heavy metals were associated with several health outcomes and in particular with reduction of thyroid hormones. Findings on non-carcinogen pathologies are inconclusive. Effect of biases and confounding factors must be considered in the explanation of findings. Methodological problems and insufficient exposure information generate difficulties on study results. Research needs include a better definition of exposure in qualitative and quantitative terms in particular by developing the use of biomarkers and by implementing environmental measurements.
NASA Astrophysics Data System (ADS)
Añel, Juan A.
2017-03-01
Nowadays, the majority of the scientific community is not aware of the risks and problems associated with an inadequate use of computer systems for research, mostly for reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright laws can have undesirable effects on access to aspects of great importance of the design of experiments and therefore to the interpretation of results.
Tracking problem solving by multivariate pattern analysis and Hidden Markov Model algorithms.
Anderson, John R
2012-03-01
Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy at determining both what problem-solving step the students are taking and whether they are performing that step correctly. The second "model discovery" application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and these substates are associated with different fluency in algebra problem solving. Copyright © 2011 Elsevier Ltd. All rights reserved.
Perspective: Quantum mechanical methods in biochemistry and biophysics.
Cui, Qiang
2016-10-14
In this perspective article, I discuss several research topics relevant to quantum mechanical (QM) methods in biophysical and biochemical applications. Due to the immense complexity of biological problems, the key is to develop methods that are able to strike the proper balance of computational efficiency and accuracy for the problem of interest. Therefore, in addition to the development of novel ab initio and density functional theory based QM methods for the study of reactive events that involve complex motifs such as transition metal clusters in metalloenzymes, it is equally important to develop inexpensive QM methods and advanced classical or quantal force fields to describe different physicochemical properties of biomolecules and their behaviors in complex environments. Maintaining a solid connection of these more approximate methods with rigorous QM methods is essential to their transferability and robustness. Comparison to diverse experimental observables helps validate computational models and mechanistic hypotheses as well as driving further development of computational methodologies.
LCP method for a planar passive dynamic walker based on an event-driven scheme
NASA Astrophysics Data System (ADS)
Zheng, Xu-Dong; Wang, Qi
2018-06-01
The main purpose of this paper is to present a linear complementarity problem (LCP) method for a planar passive dynamic walker with round feet based on an event-driven scheme. The passive dynamic walker is treated as a planar multi-rigid-body system. The dynamic equations of the passive dynamic walker are obtained by using Lagrange's equations of the second kind. The normal forces and frictional forces acting on the feet of the passive walker are described based on a modified Hertz contact model and Coulomb's law of dry friction. The state transition problem of stick-slip between feet and floor is formulated as an LCP, which is solved with an event-driven scheme. Finally, to validate the methodology, four gaits of the walker are simulated: the stance leg neither slips nor bounces; the stance leg slips without bouncing; the stance leg bounces without slipping; the walker stands after walking several steps.
LCP method for a planar passive dynamic walker based on an event-driven scheme
NASA Astrophysics Data System (ADS)
Zheng, Xu-Dong; Wang, Qi
2018-02-01
The main purpose of this paper is to present a linear complementarity problem (LCP) method for a planar passive dynamic walker with round feet based on an event-driven scheme. The passive dynamic walker is treated as a planar multi-rigid-body system. The dynamic equations of the passive dynamic walker are obtained by using Lagrange's equations of the second kind. The normal forces and frictional forces acting on the feet of the passive walker are described based on a modified Hertz contact model and Coulomb's law of dry friction. The state transition problem of stick-slip between feet and floor is formulated as an LCP, which is solved with an event-driven scheme. Finally, to validate the methodology, four gaits of the walker are simulated: the stance leg neither slips nor bounces; the stance leg slips without bouncing; the stance leg bounces without slipping; the walker stands after walking several steps.
Emery, R E
1989-02-01
Researchers and policymakers have begun to recognize the extent and severity of family violence in recent years, particularly its effects on children. Despite a flurry of research, however, there is much disagreement about the definition of violence, its development, the consequences for victims, and the most effective avenues for intervention. Similar conceptual, methodological, and practical problems are faced by those working in the areas of physical child abuse, child sex abuse, and child witnesses to spouse abuse. In further research on these complex problems, researchers are encouraged to use operational definitions that avoid terms like abuse and violence, to focus new efforts on emotional mediators of violent actions, to evaluate the effects of violence on the entire family system, and to redouble efforts to conduct systematic outcome research. Those professionals who are currently responsible for intervention are encouraged to use definitions of and responses to family violence that match those used for assaults between strangers.
False-positive tangible outcomes of functional analyses.
Rooker, Griffin W; Iwata, Brian A; Harper, Jill M; Fahmie, Tara A; Camp, Erin M
2011-01-01
Functional analysis (FA) methodology is the most precise method for identifying variables that maintain problem behavior. Occasionally, however, results of an FA may be influenced by idiosyncratic sensitivity to aspects of the assessment conditions. For example, data from several studies suggest that inclusion of a tangible condition during an FA may be prone to a false-positive outcome, although the extent to which tangible reinforcement routinely produces such outcomes is unknown. We examined susceptibility to tangible reinforcement by determining whether a new response was acquired more readily when exposed to a tangible contingency relative to others commonly used in an FA (Study 1), and whether problem behavior known not to have a social function nevertheless emerged when exposed to tangible reinforcement (Study 2). Results indicated that inclusion of items in the tangible condition should be done with care and that selection should be based on those items typically found in the individual's environment.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.
Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-08-06
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
Data reliability in complex directed networks
NASA Astrophysics Data System (ADS)
Sanz, Joaquín; Cozzo, Emanuele; Moreno, Yamir
2013-12-01
The availability of data from many different sources and fields of science has made it possible to map out an increasing number of networks of contacts and interactions. However, quantifying how reliable these data are remains an open problem. From Biology to Sociology and Economics, the identification of false and missing positives has become a problem that calls for a solution. In this work we extend one of the newest, best performing models—due to Guimerá and Sales-Pardo in 2009—to directed networks. The new methodology is able to identify missing and spurious directed interactions with more precision than previous approaches, which renders it particularly useful for analyzing data reliability in systems like trophic webs, gene regulatory networks, communication patterns and several social systems. We also show, using real-world networks, how the method can be employed to help search for new interactions in an efficient way.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-01-01
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework. PMID:24982250
Creativity and psychopathology: a systematic review.
Thys, Erik; Sabbe, Bernard; De Hert, Marc
2014-01-01
The possible link between creativity and psychopathology has been a long-time focus of research up to the present day. However, the research results in this field are heterogeneous and contradictory. Links between creativity and specific psychiatric disorders have been confirmed and refuted in different studies. This disparity is partly explained by the methodological challenges peculiar to this field. In this systematic review of the literature from 1950, research articles in the field of creativity and psychopathology are presented, focusing on the methodology and results of the collected studies. This review confirms the methodological problems and the heterogeneity of the study designs and results. The assessment of psychopathology, but more so of creativity, remains a fundamental challenge. On the whole, study results cautiously confirm an association between creativity and both bipolar disorder and schizotypy. The research on creativity and psychopathology is hampered by serious methodological problems. Study results are to be interpreted with caution and future research needs more methodological rigor. © 2014 S. Karger AG, Basel.
Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map
ERIC Educational Resources Information Center
Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng
2004-01-01
This paper proposes a methodology to calculate both the difficulty of the basic problems and the difficulty of solving a problem. The method to calculate the difficulty of problem is according to the process of constructing a problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…
Mesquita, D P; Dias, O; Amaral, A L; Ferreira, E C
2009-04-01
In recent years, a great deal of attention has been focused on the research of activated sludge processes, where the solid-liquid separation phase is frequently considered of critical importance, due to the different problems that severely affect the compaction and the settling of the sludge. Bearing that in mind, in this work, image analysis routines were developed in Matlab environment, allowing the identification and characterization of microbial aggregates and protruding filaments in eight different wastewater treatment plants, for a combined period of 2 years. The monitoring of the activated sludge contents allowed for the detection of bulking events proving that the developed image analysis methodology is adequate for a continuous examination of the morphological changes in microbial aggregates and subsequent estimation of the sludge volume index. In fact, the obtained results proved that the developed image analysis methodology is a feasible method for the continuous monitoring of activated sludge systems and identification of disturbances.
Rapid Cost Assessment of Space Mission Concepts through Application of Complexity Indices
NASA Technical Reports Server (NTRS)
Peterson, Craig; Cutts, James; Balint, Tibor; Hall, James B.
2008-01-01
In 2005, the Solar System Exploration Strategic Roadmap Conmrittee (chartered by NASA to develop the roadmap for Solar System Exploration Missions for the coming decades) found itself posed with the difficult problem of sorting through several mission concepts and determining their relative costs. While detailed mission studies are the normal approach to costing, neither the budget nor schedule allotted to the conmrittee could support such studies. Members of the Jet Propulsion Laboratory (JPL) supporting the conmrittee were given the challenge of developing a semi-quantitative approach that could provide the relative costs of these missions, without requiring an in depth study of the missions. In response to this challenge, a rapid cost assessment methodology based on a set of mission cost/complexity indexes was developed. This methodology also underwent two separate validations, one comparing its results when applied to historical missions, and another comparing its estimates against those of veteran space mission managers. Remarkably good agreement was achieved, suggesting that this approach provides an effective early indication of space mission costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1988-06-01
This document describes several years research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefitmore » from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)« less
Statistical data mining of streaming motion data for fall detection in assistive environments.
Tasoulis, S K; Doukas, C N; Maglogiannis, I; Plagianakos, V P
2011-01-01
The analysis of human motion data is interesting for the purpose of activity recognition or emergency event detection, especially in the case of elderly or disabled people living independently in their homes. Several techniques have been proposed for identifying such distress situations using either motion, audio or video sensors on the monitored subject (wearable sensors) or the surrounding environment. The output of such sensors is data streams that require real time recognition, especially in emergency situations, thus traditional classification approaches may not be applicable for immediate alarm triggering or fall prevention. This paper presents a statistical mining methodology that may be used for the specific problem of real time fall detection. Visual data captured from the user's environment, using overhead cameras along with motion data are collected from accelerometers on the subject's body and are fed to the fall detection system. The paper includes the details of the stream data mining methodology incorporated in the system along with an initial evaluation of the achieved accuracy in detecting falls.
Towards a general object-oriented software development methodology
NASA Technical Reports Server (NTRS)
Seidewitz, ED; Stark, Mike
1986-01-01
An object is an abstract software model of a problem domain entity. Objects are packages of both data and operations of that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85). It was found that these methodologies are limited. As a result a more general approach was synthesized with allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview is provided of this approach. Further, how object-oriented design fits into the overall software life-cycle is considered.
Geometric stiffening in multibody dynamics formulations
NASA Technical Reports Server (NTRS)
Sharf, Inna
1993-01-01
In this paper we discuss the issue of geometric stiffening as it arises in the context of multibody dynamics. This topic has been treated in a number of previous publications in this journal and appears to be a debated subject. The controversy revolves primarily around the 'correct' methodology for incorporating the stiffening effect into dynamics formulations. The main goal of this work is to present the different approaches that have been developed for this problem through an in-depth review of several publications dealing with this subject. This is done with the goal of contributing to a precise understanding of the existing methodologies for modelling the stiffening effects in multibody systems. Thus, in presenting the material we attempt to illuminate the key characteristics of the various methods as well as show how they relate to each other. In addition, we offer a number of novel insights and clarifying interpretations of these schemes. The paper is completed with a general classification and comparison of the different approaches.
Strategies and Methodologies for Developing Microbial Detoxification Systems to Mitigate Mycotoxins
Zhu, Yan; Hassan, Yousef I.; Lepp, Dion; Shao, Suqin; Zhou, Ting
2017-01-01
Mycotoxins, the secondary metabolites of mycotoxigenic fungi, have been found in almost all agricultural commodities worldwide, causing enormous economic losses in livestock production and severe human health problems. Compared to traditional physical adsorption and chemical reactions, interest in biological detoxification methods that are environmentally sound, safe and highly efficient has seen a significant increase in recent years. However, researchers in this field have been facing tremendous unexpected challenges and are eager to find solutions. This review summarizes and assesses the research strategies and methodologies in each phase of the development of microbiological solutions for mycotoxin mitigation. These include screening of functional microbial consortia from natural samples, isolation and identification of single colonies with biotransformation activity, investigation of the physiological characteristics of isolated strains, identification and assessment of the toxicities of biotransformation products, purification of functional enzymes and the application of mycotoxin decontamination to feed/food production. A full understanding and appropriate application of this tool box should be helpful towards the development of novel microbiological solutions on mycotoxin detoxification. PMID:28387743
An exploratory fNIRS study with immersive virtual reality: a new method for technical implementation
Seraglia, Bruno; Gamberini, Luciano; Priftis, Konstantinos; Scatturin, Pietro; Martinelli, Massimiliano; Cutini, Simone
2011-01-01
For over two decades Virtual Reality (VR) has been used as a useful tool in several fields, from medical and psychological treatments, to industrial and military applications. Only in recent years researchers have begun to study the neural correlates that subtend VR experiences. Even if the functional Magnetic Resonance Imaging (fMRI) is the most common and used technique, it suffers several limitations and problems. Here we present a methodology that involves the use of a new and growing brain imaging technique, functional Near-infrared Spectroscopy (fNIRS), while participants experience immersive VR. In order to allow a proper fNIRS probe application, a custom-made VR helmet was created. To test the adapted helmet, a virtual version of the line bisection task was used. Participants could bisect the lines in a virtual peripersonal or extrapersonal space, through the manipulation of a Nintendo Wiimote ® controller in order for the participants to move a virtual laser pointer. Although no neural correlates of the dissociation between peripersonal and extrapersonal space were found, a significant hemodynamic activity with respect to the baseline was present in the right parietal and occipital areas. Both advantages and disadvantages of the presented methodology are discussed. PMID:22207843
Ligand diffusion in proteins via enhanced sampling in molecular dynamics.
Rydzewski, J; Nowak, W
2017-12-01
Computational simulations in biophysics describe the dynamics and functions of biological macromolecules at the atomic level. Among motions particularly important for life are the transport processes in heterogeneous media. The process of ligand diffusion inside proteins is an example of a complex rare event that can be modeled using molecular dynamics simulations. The study of physical interactions between a ligand and its biological target is of paramount importance for the design of novel drugs and enzymes. Unfortunately, the process of ligand diffusion is difficult to study experimentally. The need for identifying the ligand egress pathways and understanding how ligands migrate through protein tunnels has spurred the development of several methodological approaches to this problem. The complex topology of protein channels and the transient nature of the ligand passage pose difficulties in the modeling of the ligand entry/escape pathways by canonical molecular dynamics simulations. In this review, we report a methodology involving a reconstruction of the ligand diffusion reaction coordinates and the free-energy profiles along these reaction coordinates using enhanced sampling of conformational space. We illustrate the above methods on several ligand-protein systems, including cytochromes and G-protein-coupled receptors. The methods are general and may be adopted to other transport processes in living matter. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary
1996-01-01
We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area networks (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed, ATM network and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed, WAN's for enabling the routine, location independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
Kolko, David J; Perrin, Ellen
2014-01-01
Because the integration of mental or behavioral health services in pediatric primary care is a national priority, a description and evaluation of the interventions applied in the healthcare setting is warranted. This article examines several intervention research studies based on alternative models for delivering behavioral health care in conjunction with comprehensive pediatric care. This review describes the diverse methods applied to different clinical problems, such as brief mental health skills, clinical guidelines, and evidence-based practices, and the empirical outcomes of this research literature. Next, several key treatment considerations are discussed to maximize the efficiency and effectiveness of these interventions. Some practical suggestions for overcoming key service barriers are provided to enhance the capacity of the practice to deliver behavioral health care. There is moderate empirical support for the feasibility, acceptability, and clinical utility of these interventions for treating internalizing and externalizing behavior problems. Practical strategies to extend this work and address methodological limitations are provided that draw upon recent frameworks designed to simplify the treatment enterprise (e.g., common elements). Pediatric primary care has become an important venue for providing mental health services to children and adolescents due, in part, to its many desirable features (e.g., no stigma, local setting, familiar providers). Further adaptation of existing delivery models may promote the delivery of effective integrated interventions with primary care providers as partners designed to address mental health problems in pediatric healthcare.
Innovation design of medical equipment based on TRIZ.
Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo
2015-01-01
Medical equipment is closely related to personal health and safety, and this can be of concern to the equipment user. Furthermore, there is much competition among medical equipment manufacturers. Innovative design is the key to success for those enterprises. The design of medical equipment usually covers vastly different domains of knowledge. The application of modern design methodology in medical equipment and technology invention is an urgent requirement. TRIZ (Russian abbreviation of what can be translated as `theory of inventive problem solving') was born in Russia, which contain some problem-solving methods developed by patent analysis around the world, including Conflict Matrix, Substance Field Analysis, Standard Solution, Effects, etc. TRIZ is an inventive methodology for problems solving. As an Engineering example, infusion system is analyzed and re-designed by TRIZ. The innovative idea is generated to liberate the caretaker from the infusion bag watching out. The research in this paper shows the process of the application of TRIZ in medical device inventions. It is proved that TRIZ is an inventive methodology for problems solving and can be used widely in medical device development.
Intelligent Systems For Aerospace Engineering: An Overview
NASA Technical Reports Server (NTRS)
KrishnaKumar, K.
2003-01-01
Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become extremely important for advancing the current trends in information technology. Artificially intelligent systems currently utilize computers to emulate various faculties of human intelligence and biological metaphors. They use a combination of symbolic and sub-symbolic systems capable of evolving human cognitive skills and intelligence, not just systems capable of doing things humans do not do well. Intelligent systems are ideally suited for tasks such as search and optimization, pattern recognition and matching, planning, uncertainty management, control, and adaptation. In this paper, the intelligent system technologies and their application potential are highlighted via several examples.
A Technique for Measuring Rotocraft Dynamic Stability in the 40 by 80 Foot Wind Tunnel
NASA Technical Reports Server (NTRS)
Gupta, N. K.; Bohn, J. G.
1977-01-01
An on-line technique is described for the measurement of tilt rotor aircraft dynamic stability in the Ames 40- by 80-Foot Wind Tunnel. The technique is based on advanced system identification methodology and uses the instrumental variables approach. It is particulary applicable to real time estimation problems with limited amounts of noise-contaminated data. Several simulations are used to evaluate the algorithm. Estimated natural frequencies and damping ratios are compared with simulation values. The algorithm is also applied to wind tunnel data in an off-line mode. The results are used to develop preliminary guidelines for effective use of the algorithm.
A Linear-Elasticity Solver for Higher-Order Space-Time Mesh Deformation
NASA Technical Reports Server (NTRS)
Diosady, Laslo T.; Murman, Scott M.
2018-01-01
A linear-elasticity approach is presented for the generation of meshes appropriate for a higher-order space-time discontinuous finite-element method. The equations of linear-elasticity are discretized using a higher-order, spatially-continuous, finite-element method. Given an initial finite-element mesh, and a specified boundary displacement, we solve for the mesh displacements to obtain a higher-order curvilinear mesh. Alternatively, for moving-domain problems we use the linear-elasticity approach to solve for a temporally discontinuous mesh velocity on each time-slab and recover a continuous mesh deformation by integrating the velocity. The applicability of this methodology is presented for several benchmark test cases.
Least-squares luma-chroma demultiplexing algorithm for Bayer demosaicking.
Leung, Brian; Jeon, Gwanggil; Dubois, Eric
2011-07-01
This paper addresses the problem of interpolating missing color components at the output of a Bayer color filter array (CFA), a process known as demosaicking. A luma-chroma demultiplexing algorithm is presented in detail, using a least-squares design methodology for the required bandpass filters. A systematic study of objective demosaicking performance and system complexity is carried out, and several system configurations are recommended. The method is compared with other benchmark algorithms in terms of CPSNR and S-CIELAB ∆E∗ objective quality measures and demosaicking speed. It was found to provide excellent performance and the best quality-speed tradeoff among the methods studied.
Tenerife revisited: the critical role of dentistry.
Brannon, R B; Morlang, W M
2001-05-01
The authors record the contribution of dentistry to the identification of victims of one of the most significant disasters in the history of aviation-the March 1977 collision of two Boeing 747 jumbo jets in the Canary Islands, which resulted in 583 fatalities. Dental identification was the primary method of victim identification because a high percentage of the bodies were severely burned. Virtually all aspects of the U.S. identification efforts have been reported with the exception of the valuable role of dentistry. The dental team's organization, methodology, and significant contributions to forensic dentistry and a variety of remarkable problems that the team encountered are documented.
New Control Paradigms for Resources Saving: An Approach for Mobile Robots Navigation.
Socas, Rafael; Dormido, Raquel; Dormido, Sebastián
2018-01-18
In this work, an event-based control scheme is presented. The proposed system has been developed to solve control problems appearing in the field of Networked Control Systems (NCS). Several models and methodologies have been proposed to measure different resources consumptions. The use of bandwidth, computational load and energy resources have been investigated. This analysis shows how the parameters of the system impacts on the resources efficiency. Moreover, the proposed system has been compared with its equivalent discrete-time solution. In the experiments, an application of NCS for mobile robots navigation has been set up and its resource usage efficiency has been analysed.
Performance-costs evaluation for urban storm drainage.
Baptista, M; Barraud, S; Alfakih, E; Nascimento, N; Fernandes, W; Moura, P; Castro, L
2005-01-01
The design process of urban stormwater systems incorporating BMPs involves more complexity unlike the design of classic drainage systems for which just the technique of pipes is likely to be used. This paper presents a simple decision aid methodology and an associated software (AvDren) concerning urban stormwater systems, devoted to the evaluation and the comparison of drainage scenarios using BMPs according to different technical, sanitary, social environmental and economical aspects. This kind of tool is particularly interesting so as to help the decision makers to select the appropriate alternative and to plan the investments especially for developing countries, with important sanitary problems and severe budget restrictions.
Doing more harm than good: negative health effects of intimate-partner violence campaigns.
West, Jean Jaymes
2013-01-01
This study investigates unintended negative effects of health communication campaigns surrounding intimate-partner violence. Major health organizations have identified this issue as an urgent health problem for women, but the effects of these campaigns have rarely been tested with the target audience most affected by the issue. Using qualitative methodology, 10 focus groups were conducted with female survivors of intimate-partner violence. It was found that this group viewed the campaigns as emotionally harmful, inaccurate, and misleading. The results of this research suggest these campaigns may do more harm than good for the audience most severely affected by this issue.
Intelligent Systems for Aerospace Engineering: An Overview
NASA Technical Reports Server (NTRS)
Krishnakumar, Kalmanje
2002-01-01
Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become extremely important for advancing the current trends in information technology. Artificially intelligent systems currently utilize computers to emulate various faculties of human intelligence and biological metaphors. They use a combination of symbolic and sub-symbolic systems capable of evolving human cognitive skills and intelligence, not just systems capable of doing things humans do not do well. Intelligent systems are ideally suited for tasks such as search and optimization, pattern recognition and matching, planning, uncertainty management, control, and adaptation. In this paper, the intelligent system technologies and their application potential are highlighted via several examples.
New Control Paradigms for Resources Saving: An Approach for Mobile Robots Navigation
2018-01-01
In this work, an event-based control scheme is presented. The proposed system has been developed to solve control problems appearing in the field of Networked Control Systems (NCS). Several models and methodologies have been proposed to measure different resources consumptions. The use of bandwidth, computational load and energy resources have been investigated. This analysis shows how the parameters of the system impacts on the resources efficiency. Moreover, the proposed system has been compared with its equivalent discrete-time solution. In the experiments, an application of NCS for mobile robots navigation has been set up and its resource usage efficiency has been analysed. PMID:29346321
A technological approach to studying motor planning ability in children at high risk for ASD.
Taffoni, F; Focaroli, V; Keller, F; Iverson, J M
2014-01-01
In this work we propose a new method to study the development of motor planning abilities in children and, in particular, in children at high risk for ASD. Although several modified motor signs have been found in children with ASD, no specific markers enabling the early assessment of risk have been found yet. In this work, we discuss the problem posed by objective and quantitative behavioral analysis in non-structured environment. After an initial description of the main constraints imposed by the ecological approach, a technological and methodological solution to these issues is presented. Preliminary results on 12 children are reported and briefly discussed.
A risk assessment methodology for critical transportation infrastructure.
DOT National Transportation Integrated Search
2002-01-01
Infrastructure protection typifies a problem of risk assessment and management in a large-scale system. This study offers a methodological framework to identify, prioritize, assess, and manage risks. It includes the following major considerations: (1...
Methodology for nonwork travel analysis in suburban communities.
DOT National Transportation Integrated Search
1994-01-01
The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...
ERIC Educational Resources Information Center
Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna
2004-01-01
Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…
2017-06-15
the methodology of reducing the online-algorithm-selecting problem as a contextual bandit problem, which is yet another interactive learning...KH2016a] Kuan-Hao Huang and Hsuan-Tien Lin. Linear upper confidence bound algorithm for contextual bandit problem with piled rewards. In Proceedings
Inverse problems in complex material design: Applications to non-crystalline solids
NASA Astrophysics Data System (ADS)
Biswas, Parthapratim; Drabold, David; Elliott, Stephen
The design of complex amorphous materials is one of the fundamental problems in disordered condensed-matter science. While impressive developments of ab-initio simulation methods during the past several decades have brought tremendous success in understanding materials property from micro- to mesoscopic length scales, a major drawback is that they fail to incorporate existing knowledge of the materials in simulation methodologies. Since an essential feature of materials design is the synergy between experiment and theory, a properly developed approach to design materials should be able to exploit all available knowledge of the materials from measured experimental data. In this talk, we will address the design of complex disordered materials as an inverse problem involving experimental data and available empirical information. We show that the problem can be posed as a multi-objective non-convex optimization program, which can be addressed using a number of recently-developed bio-inspired global optimization techniques. In particular, we will discuss how a population-based stochastic search procedure can be used to determine the structure of non-crystalline solids (e.g. a-SiH, a-SiO2, amorphous graphene, and Fe and Ni clusters). The work is partially supported by NSF under Grant Nos. DMR 1507166 and 1507670.
Cost-benefit analysis of space technology
NASA Technical Reports Server (NTRS)
Hein, G. F.; Stevenson, S. M.; Sivo, J. N.
1976-01-01
A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in the structure of a decision making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology. The use of the methodology is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumers surplus for measuring benefits is also presented.
ERIC Educational Resources Information Center
Lee, Chwee Beng; Ling, Keck Voon; Reimann, Peter; Diponegoro, Yudho Ahmad; Koh, Chia Heng; Chew, Derwin
2014-01-01
Purpose: The purpose of this paper is to argue for the need to develop pre-service teachers' problem solving ability, in particular, in the context of real-world complex problems. Design/methodology/approach: To argue for the need to develop pre-service teachers' problem solving skills, the authors describe a web-based problem representation…
Using Problem-Based Learning to Enhance Team and Player Development in Youth Soccer
ERIC Educational Resources Information Center
Hubball, Harry; Robertson, Scott
2004-01-01
Problem-based learning (PBL) is a coaching and teaching methodology that develops knowledge, abilities, and skills. It also encourages participation, collaborative investigation, and the resolution of authentic, "ill-structured" problems through the use of problem definition, teamwork, communication, data collection, decision-making,…
Stock Market Index Data and indicators for Day Trading as a Binary Classification problem.
Bruni, Renato
2017-02-01
Classification is the attribution of labels to records according to a criterion automatically learned from a training set of labeled records. This task is needed in a huge number of practical applications, and consequently it has been studied intensively and several classification algorithms are available today. In finance, a stock market index is a measurement of value of a section of the stock market. It is often used to describe the aggregate trend of a market. One basic financial issue would be forecasting this trend. Clearly, such a stochastic value is very difficult to predict. However, technical analysis is a security analysis methodology developed to forecast the direction of prices through the study of past market data. Day trading consists in buying and selling financial instruments within the same trading day. In this case, one interesting problem is the automatic individuation of favorable days for trading. We model this problem as a binary classification problem, and we provide datasets containing daily index values, the corresponding values of a selection of technical indicators, and the class label, which is 1 if the subsequent time period is favorable for day trading and 0 otherwise. These datasets can be used to test the behavior of different approaches in solving the day trading problem.
A methodology to find the elementary landscape decomposition of combinatorial optimization problems.
Chicano, Francisco; Whitley, L Darrell; Alba, Enrique
2011-01-01
A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
Mixture-based gatekeeping procedures in adaptive clinical trials.
Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji
2018-01-01
Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
Fault Tree in the Trenches, A Success Story
NASA Technical Reports Server (NTRS)
Long, R. Allen; Goodson, Amanda (Technical Monitor)
2000-01-01
Getting caught up in the explanation of Fault Tree Analysis (FTA) minutiae is easy. In fact, most FTA literature tends to address FTA concepts and methodology. Yet there seems to be few articles addressing actual design changes resulting from the successful application of fault tree analysis. This paper demonstrates how fault tree analysis was used to identify and solve a potentially catastrophic mechanical problem at a rocket motor manufacturer. While developing the fault tree given in this example, the analyst was told by several organizations that the piece of equipment in question had been evaluated by several committees and organizations, and that the analyst was wasting his time. The fault tree/cutset analysis resulted in a joint-redesign of the control system by the tool engineering group and the fault tree analyst, as well as bragging rights for the analyst. (That the fault tree found problems where other engineering reviews had failed was not lost on the other engineering groups.) Even more interesting was that this was the analyst's first fault tree which further demonstrates how effective fault tree analysis can be in guiding (i.e., forcing) the analyst to take a methodical approach in evaluating complex systems.
On the generalized VIP time integral methodology for transient thermal problems
NASA Technical Reports Server (NTRS)
Mei, Youping; Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The paper describes the development and applicability of a generalized VIrtual-Pulse (VIP) time integral method of computation for thermal problems. Unlike past approaches for general heat transfer computations, and with the advent of high speed computing technology and the importance of parallel computations for efficient use of computing environments, a major motivation via the developments described in this paper is the need for developing explicit computational procedures with improved accuracy and stability characteristics. As a consequence, a new and effective VIP methodology is described which inherits these improved characteristics. Numerical illustrative examples are provided to demonstrate the developments and validate the results obtained for thermal problems.
A robust optimization methodology for preliminary aircraft design
NASA Astrophysics Data System (ADS)
Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.
2016-05-01
This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
Heuristic evaluation of infusion pumps: implications for patient safety in Intensive Care Units.
Graham, Mark J; Kubose, Tate K; Jordan, Desmond; Zhang, Jiajie; Johnson, Todd R; Patel, Vimla L
2004-11-01
The goal of this research was to use a heuristic evaluation methodology to uncover design and interface deficiencies of infusion pumps that are currently in use in Intensive Care Units (ICUs). Because these infusion systems cannot be readily replaced due to lease agreements and large-scale institutional purchasing procedures, we argue that it is essential to systematically identify the existing usability problems so that the possible causes of errors can be better understood, passed on to the end-users (e.g., critical care nurses), and used to make policy recommendations. Four raters conducted the heuristic evaluation of the three-channel infusion pump interface. Three raters had a cognitive science background as well as experience with the heuristic evaluation methodology. The fourth rater was a veteran critical care nurse who had extensive experience operating the pumps. The usability experts and the domain expert independently evaluated the user interface and physical design of the infusion pump and generated a list of heuristic violations based upon a set of 14 heuristics developed in previous research. The lists were compiled and then rated on the severity of the violation. From 14 usability heuristics considered in this evaluation of the Infusion Pump, there were 231 violations. Two heuristics, "Consistency" and "Language", were found to have the most violations. The one with fewest violations was "Document". While some heuristic evaluation categories had more violations than others, the most severe ones were not confined to one type. The Primary interface location (e.g., where loading the pump, changing doses, and confirming drug settings takes place) had the most occurrences of heuristic violations. We believe that the Heuristic Evaluation methodology provides a simple and cost-effective approach to discovering medical device deficiencies that affect a patient's general well being. While this methodology provides information for the infusion pump designs of the future, it also identifies important insights concerning equipment that is currently in use in critical care environments.
National estimates of Australian gambling prevalence: f indings from a dual-frame omnibus survey.
Dowling, N A; Youssef, G J; Jackson, A C; Pennay, D W; Francis, K L; Pennay, A; Lubman, D I
2016-03-01
The increase in mobile telephone-only households may be a source of bias for traditional landline gambling prevalence surveys. Aims were to: (1) identify Australian gambling participation and problem gambling prevalence using a dual-frame (50% landline and 50% mobile telephone) computer-assisted telephone interviewing methodology; (2) explore the predictors of sample frame and telephone status; and (3) explore the degree to which sample frame and telephone status moderate the relationships between respondent characteristics and problem gambling. A total of 2000 adult respondents residing in Australia were interviewed from March to April 2013. Participation in multiple gambling activities and Problem Gambling Severity Index (PGSI). Estimates were: gambling participation [63.9%, 95% confidence interval (CI) = 61.4-66.3], problem gambling (0.4%, 95% CI = 0.2-0.8), moderate-risk gambling (1.9%, 95% CI = 1.3-2.6) and low-risk gambling (3.0%, 95% CI = 2.2-4.0). Relative to the landline frame, the mobile frame was more likely to gamble on horse/greyhound races [odds ratio (OR) = 1.4], casino table games (OR = 5.0), sporting events (OR = 2.2), private games (OR = 1.9) and the internet (OR = 6.5); less likely to gamble on lotteries (OR = 0.6); and more likely to gamble on five or more activities (OR = 2.4), display problem gambling (OR = 6.4) and endorse PGSI items (OR = 2.4-6.1). Only casino table gambling (OR = 2.9) and internet gambling (OR = 3.5) independently predicted mobile frame membership. Telephone status (landline frame versus mobile dual users and mobile-only users) displayed similar findings. Finally, sample frame and/or telephone status moderated the relationship between gender, relationship status, health and problem gambling (OR = 2.9-7.6). Given expected future increases in the mobile telephone-only population, best practice in population gambling research should use dual frame sampling methodologies (at least 50% landline and 50% mobile telephone) for telephone interviewing. © 2015 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
National estimates of Australian gambling prevalence: f indings from a dual‐frame omnibus survey
Youssef, G. J.; Jackson, A. C.; Pennay, D. W.; Francis, K. L.; Pennay, A.; Lubman, D. I.
2016-01-01
Abstract Background, aims and design The increase in mobile telephone‐only households may be a source of bias for traditional landline gambling prevalence surveys. Aims were to: (1) identify Australian gambling participation and problem gambling prevalence using a dual‐frame (50% landline and 50% mobile telephone) computer‐assisted telephone interviewing methodology; (2) explore the predictors of sample frame and telephone status; and (3) explore the degree to which sample frame and telephone status moderate the relationships between respondent characteristics and problem gambling. Setting and participants A total of 2000 adult respondents residing in Australia were interviewed from March to April 2013. Measurements Participation in multiple gambling activities and Problem Gambling Severity Index (PGSI). Findings Estimates were: gambling participation [63.9%, 95% confidence interval (CI) = 61.4–66.3], problem gambling (0.4%, 95% CI = 0.2–0.8), moderate‐risk gambling (1.9%, 95% CI = 1.3–2.6) and low‐risk gambling (3.0%, 95% CI = 2.2–4.0). Relative to the landline frame, the mobile frame was more likely to gamble on horse/greyhound races [odds ratio (OR) = 1.4], casino table games (OR = 5.0), sporting events (OR = 2.2), private games (OR = 1.9) and the internet (OR = 6.5); less likely to gamble on lotteries (OR = 0.6); and more likely to gamble on five or more activities (OR = 2.4), display problem gambling (OR = 6.4) and endorse PGSI items (OR = 2.4‐6.1). Only casino table gambling (OR = 2.9) and internet gambling (OR = 3.5) independently predicted mobile frame membership. Telephone status (landline frame versus mobile dual users and mobile‐only users) displayed similar findings. Finally, sample frame and/or telephone status moderated the relationship between gender, relationship status, health and problem gambling (OR = 2.9–7.6). Conclusion Given expected future increases in the mobile telephone‐only population, best practice in population gambling research should use dual frame sampling methodologies (at least 50% landline and 50% mobile telephone) for telephone interviewing. PMID:26381314
DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS
Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...
Using soft systems methodology to develop a simulation of out-patient services.
Lehaney, B; Paul, R J
1994-10-01
Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.
Slepoy, A; Peters, M D; Thompson, A P
2007-11-30
Molecular dynamics and other molecular simulation methods rely on a potential energy function, based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering, to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.
Reliability-Productivity Curve, a Tool for Adaptation Measures Identification
NASA Astrophysics Data System (ADS)
Chávez-Jiménez, A.; Granados, A.; Garrote, L. M.
2015-12-01
Due to climate change effects, water scarcity problems would intensify in several regions. These problems are going to impact negatively in the water low-priority demands, since these will be reduced in favor of those with high-priority. An example would be the reduction of agriculture water resources in favor of the urban ones. Then, it is important the evaluation of adaptation measures for a better water resources management. An important tool to face this challenge is the economic valuation of the water demands' impact within a water resources system. In agriculture this valuation is usually performed through the water productivity evaluation. The water productivity evaluation requires detailed information regarding the different crops like the applied technology, the agricultural supplies management, the water availability, etc. This is a restriction for an evaluation at basin scale due to the difficulty of gathers this level of detailed information. Besides, only the water availability is taken into account, but not the period when the water is distributed (i.e. water resources reliability). Water resources reliability is one of the most important variables in water resources management. This research proposes a methodology to determine the agriculture water productivity, using as variables the crops information, the crops price, the water resources availability, and the water resources reliability, at a basin scale. This methodology would allow identifying general water resources adaptation measures, providing the basis for further detailed studies in critical regions.
NASA Astrophysics Data System (ADS)
Nebot, Àngela; Mugica, Francisco
2012-10-01
Fuzzy inductive reasoning (FIR) is a modelling and simulation methodology derived from the General Systems Problem Solver. It compares favourably with other soft computing methodologies, such as neural networks, genetic or neuro-fuzzy systems, and with hard computing methodologies, such as AR, ARIMA, or NARMAX, when it is used to predict future behaviour of different kinds of systems. This paper contains an overview of the FIR methodology, its historical background, and its evolution.
A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.
Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain
2015-10-01
Clustering is a set of techniques of the statistical learning aimed at finding structures of heterogeneous partitions grouping homogenous data called clusters. There are several fields in which clustering was successfully applied, such as medicine, biology, finance, economics, etc. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the data set dimensionality, we base our approach on the Principal Component Analysis (PCA), which is the statistical tool most commonly used in factorial analysis. However, the problems in nature, especially in medicine, are often based on heterogeneous-type qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing to simultaneously handle quantitative and qualitative data. The principle of this approach is to perform a projection of the qualitative variables on the subspaces spanned by quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
Luo, Haoxiang; Mittal, Rajat; Zheng, Xudong; Bielamowicz, Steven A.; Walsh, Raymond J.; Hahn, James K.
2008-01-01
A new numerical approach for modeling a class of flow–structure interaction problems typically encountered in biological systems is presented. In this approach, a previously developed, sharp-interface, immersed-boundary method for incompressible flows is used to model the fluid flow and a new, sharp-interface Cartesian grid, immersed boundary method is devised to solve the equations of linear viscoelasticity that governs the solid. The two solvers are coupled to model flow–structure interaction. This coupled solver has the advantage of simple grid generation and efficient computation on simple, single-block structured grids. The accuracy of the solid-mechanics solver is examined by applying it to a canonical problem. The solution methodology is then applied to the problem of laryngeal aerodynamics and vocal fold vibration during human phonation. This includes a three-dimensional eigen analysis for a multi-layered vocal fold prototype as well as two-dimensional, flow-induced vocal fold vibration in a modeled larynx. Several salient features of the aerodynamics as well as vocal-fold dynamics are presented. PMID:19936017
Adaptive Finite Element Methods for Continuum Damage Modeling
NASA Technical Reports Server (NTRS)
Min, J. B.; Tworzydlo, W. W.; Xiques, K. E.
1995-01-01
The paper presents an application of adaptive finite element methods to the modeling of low-cycle continuum damage and life prediction of high-temperature components. The major objective is to provide automated and accurate modeling of damaged zones through adaptive mesh refinement and adaptive time-stepping methods. The damage modeling methodology is implemented in an usual way by embedding damage evolution in the transient nonlinear solution of elasto-viscoplastic deformation problems. This nonlinear boundary-value problem is discretized by adaptive finite element methods. The automated h-adaptive mesh refinements are driven by error indicators, based on selected principal variables in the problem (stresses, non-elastic strains, damage, etc.). In the time domain, adaptive time-stepping is used, combined with a predictor-corrector time marching algorithm. The time selection is controlled by required time accuracy. In order to take into account strong temperature dependency of material parameters, the nonlinear structural solution a coupled with thermal analyses (one-way coupling). Several test examples illustrate the importance and benefits of adaptive mesh refinements in accurate prediction of damage levels and failure time.
Food insecurity and child behavior problems in fragile families.
King, Christian
2018-02-01
Food insecurity remains a persistent problem in the United States. Several studies have shown that food insecurity is associated with child externalizing and internalizing behavior problems. However, some potential methodological limitations remain. For example, most studies use a household measure of food insecurity while there is evidence that children, especially younger ones, tend to be shielded by their parents from experiencing food insecurity. In addition, the mechanisms through which food insecurity affects children are not well understood. This study uses longitudinal data from the Fragile Families and Child Wellbeing Study to address these limitations. Fixed-effects models show that the association is even larger using a measure of child food insecurity instead of a household one. Correlated-random effects models show a large difference in child behavior problems between food secure and food insecure children due to unobserved heterogeneity. In addition, the association between child food insecurity and child externalizing behaviors remains largely unexplained while food insecurity among adults explains almost all the variation in the association with child internalizing behaviors. Food insecure children and parents are at risk of micronutrient deficiencies, which may lead to behavior problems in young children. These findings underscore the need for greater focus on reducing the risk of food insecurity, especially for children in fragile families, in order to reduce behavior problems and improve their educational attainment. Copyright © 2017 Elsevier B.V. All rights reserved.
Birth-death prior on phylogeny and speed dating
2008-01-01
Background In recent years there has been a trend of leaving the strict molecular clock in order to infer dating of speciations and other evolutionary events. Explicit modeling of substitution rates and divergence times makes formulation of informative prior distributions for branch lengths possible. Models with birth-death priors on tree branching and auto-correlated or iid substitution rates among lineages have been proposed, enabling simultaneous inference of substitution rates and divergence times. This problem has, however, mainly been analysed in the Markov chain Monte Carlo (MCMC) framework, an approach requiring computation times of hours or days when applied to large phylogenies. Results We demonstrate that a hill-climbing maximum a posteriori (MAP) adaptation of the MCMC scheme results in considerable gain in computational efficiency. We demonstrate also that a novel dynamic programming (DP) algorithm for branch length factorization, useful both in the hill-climbing and in the MCMC setting, further reduces computation time. For the problem of inferring rates and times parameters on a fixed tree, we perform simulations, comparisons between hill-climbing and MCMC on a plant rbcL gene dataset, and dating analysis on an animal mtDNA dataset, showing that our methodology enables efficient, highly accurate analysis of very large trees. Datasets requiring a computation time of several days with MCMC can with our MAP algorithm be accurately analysed in less than a minute. From the results of our example analyses, we conclude that our methodology generally avoids getting trapped early in local optima. For the cases where this nevertheless can be a problem, for instance when we in addition to the parameters also infer the tree topology, we show that the problem can be evaded by using a simulated-annealing like (SAL) method in which we favour tree swaps early in the inference while biasing our focus towards rate and time parameter changes later on. Conclusion Our contribution leaves the field open for fast and accurate dating analysis of nucleotide sequence data. Modeling branch substitutions rates and divergence times separately allows us to include birth-death priors on the times without the assumption of a molecular clock. The methodology is easily adapted to take data from fossil records into account and it can be used together with a broad range of rate and substitution models. PMID:18318893
Birth-death prior on phylogeny and speed dating.
Akerborg, Orjan; Sennblad, Bengt; Lagergren, Jens
2008-03-04
In recent years there has been a trend of leaving the strict molecular clock in order to infer dating of speciations and other evolutionary events. Explicit modeling of substitution rates and divergence times makes formulation of informative prior distributions for branch lengths possible. Models with birth-death priors on tree branching and auto-correlated or iid substitution rates among lineages have been proposed, enabling simultaneous inference of substitution rates and divergence times. This problem has, however, mainly been analysed in the Markov chain Monte Carlo (MCMC) framework, an approach requiring computation times of hours or days when applied to large phylogenies. We demonstrate that a hill-climbing maximum a posteriori (MAP) adaptation of the MCMC scheme results in considerable gain in computational efficiency. We demonstrate also that a novel dynamic programming (DP) algorithm for branch length factorization, useful both in the hill-climbing and in the MCMC setting, further reduces computation time. For the problem of inferring rates and times parameters on a fixed tree, we perform simulations, comparisons between hill-climbing and MCMC on a plant rbcL gene dataset, and dating analysis on an animal mtDNA dataset, showing that our methodology enables efficient, highly accurate analysis of very large trees. Datasets requiring a computation time of several days with MCMC can with our MAP algorithm be accurately analysed in less than a minute. From the results of our example analyses, we conclude that our methodology generally avoids getting trapped early in local optima. For the cases where this nevertheless can be a problem, for instance when we in addition to the parameters also infer the tree topology, we show that the problem can be evaded by using a simulated-annealing like (SAL) method in which we favour tree swaps early in the inference while biasing our focus towards rate and time parameter changes later on. Our contribution leaves the field open for fast and accurate dating analysis of nucleotide sequence data. Modeling branch substitutions rates and divergence times separately allows us to include birth-death priors on the times without the assumption of a molecular clock. The methodology is easily adapted to take data from fossil records into account and it can be used together with a broad range of rate and substitution models.
OPUS: Optimal Projection for Uncertain Systems. Volume 1
1991-09-01
unifiedI control- design methodology that directly addresses these technology issues. 1 In particular, optimal projection theory addresses the need for...effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that...characterizing solutions to constrained control- design problems. Transforming OPUS into a practi- cal design methodology requires the development of
Efficacy of a Self-Help Treatment for At-Risk and Pathological Gamblers.
Boudreault, Catherine; Giroux, Isabelle; Jacques, Christian; Goulet, Annie; Simoneau, Hélène; Ladouceur, Robert
2018-06-01
Available evidence suggests that self-help treatments may reduce problem gambling severity but inconsistencies of results across clinical trials leave the extent of their benefits unclear. Moreover, no self-help treatment has yet been validated within a French Canadian setting. The current study therefore assesses the efficacy of a French language self-help treatment including three motivational telephone interviews spread over an 11-week period and a cognitive-behavioral self-help workbook. At-risk and pathological gamblers were randomly assigned to the treatment group (n = 31) or the waiting list (n = 31). Relative to the waiting list, the treatment group showed a statistically significant reduction in the number of DSM-5 gambling disorder criteria met, gambling habits, and gambling consequences at Week 11. Perceived self-efficacy and life satisfaction also significantly improved after 11 weeks for the treatment group, but not for the waiting list group. At Week 11, 13% of participants had dropped out of the study. All significant changes reported for the treatment group were maintained throughout 1, 6 and 12-month follow-ups. Results support the efficacy of the self-help treatment to reduce problem gambling severity, gambling behaviour and to improve overall functioning among a sample of French Canadian problem gamblers over short, medium and long term. Findings from this study lend support to the appropriateness of self-help treatments for problem gamblers and help clarify inconsistencies found in the literature. The low dropout rate is discussed with respect to the advantages of the self-help format. Clinical and methodological implications of the results are put forth.
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
Fuzzy Linear Programming and its Application in Home Textile Firm
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2011-06-01
In this paper, new fuzzy linear programming (FLP) based methodology using a specific membership function, named as modified logistic membership function is proposed. The modified logistic membership function is first formulated and its flexibility in taking up vagueness in parameter is established by an analytical approach. The developed methodology of FLP has provided a confidence in applying to real life industrial production planning problem. This approach of solving industrial production planning problem can have feedback with the decision maker, the implementer and the analyst.
Robust Feedback Control of Flow Induced Structural Radiation of Sound
NASA Technical Reports Server (NTRS)
Heatwole, Craig M.; Bernhard, Robert J.; Franchek, Matthew A.
1997-01-01
A significant component of the interior noise of aircraft and automobiles is a result of turbulent boundary layer excitation of the vehicular structure. In this work, active robust feedback control of the noise due to this non-predictable excitation is investigated. Both an analytical model and experimental investigations are used to determine the characteristics of the flow induced structural sound radiation problem. The problem is shown to be broadband in nature with large system uncertainties associated with the various operating conditions. Furthermore the delay associated with sound propagation is shown to restrict the use of microphone feedback. The state of the art control methodologies, IL synthesis and adaptive feedback control, are evaluated and shown to have limited success for solving this problem. A robust frequency domain controller design methodology is developed for the problem of sound radiated from turbulent flow driven plates. The control design methodology uses frequency domain sequential loop shaping techniques. System uncertainty, sound pressure level reduction performance, and actuator constraints are included in the design process. Using this design method, phase lag was added using non-minimum phase zeros such that the beneficial plant dynamics could be used. This general control approach has application to lightly damped vibration and sound radiation problems where there are high bandwidth control objectives requiring a low controller DC gain and controller order.
Applying Lakatos' Theory to the Theory of Mathematical Problem Solving.
ERIC Educational Resources Information Center
Nunokawa, Kazuhiko
1996-01-01
The relation between Lakatos' theory and issues in mathematics education, especially mathematical problem solving, is investigated by examining Lakatos' methodology of a scientific research program. (AIM)
Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies
López, Julio
2018-01-01
We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data was collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections. PMID:29670667
Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies.
Bosch, Paul; Herrera, Mauricio; López, Julio; Maldonado, Sebastián
2018-01-01
We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data was collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections.
Acceptance and commitment therapy in the treatment of anxiety: a systematic review.
Swain, Jessica; Hancock, Karen; Hainsworth, Cassandra; Bowman, Jenny
2013-12-01
With a lifetime prevalence of approximately 17% among community-dwelling adults, anxiety disorders are among the most pervasive of contemporary psychiatric afflictions. Traditional Cognitive Behaviour Therapy (CBT) is currently the first line evidence-based psychosocial intervention for the treatment of anxiety. Previous research, however, has found that a significant proportion of patients do not respond to traditional CBT or exhibit residual symptomatology at treatment cessation. Additionally, there is a paucity of evidence among child populations and for the comparative effectiveness of alternative interventions. Acceptance and Commitment Therapy (ACT) has a growing empirical base demonstrating its efficacy for an array of problems. A systematic review was conducted to examine the evidence for ACT in the treatment of anxiety. PsycInfo, PsycArticles, PsycExtra, Medline and Proquest databases were searched, reference lists examined and citation searches conducted. Two independent reviewers analysed results, determined study eligibility and assessed methodological quality. Thirty-eight studies met inclusion criteria (total n=323). The spectrum of DSM-IV anxiety disorders as well as test and public speaking anxiety were examined. Studies were predominantly between-group design and case studies, with few employing control comparisons. Several methodological issues limit conclusions; however results provide preliminary support for ACT. Larger scale, methodologically rigorous trials are needed to consolidate these findings. © 2013.
On the Analysis of Two-Person Problem Solving Protocols.
ERIC Educational Resources Information Center
Schoenfeld, Alan H.
Methodological issues in the use of protocol analysis for research into human problem solving processes are examined through a case study in which two students were videotaped as they worked together to solve mathematical problems "out loud." The students' chosen strategic or executive behavior in examining and solving a problem was…
Problem? "No Problem!" Solving Technical Contradictions
ERIC Educational Resources Information Center
Kutz, K. Scott; Stefan, Victor
2007-01-01
TRIZ (pronounced TREES), the Russian acronym for the theory of inventive problem solving, enables a person to focus his attention on finding genuine, potential solutions in contrast to searching for ideas that "may" work through a happenstance way. It is a patent database-backed methodology that helps to reduce time spent on the problem,…
ERIC Educational Resources Information Center
Cormas, Peter C.
2016-01-01
Preservice teachers (N = 27) in two sections of a sequenced, methodological and process integrated mathematics/science course solved a levers problem with three similar learning processes and a problem-solving approach, and identified a problem-solving approach through one different learning process. Similar learning processes used included:…
A TAPS Interactive Multimedia Package to Solve Engineering Dynamics Problem
ERIC Educational Resources Information Center
Sidhu, S. Manjit; Selvanathan, N.
2005-01-01
Purpose: To expose engineering students to using modern technologies, such as multimedia packages, to learn, visualize and solve engineering problems, such as in mechanics dynamics. Design/methodology/approach: A multimedia problem-solving prototype package is developed to help students solve an engineering problem in a step-by-step approach. A…
Hardware proofs using EHDM and the RSRE verification methodology
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Sjogren, Jon A.
1988-01-01
Examined is a methodology for hardware verification developed by Royal Signals and Radar Establishment (RSRE) in the context of the SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and the proofs are improving both the RSRE methodology and the EHDM system.
Multi-disciplinary rehabilitation for acquired brain injury in adults of working age.
Turner-Stokes, Lynne; Pick, Anton; Nair, Ajoy; Disler, Peter B; Wade, Derick T
2015-12-22
Evidence from systematic reviews demonstrates that multi-disciplinary rehabilitation is effective in the stroke population, in which older adults predominate. However, the evidence base for the effectiveness of rehabilitation following acquired brain injury (ABI) in younger adults has not been established, perhaps because this scenario presents different methodological challenges in research. To assess the effects of multi-disciplinary rehabilitation following ABI in adults 16 to 65 years of age. We ran the most recent search on 14 September 2015. We searched the Cochrane Injuries Group Specialised Register, The Cochrane Library, Ovid MEDLINE(R), Ovid MEDLINE(R) In-Process & Other Non-Indexed Citations, Ovid MEDLINE(R) Daily and Ovid OLDMEDLINE(R), Embase Classic+Embase (OvidSP), Web of Science (ISI WOS) databases, clinical trials registers, and we screened reference lists. Randomised controlled trials (RCTs) comparing multi-disciplinary rehabilitation versus routinely available local services or lower levels of intervention; or trials comparing an intervention in different settings, of different intensities or of different timing of onset. Controlled clinical trials were included, provided they met pre-defined methodological criteria. Three review authors independently selected trials and rated their methodological quality. A fourth review author would have arbitrated if consensus could not be reached by discussion, but in fact, this did not occur. As in previous versions of this review, we used the method described by Van Tulder 1997 to rate the quality of trials and to perform a 'best evidence' synthesis by attributing levels of evidence on the basis of methodological quality. Risk of bias assessments were performed in parallel using standard Cochrane methodology. However, the Van Tulder system provided a more discriminative evaluation of rehabilitation trials, so we have continued to use it for our primary synthesis of evidence. We subdivided trials in terms of severity of brain injury, setting and type and timing of rehabilitation offered. We identified a total of 19 studies involving 3480 people. Twelve studies were of good methodological quality and seven were of lower quality, according to the van Tulder scoring system. Within the subgroup of predominantly mild brain injury, 'strong evidence' suggested that most individuals made a good recovery when appropriate information was provided, without the need for additional specific interventions. For moderate to severe injury, 'strong evidence' showed benefit from formal intervention, and 'limited evidence' indicated that commencing rehabilitation early after injury results in better outcomes. For participants with moderate to severe ABI already in rehabilitation, 'strong evidence' revealed that more intensive programmes are associated with earlier functional gains, and 'moderate evidence' suggested that continued outpatient therapy could help to sustain gains made in early post-acute rehabilitation. The context of multi-disciplinary rehabilitation appears to influence outcomes. 'Strong evidence' supports the use of a milieu-oriented model for patients with severe brain injury, in which comprehensive cognitive rehabilitation takes place in a therapeutic environment and involves a peer group of patients. 'Limited evidence' shows that specialist in-patient rehabilitation and specialist multi-disciplinary community rehabilitation may provide additional functional gains, but studies serve to highlight the particular practical and ethical restraints imposed on randomisation of severely affected individuals for whom no realistic alternatives to specialist intervention are available. Problems following ABI vary. Consequently, different interventions and combinations of interventions are required to meet the needs of patients with different problems. Patients who present acutely to hospital with mild brain injury benefit from follow-up and appropriate information and advice. Those with moderate to severe brain injury benefit from routine follow-up so their needs for rehabilitation can be assessed. Intensive intervention appears to lead to earlier gains, and earlier intervention whilst still in emergency and acute care has been supported by limited evidence. The balance between intensity and cost-effectiveness has yet to be determined. Patients discharged from in-patient rehabilitation benefit from access to out-patient or community-based services appropriate to their needs. Group-based rehabilitation in a therapeutic milieu (where patients undergo neuropsychological rehabilitation in a therapeutic environment with a peer group of individuals facing similar challenges) represents an effective approach for patients requiring neuropsychological rehabilitation following severe brain injury. Not all questions in rehabilitation can be addressed by randomised controlled trials or other experimental approaches. For example, trial-based literature does not tell us which treatments work best for which patients over the long term, and which models of service represent value for money in the context of life-long care. In the future, such questions will need to be considered alongside practice-based evidence gathered from large systematic longitudinal cohort studies conducted in the context of routine clinical practice.
How Root Cause Analysis Can Improve the Value Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wixson, James Robert
2002-05-01
Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple, but, rigorous RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can bemore » developed in the creativity phase because the team better understands the problems associated with these functions.« less
Ambient Vibration Testing for Story Stiffness Estimation of a Heritage Timber Building
Min, Kyung-Won; Kim, Junhee; Park, Sung-Ah; Park, Chan-Soo
2013-01-01
This paper investigates dynamic characteristics of a historic wooden structure by ambient vibration testing, presenting a novel estimation methodology of story stiffness for the purpose of vibration-based structural health monitoring. As for the ambient vibration testing, measured structural responses are analyzed by two output-only system identification methods (i.e., frequency domain decomposition and stochastic subspace identification) to estimate modal parameters. The proposed methodology of story stiffness is estimation based on an eigenvalue problem derived from a vibratory rigid body model. Using the identified natural frequencies, the eigenvalue problem is efficiently solved and uniquely yields story stiffness. It is noteworthy that application of the proposed methodology is not necessarily confined to the wooden structure exampled in the paper. PMID:24227999
Object-oriented programming for the biosciences.
Wiechert, W; Joksch, B; Wittig, R; Hartbrich, A; Höner, T; Möllney, M
1995-10-01
The development of software systems for the biosciences is always closely connected to experimental practice. Programs must be able to handle the inherent complexity and heterogeneous structure of biological systems in combination with the measuring equipment. Moreover, a high degree of flexibility is required to treat rapidly changing experimental conditions. Object-oriented methodology seems to be well suited for this purpose. It enables an evolutionary approach to software development that still maintains a high degree of modularity. This paper presents experience with object-oriented technology gathered during several years of programming in the fields of bioprocess development and metabolic engineering. It concentrates on the aspects of experimental support, data analysis, interaction and visualization. Several examples are presented and discussed in the general context of the experimental cycle of knowledge acquisition, thus pointing out the benefits and problems of object-oriented technology in the specific application field of the biosciences. Finally, some strategies for future development are described.
Advanced Computational Modeling Approaches for Shock Response Prediction
NASA Technical Reports Server (NTRS)
Derkevorkian, Armen; Kolaini, Ali R.; Peterson, Lee
2015-01-01
Motivation: (1) The activation of pyroshock devices such as explosives, separation nuts, pin-pullers, etc. produces high frequency transient structural response, typically from few tens of Hz to several hundreds of kHz. (2) Lack of reliable analytical tools makes the prediction of appropriate design and qualification test levels a challenge. (3) In the past few decades, several attempts have been made to develop methodologies that predict the structural responses to shock environments. (4) Currently, there is no validated approach that is viable to predict shock environments overt the full frequency range (i.e., 100 Hz to 10 kHz). Scope: (1) Model, analyze, and interpret space structural systems with complex interfaces and discontinuities, subjected to shock loads. (2) Assess the viability of a suite of numerical tools to simulate transient, non-linear solid mechanics and structural dynamics problems, such as shock wave propagation.
Craig, Hugh; Berretta, Regina; Moscato, Pablo
2016-01-01
In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in graph by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
[Problem-based learning, a strategy to employ it].
Guillamet Lloveras, Ana; Celma Vicente, Matilde; González Carrión, Pilar; Cano-Caballero Gálvez, Ma Dolores; Pérez Ramírez, Francisca
2009-02-01
The Virgen de las Nieves University School of Nursing has adopted the methodology of Problem-Based Learning (ABP in Spanish acronym) as a supplementary method to gain specific transversal competencies. In so doing, all basic required/obligatory subjects necessary for a degree have been partially affected. With the objective of identifying and administering all the structural and cultural barriers which could impede the success or effectiveness of its adoption, a strategic analysis at the School was carried out. This technique was based on a) knowing the strong and weak points the School has for adopting the Problem-Based Learning methodology; b) describing the structural problems and necessities to carry out this teaching innovation; c) to discover the needs professors have regarding knowledge and skills related to Problem-Based Learning; d) to prepare students by informing them about the characteristics of Problem-Based Learning; e) to evaluate the results obtained by means of professor and student opinions, f) to adopt the improvements identified. The stages followed were: strategic analysis, preparation, pilot program, adoption and evaluation.
A Nursing Process Methodology.
ERIC Educational Resources Information Center
Ryan-Wenger, Nancy M.
1990-01-01
A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)
Software production methodology tested project
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1976-01-01
The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.
STUDYING FOREST ROOT SYSTEMS - AN OVERVIEW OF METHODOLOGICAL PROBLEMS
The study of tree root systems is central to understanding forest ecosystem carbon and nutrient cycles, nutrient and water uptake, C allocation patterns by trees, soil microbial populations, adaptation of trees to stress, soil organic matter production, etc. Methodological probl...
[Primary care in maternal-child health].
Pedreira Massa, J L
1986-07-01
The theoretical and methodological elements of primary health care (PHC) include a philosophy of work and an epistemological focus toward the processes of health and illness, as well as a practical medical anthropological knowledge of the culture-specific aspects of disease. The work methodology of PHC requires care of the individual as a bio-psycho-socio-affective being integrated into a particular environment; none of the aspects of being should be neglected or given priority. Care should also be integrated in the sense of providing preventive health care as well as curative and rehabilitative services, in all phases from training of health personnel to record keeping. The primary health care team is multidisciplinary in constitution and interdisciplinary in function. PHC assumes that health care will be accessible to users and that continuity of care will be provided. The need for community participation in all phases of health care has been reiterated in several international health declarations. A well-functioning PHC system will require new types of pre- and postgraduate health education in a changing social and professional system and continuing education under adequate supervision for health workers. Research capability for identifying community health problems, a rigorous evaluation system, and epidemiologic surveillance are also needed. All of these elements are applicable to the field of maternal and child health as well as to PHC. The most appropriate place to intervene in order to correct existing imbalances in access to health care for mothers and children is in the PHC system. Examples of areas that should be stressed include vaccinations, nutrition, psychomotor development, early diagnosis and treatment for handicapped children, prevention of childhood accidents, school health and absenteeism, all aspects of health education, adoption and alternatives to abandonment of children, alcoholism and addiction, adolescent pregnancy and family planning, dental health, and mental problems. Trained primary care pediatricians working within the community as part of the PHC system will be required to confront and solve complex health problems. The training needed does not signify a new speciality or subspeciality, but rather a training methodology and a new type of professional practice.
Morgan, Z; Brugha, T; Fryers, T; Stewart-Brown, S
2012-11-01
Abusive and neglectful parenting is an established determinant of adult mental illness, but longitudinal studies of the impact of less severe problems with parenting have yielded inconsistent findings. In the face of growing interest in mental health promotion, it is important to establish the impact of this potentially remediable risk factor. 8,405 participants in the 1958 UK birth cohort study, and 5,058 in the 1970 birth cohort study questionnaires relating to the quality of relationships with parents completed at age 16 years. 12-item General Health Questionnaire and the Malaise Inventory collected at age 42 years (1958 cohort) and 30 years (1970 cohort). Statistical methodology: logistic regression analyses adjusting for sex, social class and teenage mental health problems. 1958 cohort: relationships with both mother and father predicted mental health problems in adulthood; increasingly poor relationships were associated with increasing mental health problems at age 42 years. 1970 cohort: positive items derived from the Parental Bonding Instrument predicted reduced risk of mental health problems; negative aspects predicted increased risk at age 30 years. Odds of mental health problems were increased between 20 and 80% in fully adjusted models. Results support the hypothesis that problems with parent-child relationships that fall short of abuse and neglect play a part in determining adult mental health and suggest that interventions to support parenting now being implemented in many parts of the Western world may reduce the prevalence of mental illness in adulthood.
A hierarchical methodology for urban facade parsing from TLS point clouds
NASA Astrophysics Data System (ADS)
Li, Zhuqiang; Zhang, Liqiang; Mathiopoulos, P. Takis; Liu, Fangyu; Zhang, Liang; Li, Shuaipeng; Liu, Hao
2017-01-01
The effective and automated parsing of building facades from terrestrial laser scanning (TLS) point clouds of urban environments is an important research topic in the GIS and remote sensing fields. It is also challenging because of the complexity and great variety of the available 3D building facade layouts as well as the noise and data missing of the input TLS point clouds. In this paper, we introduce a novel methodology for the accurate and computationally efficient parsing of urban building facades from TLS point clouds. The main novelty of the proposed methodology is that it is a systematic and hierarchical approach that considers, in an adaptive way, the semantic and underlying structures of the urban facades for segmentation and subsequent accurate modeling. Firstly, the available input point cloud is decomposed into depth planes based on a data-driven method; such layer decomposition enables similarity detection in each depth plane layer. Secondly, the labeling of the facade elements is performed using the SVM classifier in combination with our proposed BieS-ScSPM algorithm. The labeling outcome is then augmented with weak architectural knowledge. Thirdly, least-squares fitted normalized gray accumulative curves are applied to detect regular structures, and a binarization dilation extraction algorithm is used to partition facade elements. A dynamic line-by-line division is further applied to extract the boundaries of the elements. The 3D geometrical façade models are then reconstructed by optimizing facade elements across depth plane layers. We have evaluated the performance of the proposed method using several TLS facade datasets. Qualitative and quantitative performance comparisons with several other state-of-the-art methods dealing with the same facade parsing problem have demonstrated its superiority in performance and its effectiveness in improving segmentation accuracy.
Project management practices in engineering university
NASA Astrophysics Data System (ADS)
Sirazitdinova, Y.; Dulzon, A.; Mueller, B.
2015-10-01
The article presents the analysis of usage of project management methodology in Tomsk Polytechnic University, in particular the experience with the course Project management which started 15 years ago. The article presents the discussion around advantages of project management methodology for engineering education and administration of the university in general and the problems impeding extensive implementation of this methodology in teaching, research and management in the university.
NASA Technical Reports Server (NTRS)
David, J. W.; Mitchell, L. D.
1982-01-01
Difficulties in solution methodology to be used to deal with the potentially higher nonlinear rotor equations when dynamic coupling is included. A solution methodology is selected to solve the nonlinear differential equations. The selected method was verified to give good results even at large nonlinearity levels. The transfer matrix methodology is extended to the solution of nonlinear problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krummel, J.R.; Markin, J.B.; O'Neill, R.V.
Regional analyses of the interaction between human populations and natural resources must integrate landscape scale environmental problems. An approach that considers human culture, environmental processes, and resource needs offers an appropriate methodology. With this methodology, we analyze problems of food availability in African cattle-keeping societies. The analysis interrelates cattle biomass, forage availability, milk and blood production, crop yields, gathering, food subsidies, population, and variable precipitation. While an excess of cattle leads to overgrazing, cattle also serve as valuable food storage mechanisms during low rainfall periods. Food subsidies support higher population levels but do not alter drought-induced population fluctuations. Variable precipitationmore » patterns require solutions that stabilize year-to-year food production and also address problems of overpopulation.« less
Automating the packing heuristic design process with genetic programming.
Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John
2012-01-01
The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
An assessment of the potential of PFEM-2 for solving long real-time industrial applications
NASA Astrophysics Data System (ADS)
Gimenez, Juan M.; Ramajo, Damián E.; Márquez Damián, Santiago; Nigro, Norberto M.; Idelsohn, Sergio R.
2017-07-01
The latest generation of the particle finite element method (PFEM-2) is a numerical method based on the Lagrangian formulation of the equations, which presents advantages in terms of robustness and efficiency over classical Eulerian methodologies when certain kind of flows are simulated, especially those where convection plays an important role. These situations are often encountered in real engineering problems, where very complex geometries and operating conditions require very large and long computations. The advantages that the parallelism introduced in the computational fluid dynamics making affordable computations with very fine spatial discretizations are well known. However, it is not possible to have the time parallelized, despite the effort that is being dedicated to use space-time formulations. In this sense, PFEM-2 adds a valuable feature in that its strong stability with little loss of accuracy provides an interesting way of satisfying the real-life computation needs. After having already demonstrated in previous publications its ability to achieve academic-based solutions with a good compromise between accuracy and efficiency, in this work, the method is revisited and employed to solve several nonacademic problems of technological interest, which fall into that category. Simulations concerning oil-water separation, waste-water treatment, metallurgical foundries, and safety assessment are presented. These cases are selected due to their particular requirements of long simulation times and or intensive interface treatment. Thus, large time-steps may be employed with PFEM-2 without compromising the accuracy and robustness of the simulation, as occurs with Eulerian alternatives, showing the potentiality of the methodology for solving not only academic tests but also real engineering problems.
Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T H; Seppelt, Ralf
2010-12-01
This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, the appropriate and methodological stakeholder interaction and the definition of 'what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.
NASA Astrophysics Data System (ADS)
Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T. H.; Seppelt, Ralf
2010-12-01
This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, the appropriate and methodological stakeholder interaction and the definition of `what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.
Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan
2016-10-01
A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long term reachability of the control objectives by the fuzzy logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting the long-term influent disturbances, and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested, and showed robustness, against measurement noise levels typical for wastewater sensors. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Riasi, S.; Huang, G.; Montemagno, C.; Yeghiazarian, L.
2013-12-01
Micro-scale modeling of multiphase flow in porous media is critical to characterize porous materials. Several modeling techniques have been implemented to date, but none can be used as a general strategy for all porous media applications due to challenges presented by non-smooth high-curvature solid surfaces, and by a wide range of pore sizes and porosities. Finite approaches like the finite volume method require a high quality, problem-dependent mesh, while particle-based approaches like the lattice Boltzmann require too many particles to achieve a stable meaningful solution. Both come at a large computational cost. Other methods such as pore network modeling (PNM) have been developed to accelerate the solution process by simplifying the solution domain, but so far a unique and straightforward methodology to implement PNM is lacking. We have developed a general, stable and fast methodology to model multi-phase fluid flow in porous materials, irrespective of their porosity and solid phase topology. We have applied this methodology to highly porous fibrous materials in which void spaces are not distinctly separated, and where simplifying the geometry into a network of pore bodies and throats, as in PNM, does not result in a topology-consistent network. To this end, we have reduced the complexity of the 3-D void space geometry by working with its medial surface. We have used a non-iterative fast medial surface finder algorithm to determine a voxel-wide medial surface of the void space, and then solved the quasi-static drainage and imbibition on the resulting domain. The medial surface accurately represents the topology of the porous structure including corners, irregular cross sections, etc. This methodology is capable of capturing corner menisci and the snap-off mechanism numerically. It also allows for calculation of pore size distribution, permeability and capillary pressure-saturation-specific interfacial area surface of the porous structure. To show the capability of this method to numerically estimate the capillary pressure in irregular cross sections, we compared our results with analytical solutions available for capillary tubes with non-circular cross sections. We also validated this approach by implementing it on well-known benchmark problems such as a bundle of cylinders and packed spheres.
The Dogma of "The" Scientific Method.
ERIC Educational Resources Information Center
Wivagg, Dan; Allchin, Douglas
2002-01-01
Points out major problems with the scientific method as a model for learning about methodology in science and suggests teaching about the scientists' toolbox to remedy problems with the conventional scientific method. (KHR)
Fuzzy multi objective transportation problem – evolutionary algorithm approach
NASA Astrophysics Data System (ADS)
Karthy, T.; Ganesan, K.
2018-04-01
This paper deals with fuzzy multi objective transportation problem. An fuzzy optimal compromise solution is obtained by using Fuzzy Genetic Algorithm. A numerical example is provided to illustrate the methodology.
NASA Technical Reports Server (NTRS)
Newman, James C., III
1995-01-01
The limiting factor in simulating flows past realistic configurations of interest has been the discretization of the physical domain on which the governing equations of fluid flow may be solved. In an attempt to circumvent this problem, many Computational Fluid Dynamic (CFD) methodologies that are based on different grid generation and domain decomposition techniques have been developed. However, due to the costs involved and expertise required, very few comparative studies between these methods have been performed. In the present work, the two CFD methodologies which show the most promise for treating complex three-dimensional configurations as well as unsteady moving boundary problems are evaluated. These are namely the structured-overlapped and the unstructured grid schemes. Both methods use a cell centered, finite volume, upwind approach. The structured-overlapped algorithm uses an approximately factored, alternating direction implicit scheme to perform the time integration, whereas, the unstructured algorithm uses an explicit Runge-Kutta method. To examine the accuracy, efficiency, and limitations of each scheme, they are applied to the same steady complex multicomponent configurations and unsteady moving boundary problems. The steady complex cases consist of computing the subsonic flow about a two-dimensional high-lift multielement airfoil and the transonic flow about a three-dimensional wing/pylon/finned store assembly. The unsteady moving boundary problems are a forced pitching oscillation of an airfoil in a transonic freestream and a two-dimensional, subsonic airfoil/store separation sequence. Accuracy was accessed through the comparison of computed and experimentally measured pressure coefficient data on several of the wing/pylon/finned store assembly's components and at numerous angles-of-attack for the pitching airfoil. From this study, it was found that both the structured-overlapped and the unstructured grid schemes yielded flow solutions of comparable accuracy for these simulations. This study also indicated that, overall, the structured-overlapped scheme was slightly more CPU efficient than the unstructured approach.
Review of the use of povidone-iodine (PVP-I) in the treatment of burns.
Steen, M
1993-01-01
Local infection and burn wound sepsis are one of the most severe problems in the treatment of thermally injured patients. Early surgical treatment and the use of topical antiseptics led to a decrease in the infection rate and significantly improved the survival rate of burns patients within the last twenty-five years. Many antiseptics are used in the treatment of burns. Silver nitrate, silver sulphadiazine, sulfamylon and povidone-iodine (PVP-I) are the most common substances used worldwide in burn care facilities. Clinical studies demonstrate that treatment with PVP-I is the most effective against bacterial and fungal infection. Several methodological problems however arise from direct comparison between these antiseptics, and local and systemic adverse effects can make the right choice difficult. Some case reports documented possible side effects in the treatment of patients with PVP-I, leading to general concerns about this treatment. Absorption of iodine and possible changes in thyroid hormones are well known, but evaluation of the clinical consequences is controversial. Reports of severe metabolic acidosis and renal insufficiency with lethal results have condemned the use of PVP-I in the treatment of extensive burns. The case reports, however, dealt with patients suffering from general morbidity and sepsis and therefore these single reports may not be generally valid. Local treatment of burns may cause further problems. The beneficial effect of a decrease of bacterial counts in deeper tissue may be confounded by other effects delaying wound healing, as shown in some experimental studies. Controlled clinical investigations on burn patients however are still missing. The paper will discuss these topics in detail referring to the treatment of burns with PVP-I. It is based on a critical review of the literature and the author's own experience in burns therapy.
Robust optimization modelling with applications to industry and environmental problems
NASA Astrophysics Data System (ADS)
Chaerani, Diah; Dewanto, Stanley P.; Lesmana, Eman
2017-10-01
Robust Optimization (RO) modeling is one of the existing methodology for handling data uncertainty in optimization problem. The main challenge in this RO methodology is how and when we can reformulate the robust counterpart of uncertain problems as a computationally tractable optimization problem or at least approximate the robust counterpart by a tractable problem. Due to its definition the robust counterpart highly depends on how we choose the uncertainty set. As a consequence we can meet this challenge only if this set is chosen in a suitable way. The development on RO grows fast, since 2004, a new approach of RO called Adjustable Robust Optimization (ARO) is introduced to handle uncertain problems when the decision variables must be decided as a ”wait and see” decision variables. Different than the classic Robust Optimization (RO) that models decision variables as ”here and now”. In ARO, the uncertain problems can be considered as a multistage decision problem, thus decision variables involved are now become the wait and see decision variables. In this paper we present the applications of both RO and ARO. We present briefly all results to strengthen the importance of RO and ARO in many real life problems.
Pedagogy and/or technology: Making difference in improving students' problem solving skills
NASA Astrophysics Data System (ADS)
Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.
2013-01-01
Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.
Evaluating Writing Programs: Paradigms, Problems, Possibilities.
ERIC Educational Resources Information Center
McLeod, Susan H.
1992-01-01
Describes two methodological approaches (qualitative and quantitative) that grow out of two different research examples. Suggests the problems these methods present. Discusses the ways in which an awareness of these problems can help teachers to understand how to work with researchers in designing useful evaluations of writing programs. (PRA)
[Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].
2012-01-01
The article covers topical problems of workers' health preservation. Complex research results enabled to evaluate and analyze occupational risks in leading industries of Kazakhstan, for improving scientific and methodologic approaches to medical management for workers subjected to hazardous conditions.
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Terminology and Methodology Related to the Use of Heart Rate Responsivity in Infancy Research
ERIC Educational Resources Information Center
Woodcock, James M.
1971-01-01
Methodological problems in measuring and interpreting infantile heart rate reactivity in research are discussed. Various ways of describing cardiac activity are listed. Attention is given to the relationship between resting state and heart rate responsivity. (Author/WY)
Employee Turnover: An Empirical and Methodological Assessment.
ERIC Educational Resources Information Center
Muchinsky, Paul M.; Tuttle, Mark L.
1979-01-01
Reviews research on the prediction of employee turnover. Groups predictor variables into five general categories: attitudinal (job satisfaction), biodata, work-related, personal, and test-score predictors. Consistent relationships between common predictor variables and turnover were found for four categories. Eight methodological problems/issues…
ERIC Educational Resources Information Center
And Others; Rynders, John E.
1978-01-01
For many years, the educational capabilities of Down's syndrome persons have been underestimated because a large number of studies purporting to give an accurate picture of Down's syndrome persons' developmental capabilities have had serious methodological flaws. (Author)
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, a distribution of multiple products into oil pipeline subject to delivery time-windows constraints. A multiple-product oil pipeline is a pipeline system composing of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling areas, are incorporated in this study. The distribution of multiple products into oil pipeline subject to delivery time-windows is modeled as multicommodity network flow structure and mathematically formulated. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems and also providing solution methodology to compute input schedule that yields minimum total time violation from due delivery time-windows. The problem is proved to be NP-complete. The heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute input schedule for the pipeline problem. This algorithm is implemented in no longer than O(T·E) time. This dissertation also extends the study to examine some operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced. This algorithm can also be implemented in no longer than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that reversed-flow algorithms provide good solutions in comparison with the optimal solutions. Only 25% of the problems tested were more than 30% greater than optimal values and approximately 40% of the tested problems were solved optimally by the algorithms.
NASA Astrophysics Data System (ADS)
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
Radial rescaling approach for the eigenvalue problem of a particle in an arbitrarily shaped box.
Lijnen, Erwin; Chibotaru, Liviu F; Ceulemans, Arnout
2008-01-01
In the present work we introduce a methodology for solving a quantum billiard with Dirichlet boundary conditions. The procedure starts from the exactly known solutions for the particle in a circular disk, which are subsequently radially rescaled in such a way that they obey the new boundary conditions. In this way one constructs a complete basis set which can be used to obtain the eigenstates and eigenenergies of the corresponding quantum billiard to a high level of precision. Test calculations for several regular polygons show the efficiency of the method which often requires one or two basis functions to describe the lowest eigenstates with high accuracy.
Case mix reimbursement for nursing homes.
Schlenker, R E
1986-01-01
Nursing home care is growing in importance as the population ages and as Medicare's prospective payment system encourages earlier discharges from acute care settings to nursing homes. Nursing home reimbursement policy is primarily a Medicaid issue, since Medicaid pays for about half the nation's nursing home care. The research reviewed in this article suggests a strong association between case mix and cost, and a weaker but still positive association between quality and cost. The research also implies that traditional nursing home reimbursement methodologies may impede access and may lower quality for Medicaid (and Medicare) recipients. To offset these problems, several states have recently begun to incorporate case mix directly into the reimbursement process. These systems deserve careful policy consideration.
Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A
2017-12-01
Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.
Tereno, Susana; Guedeney, Nicole; Dugravier, Romain; Greacen, Tim; Saïas, Thomas; Tubach, Florence; Guédeney, Antoine
2013-06-01
Attachment is a long-term emotional link between infants and their mothers. Attachment quality influences subsequent psychosocial relationships, the ability to manage stress and, consequently, later mental health. Home intervention programmes targeting infant attachment have been implemented in several contexts with varying degrees of efficacy. Within the CAPEDP study (Parental Skills and Attachment in Early Childhood: reduction of risks linked to mental health problems and promotion of resilience), a subsample of 120 families were recruited with the objective of assessing the impact of this home-visiting programme on infant attachment organisation using the Strange Situation Procedure. The present paper describes the methodology used in this ancillary study.
Comparing the Correlation Length of Grain Markets in China and France
NASA Astrophysics Data System (ADS)
Roehner, Bertrand M.; Shiue, Carol H.
In economics, comparative analysis plays the same role as experimental research in physics. In this paper, we closely examine several methodological problems related to comparative analysis by investigating the specific example of grain markets in China and France respectively. This enables us to answer a question in economic history which has so far remained pending, namely whether or not market integration progressed in the 18th century. In economics as in physics, before any new result being accepted, it has to be checked and re-checked by different researchers. This is what we call the replication and comparison procedures. We show how these procedures should (and can) be implemented.
Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
2018-04-10
A lot of models have been made for predicting software reliability. The reliability models are restricted to using particular types of methodologies and restricted number of parameters. There are a number of techniques and methodologies that may be used for reliability prediction. There is need to focus on parameters consideration while estimating reliability. The reliability of a system may increase or decreases depending on the selection of different parameters used. Thus there is need to identify factors that heavily affecting the reliability of the system. In present days, reusability is mostly used in the various area of research. Reusability is the basis of Component-Based System (CBS). The cost, time and human skill can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess those techniques which are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities are available to apply soft computing techniques in medicine related problems. Clinical science of medicine using fuzzy-logic, neural network methodology significantly while basic science of medicine using neural-networks-genetic algorithm most frequently and preferably. There is unavoidable interest shown by medical scientists to use the various soft computing methodologies in genetics, physiology, radiology, cardiology and neurology discipline. CBSE boost users to reuse the past and existing software for making new products to provide quality with a saving of time, memory space, and money. This paper focused on assessment of commonly used soft computing technique like Genetic Algorithm (GA), Neural-Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents working of soft computing techniques and assessment of soft computing techniques to predict reliability. The parameter considered while estimating and prediction of reliability are also discussed. This study can be used in estimation and prediction of the reliability of various instruments used in the medical system, software engineering, computer engineering and mechanical engineering also. These concepts can be applied to both software and hardware, to predict the reliability using CBSE.
Alonso, Jordi; Vilagut, Gemma; Chatterji, Somnath; Heeringa, Steven; Schoenbaum, Michael; Üstün, T. Bedirhan; Rojas-Farreras, Sonia; Angermeyer, Matthias; Bromet, Evelyn; Bruffaerts, Ronny; de Girolamo, Giovanni; Gureje, Oye; Haro, Josep Maria; Karam, Aimee N.; Kovess, Viviane; Levinson, Daphna; Liu, Zhaorui; Mora, Maria Elena Medina; Ormel, J.; Posada-Villa, Jose; Uda, Hidenori; Kessler, Ronald C.
2010-01-01
Background The methodology commonly used to estimate disease burden, featuring ratings of severity of individual conditions, has been criticized for ignoring comorbidity. A methodology that addresses this problem is proposed and illustrated here with data from the WHO World Mental Health Surveys. Although the analysis is based on self-reports about one’s own conditions in a community survey, the logic applies equally well to analysis of hypothetical vignettes describing comorbid condition profiles. Methods Face-to-face interviews in 13 countries (six developing, nine developed; n = 31,067; response rate = 69.6%) assessed 10 classes of chronic physical and 9 of mental conditions. A visual analog scale (VAS) was used to assess overall perceived health. Multiple regression analysis with interactions for comorbidity was used to estimate associations of conditions with VAS. Simulation was used to estimate condition-specific effects. Results The best-fitting model included condition main effects and interactions of types by numbers of conditions. Neurological conditions, insomnia, and major depression were rated most severe. Adjustment for comorbidity reduced condition-specific estimates with substantial between-condition variation (.24–.70 ratios of condition-specific estimates with and without adjustment for comorbidity). The societal-level burden rankings were quite different from the individual-level rankings, with the highest societal-level rankings associated with conditions having high prevalence rather than high individual-level severity. Conclusions Plausible estimates of disorder-specific effects on VAS can be obtained using methods that adjust for comorbidity. These adjustments substantially influence condition-specific ratings. PMID:20553636
Rosati, Alexandra G; Warneken, Felix
2016-06-01
We recently reported a study (Warneken & Rosati Proceedings of the Royal Society B, 282, 20150229, 2015) examining whether chimpanzees possess several cognitive capacities that are critical to engage in cooking. In a subsequent commentary, Beran, Hopper, de Waal, Sayers, and Brosnan Learning & Behavior (2015) asserted that our paper has several flaws. Their commentary (1) critiques some aspects of our methodology and argues that our work does not constitute evidence that chimpanzees can actually cook; (2) claims that these results are old news, as previous work had already demonstrated that chimpanzees possess most or all of these capacities; and, finally, (3) argues that comparative psychological studies of chimpanzees cannot adequately address questions about human evolution, anyway. However, their critique of the premise of our study simply reiterates several points we made in the original paper. To quote ourselves: "As chimpanzees neither control fire nor cook food in their natural behavior, these experiments therefore focus not on whether chimpanzees can actually cook food, but rather whether they can apply their cognitive skills to novel problems that emulate cooking" (Warneken & Rosati Proceedings of the Royal Society B, 282, 20150229, 2015, p. 2). Furthermore, the methodological issues they raise are standard points about psychological research with animals-many of which were addressed synthetically across our 9 experiments, or else are orthogonal to our claims. Finally, we argue that comparative studies of extant apes (and other nonhuman species) are a powerful and indispensable method for understanding human cognitive evolution.
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
Decision Support Model for Optimal Management of Coastal Gate
NASA Astrophysics Data System (ADS)
Ditthakit, Pakorn; Chittaladakorn, Suwatana
2010-05-01
The coastal areas are intensely settled by human beings owing to their fertility of natural resources. However, at present those areas are facing with water scarcity problems: inadequate water and poor water quality as a result of saltwater intrusion and inappropriate land-use management. To solve these problems, several measures have been exploited. The coastal gate construction is a structural measure widely performed in several countries. This manner requires the plan for suitably operating coastal gates. Coastal gate operation is a complicated task and usually concerns with the management of multiple purposes, which are generally conflicted one another. This paper delineates the methodology and used theories for developing decision support modeling for coastal gate operation scheduling. The developed model was based on coupling simulation and optimization model. The weighting optimization technique based on Differential Evolution (DE) was selected herein for solving multiple objective problems. The hydrodynamic and water quality models were repeatedly invoked during searching the optimal gate operations. In addition, two forecasting models:- Auto Regressive model (AR model) and Harmonic Analysis model (HA model) were applied for forecasting water levels and tide levels, respectively. To demonstrate the applicability of the developed model, it was applied to plan the operations for hypothetical system of Pak Phanang coastal gate system, located in Nakhon Si Thammarat province, southern part of Thailand. It was found that the proposed model could satisfyingly assist decision-makers for operating coastal gates under various environmental, ecological and hydraulic conditions.
RBT-GA: a novel metaheuristic for solving the Multiple Sequence Alignment problem.
Taheri, Javid; Zomaya, Albert Y
2009-07-07
Multiple Sequence Alignment (MSA) has always been an active area of research in Bioinformatics. MSA is mainly focused on discovering biologically meaningful relationships among different sequences or proteins in order to investigate the underlying main characteristics/functions. This information is also used to generate phylogenetic trees. This paper presents a novel approach, namely RBT-GA, to solve the MSA problem using a hybrid solution methodology combining the Rubber Band Technique (RBT) and the Genetic Algorithm (GA) metaheuristic. RBT is inspired by the behavior of an elastic Rubber Band (RB) on a plate with several poles, which is analogues to locations in the input sequences that could potentially be biologically related. A GA attempts to mimic the evolutionary processes of life in order to locate optimal solutions in an often very complex landscape. RBT-GA is a population based optimization algorithm designed to find the optimal alignment for a set of input protein sequences. In this novel technique, each alignment answer is modeled as a chromosome consisting of several poles in the RBT framework. These poles resemble locations in the input sequences that are most likely to be correlated and/or biologically related. A GA-based optimization process improves these chromosomes gradually yielding a set of mostly optimal answers for the MSA problem. RBT-GA is tested with one of the well-known benchmarks suites (BALiBASE 2.0) in this area. The obtained results show that the superiority of the proposed technique even in the case of formidable sequences.
Development of Contemporary Problem-Based Learning Projects in Particle Technology
ERIC Educational Resources Information Center
Harris, Andrew T.
2009-01-01
The University of Sydney has offered an undergraduate course in particle technology using a contemporary problem based learning (PBL) methodology since 2005. Student learning is developed through the solution of complex, open-ended problems drawn from modern chemical engineering practice. Two examples are presented; i) zero emission electricity…
The Study of Socio-Biospheric Problems.
ERIC Educational Resources Information Center
Scott, Andrew M.
Concepts, tools, and a methodology are needed which will permit the analysis of emergent socio-biospheric problems and facilitate their effective management. Many contemporary problems may be characterized as socio-biospheric; for example, pollution of the seas, acid rain, the growth of cities, and an atmosphere loaded with carcinogens. However,…
Atwood's Machine as a Tool to Introduce Variable Mass Systems
ERIC Educational Resources Information Center
de Sousa, Celia A.
2012-01-01
This article discusses an instructional strategy which explores eventual similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the…
Methodology of modeling and measuring computer architectures for plasma simulations
NASA Technical Reports Server (NTRS)
Wang, L. P. T.
1977-01-01
A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.
ERIC Educational Resources Information Center
Mosher, Paul H.
1979-01-01
Reviews the history, literature, and methodology of collection evaluation or assessment in American research libraries; discusses current problems, tools, and methodology of evaluation; and describes an ongoing collection evaluation program at the Stanford University Libraries. (Author/MBR)
NASA Astrophysics Data System (ADS)
Ghosh, Sukanya; Roy, Souvanic; Sanyal, Manas Kumar
2016-09-01
With the help of a case study, the article has explored current practices of implementation of governmental affordable housing programme for urban poor in a slum of India. This work shows that the issues associated with the problems of governmental affordable housing programme has to be addressed to with a suitable methodology as complexities are not only dealing with quantitative data but qualitative data also. The Hard System Methodologies (HSM), which is conventionally applied to address the issues, deals with real and known problems which can be directly solved. Since most of the issues of affordable housing programme as found in the case study are subjective and complex in nature, Soft System Methodology (SSM) has been tried for better representation from subjective points of views. The article explored drawing of Rich Picture as an SSM approach for better understanding and analysing complex issues and constraints of affordable housing programme so that further exploration of the issues is possible.
Expert System Development Methodology (ESDM)
NASA Technical Reports Server (NTRS)
Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.
1990-01-01
The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because is has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.
Udod, Sonia A; Racine, Louise
2017-12-01
To draw on the findings of a grounded theory study aimed at exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine and detailed analysis of the process underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory by (Corbin & Strauss, , ) as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful to address real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice. The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that are meaningful to nurses. © 2017 John Wiley & Sons Ltd.
Kodra, Yllka; Kondili, Loreta A; Ferraroni, Alessia; Serra, Maria Antonietta; Caretto, Flavia; Ricci, Maria Antonietta; Taruscio, Domenica
2016-01-01
Prader-Willi syndrome (PWS) is a rare genetic disorder characterized by severe hypotonia during the neonatal period and the first two years of life, the onset of hyperphagia with a risk of obesity during infancy and adulthood, learning difficulties and behavioral or severe psychiatric problems. This complex disease has severe consequences and difficult management issues also for patients' families. Parents of children with PWS need appropriate psychoeducational intervention in order to better manage their children with PWS. The purpose of this study was the implementation and evaluation of a PWS psychoeducational parent training program. The Italian National Center for Rare Diseases implemented a pilot parent training program offered to parents of children with PWS. The intervention's effects was evaluated using questionnaires comprised of 11 items rated on a 7 point Likert scale. The intervention was offered to 43 parents. The behavior problems management, dietary restrictions, autonomy and relationships were indicated by parents as the priority topics which needed to be addressed. Evaluations, immediately post-intervention and after 6 months, were reported by parents, fulfilling specific questionnaires. 90% of parents involved in the study, appreciated the methodology, 86% felt more informed about PWS, 47-62% felt more capable to better approach behaviour's problems, 20-25% felt better about the child's health situation and future expectations. Feeling more capable to help the child autonomy and relationships were reported in 62% and 63% of parents respectively, which decreased significantly (p < 0.05) according to the evaluation 6 months after the intervention. Younger age of parents (< 44 years of age) was significantly correlated with better understanding on how to help the child's autonomy (OR: 0.05; CI: 0.04-0.8) and to better collaborate with the child's teachers (OR: 0.02; CI: 0.001-0.9). Parent training is a promising intervention for parents of children with behavior's problems. Interventions with a behaviorally oriented program, addressed to parents of PWS affected children, is a useful tool in increasing their ability to manage the problems related to the disease.
a New Hybrid Yin-Yang Swarm Optimization Algorithm for Uncapacitated Warehouse Location Problems
NASA Astrophysics Data System (ADS)
Heidari, A. A.; Kazemizade, O.; Hakimpour, F.
2017-09-01
Yin-Yang-pair optimization (YYPO) is one of the latest metaheuristic algorithms (MA) proposed in 2015 that tries to inspire the philosophy of balance between conflicting concepts. Particle swarm optimizer (PSO) is one of the first population-based MA inspired by social behaviors of birds. In spite of PSO, the YYPO is not a nature inspired optimizer. It has a low complexity and starts with only two initial positions and can produce more points with regard to the dimension of target problem. Due to unique advantages of these methodologies and to mitigate the immature convergence and local optima (LO) stagnation problems in PSO, in this work, a continuous hybrid strategy based on the behaviors of PSO and YYPO is proposed to attain the suboptimal solutions of uncapacitated warehouse location (UWL) problems. This efficient hierarchical PSO-based optimizer (PSOYPO) can improve the effectiveness of PSO on spatial optimization tasks such as the family of UWL problems. The performance of the proposed PSOYPO is verified according to some UWL benchmark cases. These test cases have been used in several works to evaluate the efficacy of different MA. Then, the PSOYPO is compared to the standard PSO, genetic algorithm (GA), harmony search (HS), modified HS (OBCHS), and evolutionary simulated annealing (ESA). The experimental results demonstrate that the PSOYPO can reveal a better or competitive efficacy compared to the PSO and other MA.
Dowling, N A; Merkouris, S S; Greenwood, C J; Oldenhof, E; Toumbourou, J W; Youssef, G J
2017-02-01
This systematic review aimed to identify early risk and protective factors (in childhood, adolescence or young adulthood) longitudinally associated with the subsequent development of gambling problems. A systematic search of peer-reviewed and grey literature from 1990 to 2015 identified 15 studies published in 23 articles. Meta-analyses quantified the effect size of 13 individual risk factors (alcohol use frequency, antisocial behaviours, depression, male gender, cannabis use, illicit drug use, impulsivity, number of gambling activities, problem gambling severity, sensation seeking, tobacco use, violence, undercontrolled temperament), one relationship risk factor (peer antisocial behaviours), one community risk factor (poor academic performance), one individual protective factor (socio-economic status) and two relationship protective factors (parent supervision, social problems). Effect sizes were on average small to medium and sensitivity analyses revealed that the results were generally robust to the quality of methodological approaches of the included articles. These findings highlight the need for global prevention efforts that reduce risk factors and screen young people with high-risk profiles. There is insufficient investigation of protective factors to adequately guide prevention initiatives. Future longitudinal research is required to identify additional risk and protective factors associated with problem gambling, particularly within the relationship, community, and societal levels of the socio-ecological model. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
The Unintended Consequences of Social Media in Healthcare: New Problems and New Solutions
Atique, S.; Mayer, M. A.; Denecke, K.; Merolli, M.; Househ, M.
2016-01-01
Summary Objectives Social media is increasingly being used in conjunction with health information technology (health IT). The objective of this paper is to identify some of the undesirable outcomes that arise from this integration and to suggest solutions to these problems. Methodology After a discussion with experts to elicit the topics that should be included in the survey, we performed a narrative review based on recent literature and interviewed multidisciplinary experts from different areas. In each case, we identified and analyzed the unintended effects of social media in health IT. Results Each analyzed topic provided a different set of unintended consequences. Most relevant consequences include lack of privacy with ethical and legal issues, patient confusion in disease management, poor information accuracy in crowdsourcing, unclear responsibilities, misleading and biased information in the prevention and detection of epidemics, and demotivation in gamified health solutions with social components. Conclusions Using social media in healthcare offers several benefits, but it is not exempt of potential problems, and not all of these problems have clear solutions. We recommend careful design of digital systems in order to minimize patient’s feelings of demotivation and frustration and we recommend following specific guidelines that should be created by all stakeholders in the healthcare ecosystem. PMID:27830230
Prediction of invasion from the early stage of an epidemic
Pérez-Reche, Francisco J.; Neri, Franco M.; Taraskin, Sergei N.; Gilligan, Christopher A.
2012-01-01
Predictability of undesired events is a question of great interest in many scientific disciplines including seismology, economy and epidemiology. Here, we focus on the predictability of invasion of a broad class of epidemics caused by diseases that lead to permanent immunity of infected hosts after recovery or death. We approach the problem from the perspective of the science of complexity by proposing and testing several strategies for the estimation of important characteristics of epidemics, such as the probability of invasion. Our results suggest that parsimonious approximate methodologies may lead to the most reliable and robust predictions. The proposed methodologies are first applied to analysis of experimentally observed epidemics: invasion of the fungal plant pathogen Rhizoctonia solani in replicated host microcosms. We then consider numerical experiments of the susceptible–infected–removed model to investigate the performance of the proposed methods in further detail. The suggested framework can be used as a valuable tool for quick assessment of epidemic threat at the stage when epidemics only start developing. Moreover, our work amplifies the significance of the small-scale and finite-time microcosm realizations of epidemics revealing their predictive power. PMID:22513723
NASA Astrophysics Data System (ADS)
Ohlsson, Stellan; Cosejo, David G.
2014-07-01
The problem of how people process novel and unexpected information— deep learning (Ohlsson in Deep learning: how the mind overrides experience. Cambridge University Press, New York, 2011)—is central to several fields of research, including creativity, belief revision, and conceptual change. Researchers have not converged on a single theory for conceptual change, nor has any one theory been decisively falsified. One contributing reason is the difficulty of collecting informative data in this field. We propose that the commonly used methodologies of historical analysis, classroom interventions, and developmental studies, although indispensible, can be supplemented with studies of laboratory models of conceptual change. We introduce re- categorization, an experimental paradigm in which learners transition from one definition of a categorical concept to another, incompatible definition of the same concept, a simple form of conceptual change. We describe a re-categorization experiment, report some descriptive findings pertaining to the effects of category complexity, the temporal unfolding of learning, and the nature of the learner's final knowledge state. We end with a brief discussion of ways in which the re-categorization model can be improved.
Sum-of-Squares-Based Region of Attraction Analysis for Gain-Scheduled Three-Loop Autopilot
NASA Astrophysics Data System (ADS)
Seo, Min-Won; Kwon, Hyuck-Hoon; Choi, Han-Lim
2018-04-01
A conventional method of designing a missile autopilot is to linearize the original nonlinear dynamics at several trim points, then to determine linear controllers for each linearized model, and finally implement gain-scheduling technique. The validation of such a controller is often based on linear system analysis for the linear closed-loop system at the trim conditions. Although this type of gain-scheduled linear autopilot works well in practice, validation based solely on linear analysis may not be sufficient to fully characterize the closed-loop system especially when the aerodynamic coefficients exhibit substantial nonlinearity with respect to the flight condition. The purpose of this paper is to present a methodology for analyzing the stability of a gain-scheduled controller in a setting close to the original nonlinear setting. The method is based on sum-of-squares (SOS) optimization that can be used to characterize the region of attraction of a polynomial system by solving convex optimization problems. The applicability of the proposed SOS-based methodology is verified on a short-period autopilot of a skid-to-turn missile.
NASA Astrophysics Data System (ADS)
Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui
2018-04-01
Considering that multi-source uncertainties from inherent nature as well as the external environment are unavoidable and severely affect the controller performance, the dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, the uncertainty quantification analysis and time-variant reliability estimation corresponding to the closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the boundary laws of controlled response histories are first confirmed with specific implementation of random items. For nonlinear cases, the collocation set methodology and fourth Rounge-Kutta algorithm are introduced as well. Enlightened by the first-passage model in random process theory as well as by the static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for the vibration control systems and the related solution details are further expounded. Two engineering examples are eventually presented to demonstrate the validity and applicability of the methodology developed.
Reproducibility and replicability of rodent phenotyping in preclinical studies.
Kafkafi, Neri; Agassi, Joseph; Chesler, Elissa J; Crabbe, John C; Crusio, Wim E; Eilam, David; Gerlai, Robert; Golani, Ilan; Gomez-Marin, Alex; Heller, Ruth; Iraqi, Fuad; Jaljuli, Iman; Karp, Natasha A; Morgan, Hugh; Nicholson, George; Pfaff, Donald W; Richter, S Helene; Stark, Philip B; Stiedl, Oliver; Stodden, Victoria; Tarantino, Lisa M; Tucci, Valter; Valdar, William; Williams, Robert W; Würbel, Hanno; Benjamini, Yoav
2018-04-01
The scientific community is increasingly concerned with the proportion of published "discoveries" that are not replicated in subsequent studies. The field of rodent behavioral phenotyping was one of the first to raise this concern, and to relate it to other methodological issues: the complex interaction between genotype and environment; the definitions of behavioral constructs; and the use of laboratory mice and rats as model species for investigating human health and disease mechanisms. In January 2015, researchers from various disciplines gathered at Tel Aviv University to discuss these issues. The general consensus was that the issue is prevalent and of concern, and should be addressed at the statistical, methodological and policy levels, but is not so severe as to call into question the validity and the usefulness of model organisms as a whole. Well-organized community efforts, coupled with improved data and metadata sharing, have a key role in identifying specific problems and promoting effective solutions. Replicability is closely related to validity, may affect generalizability and translation of findings, and has important ethical implications. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
Nonlinear data assimilation: towards a prediction of the solar cycle
NASA Astrophysics Data System (ADS)
Svedin, Andreas
The solar cycle is the cyclic variation of solar activity, with a span of 9-14 years. The prediction of the solar cycle is an important and unsolved problem with implications for communications, aviation and other aspects of our high-tech society. Our interest is model-based prediction, and we present a self-consistent procedure for parameter estimation and model state estimation, even when only one of several model variables can be observed. Data assimilation is the art of comparing, combining and transferring observed data into a mathematical model or computer simulation. We use the 3DVAR methodology, based on the notion of least squares, to present an implementation of a traditional data assimilation. Using the Shadowing Filter — a recently developed method for nonlinear data assimilation — we outline a path towards model based prediction of the solar cycle. To achieve this end we solve a number of methodological challenges related to unobserved variables. We also provide a new framework for interpretation that can guide future predictions of the Sun and other astrophysical objects.
Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy
NASA Astrophysics Data System (ADS)
Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.
2011-08-01
The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
An overview of prevention research: issues, answers, and new agendas.
Howard, J; Taylor, J A; Ganikos, M L; Holder, H D; Godwin, D F; Taylor, E D
1988-01-01
Efforts to curtail alcohol abuse and alcoholism can be divided into primary, secondary, and tertiary prevention. Primary prevention attempts to stop a problem or illness from occurring in the first place. Secondary prevention identifies persons in the early stages of problematic or illness behavior and refers them for counseling or treatment, which is considered tertiary prevention. Five research areas concerned with primary and secondary prevention are selected for discussion: youth, the mass media, the worksite, blacks and Hispanics, and alcohol-related behavior that increases the risk of AIDS. Several of these themes have been in the forefront of alcohol prevention research; others such as AIDS are emergent areas of injury. The discussion to follow briefly summarizes research approaches, key findings, methodological shortcomings, and suggested issues for future investigation. Although scientifically solid prevention studies have been conducted, more rigorous, more comprehensive, and more innovative research is needed. Given the dynamic sociocultural and economic systems in which prevention occurs, research techniques that can address this complexity are required. A range of appropriate methodologies is described. PMID:3141964
[Research on psychosomatic disease. Various theoretical and methodologic aspects].
Barbosa, A; Castanheira, J L; Cordeiro, J C
1992-07-01
This article mentions ther present main lines of psychosomatic research either in what concerns the elimination of the concept of psychosomatic illness, or in what concerns its etiological understanding of the peculiar ways of therapeutic approach. We specify some methodological problems resulting from using several instruments to collect data and measure them. We analyse the theoric relevance of the constructs: depressive equivalents and, specially, the alexithymia one. Starting from the consensual phenomonological description of this construct, we explain its psychodynamic understanding, its neurophysiologic basis and sociocultural determination. We question the relationship between alexithymia and psychosomatic illness. We point out the pertinency of its utilization as a risk or maintainance factor and the possibility of its modelling by ambiance factors. We clarify the main heuristic contributions of this construct to psychosomatic investigation and we analyse, critically and concisely, the validity and fidelity of some instruments of measure built to measure it. It is necessary to pay prior attention to psychosomatic investigation in the health area. We propose lines of investigation to be developed in our country that should have a multidisciplinary perspective.
Quality assessment of urban environment
NASA Astrophysics Data System (ADS)
Ovsiannikova, T. Y.; Nikolaenko, M. N.
2015-01-01
This paper is dedicated to the research applicability of quality management problems of construction products. It is offered to expand quality management borders in construction, transferring its principles to urban systems as economic systems of higher level, which qualitative characteristics are substantially defined by quality of construction product. Buildings and structures form spatial-material basis of cities and the most important component of life sphere - urban environment. Authors justify the need for the assessment of urban environment quality as an important factor of social welfare and life quality in urban areas. The authors suggest definition of a term "urban environment". The methodology of quality assessment of urban environment is based on integrated approach which includes the system analysis of all factors and application of both quantitative methods of assessment (calculation of particular and integrated indicators) and qualitative methods (expert estimates and surveys). The authors propose the system of indicators, characterizing quality of the urban environment. This indicators fall into four classes. The authors show the methodology of their definition. The paper presents results of quality assessment of urban environment for several Siberian regions and comparative analysis of these results.
ERIC Educational Resources Information Center
Bird, Anne Marie; Ross, Diane
1984-01-01
A brief history of research in sport psychology based on Lander's (1982) analysis is presented. A systematic approach to theory building is offered. Previous methodological inadequacies are identified using examples of observational learning and anxiety. (Author/DF)
Structural Equation Modeling of School Violence Data: Methodological Considerations
ERIC Educational Resources Information Center
Mayer, Matthew J.
2004-01-01
Methodological challenges associated with structural equation modeling (SEM) and structured means modeling (SMM) in research on school violence and related topics in the social and behavioral sciences are examined. Problems associated with multiyear implementations of large-scale surveys are discussed. Complex sample designs, part of any…
A spatially constrained ecological classification: rationale, methodology and implementation
Franz Mora; Louis Iverson; Louis Iverson
2002-01-01
The theory, methodology and implementation for an ecological and spatially constrained classification are presented. Ecological and spatial relationships among several landscape variables are analyzed in order to define a new approach for a landscape classification. Using ecological and geostatistical analyses, several ecological and spatial weights are derived to...
Yeast as a potential vehicle for neglected tropical disease drug discovery.
Denny, P W; Steel, P G
2015-01-01
High-throughput screening (HTS) efforts for neglected tropical disease (NTD) drug discovery have recently received increased attention because several initiatives have begun to attempt to reduce the deficit in new and clinically acceptable therapies for this spectrum of infectious diseases. HTS primarily uses two basic approaches, cell-based and in vitro target-directed screening. Both of these approaches have problems; for example, cell-based screening does not reveal the target or targets that are hit, whereas in vitro methodologies lack a cellular context. Furthermore, both can be technically challenging, expensive, and difficult to miniaturize for ultra-HTS [(u)HTS]. The application of yeast-based systems may overcome some of these problems and offer a cost-effective platform for target-directed screening within a eukaryotic cell context. Here, we review the advantages and limitations of the technologies that may be used in yeast cell-based, target-directed screening protocols, and we discuss how these are beginning to be used in NTD drug discovery. © 2014 Society for Laboratory Automation and Screening.
Krahe, Thomas E.; Wang, Weili; Medina, Alexandre E.
2009-01-01
Background Fetal alcohol spectrum disorders (FASD) are the leading cause of mental retardation in the western world and children with FASD present altered somatosensory, auditory and visual processing. There is growing evidence that some of these sensory processing problems may be related to altered cortical maps caused by impaired developmental neuronal plasticity. Methodology/Principal Findings Here we show that the primary visual cortex of ferrets exposed to alcohol during the third trimester equivalent of human gestation have decreased CREB phosphorylation and poor orientation selectivity revealed by western blotting, optical imaging of intrinsic signals and single-unit extracellular recording techniques. Treating animals several days after the period of alcohol exposure with a phosphodiesterase type 1 inhibitor (Vinpocetine) increased CREB phosphorylation and restored orientation selectivity columns and neuronal orientation tuning. Conclusions/Significance These findings suggest that CREB function is important for the maturation of orientation selectivity and that plasticity enhancement by vinpocetine may play a role in the treatment of sensory problems in FASD. PMID:19680548
"Sustainability On Earth" WebQuests: Do They Qualify as Problem-Based Learning Activities?
NASA Astrophysics Data System (ADS)
Leite, Laurinda; Dourado, Luís; Morgado, Sofia
2015-02-01
Information and communication technologies (ICT), namely the Internet, can play a valuable educational role in several school subjects, including science education. The same applies to problem-based learning (PBL), that is, a student-centered active learning methodology that can prepare students for lifelong learning. WebQuests (WQs) combine PBL and Internet use, and they can reduce the probability of having students surfing the Internet without any clear purpose. The objective of this paper is to investigate to what extent WQs available from Portuguese schools' and universities' websites, focusing on the "Sustainability on Earth" eighth-grade school science theme, are consistent with a PBL perspective. Results from content analysis of 92 WQs indicate that the WQs selected for this paper are rarely consistent with PBL requirements. Teachers should be both aware of this issue and ready to improve the WQs available before using them in their science classes so that greater educational advantage can be generated from this powerful tool.
Method for Assessing Risk of Road Accidents in Transportation of School Children
NASA Astrophysics Data System (ADS)
Pogotovkina, N. S.; Volodkin, P. P.; Demakhina, E. S.
2017-11-01
The rationale behind the problem being investigated is explained by the remaining high level of the accident rates with the participation of vehicles carrying groups of children, including school buses, in the Russian Federation over the period of several years. The article is aimed at the identification of new approaches to improve the safety of transportation of schoolchildren in accordance with the Concept of children transportation by buses and the plan for its implementation. The leading approach to solve the problem under consideration is the prediction of accidents in the schoolchildren transportation. The article presents the results of the accident rate analysis with the participation of school buses in the Russian Federation for five years. Besides, a system to monitor the transportation of schoolchildren is proposed; the system will allow analyzing and forecasting traffic accidents which involve buses carrying groups of children, including school buses. In addition, the article presents a methodology for assessing the risk of road accidents during the transportation of schoolchildren.
Decontaminate feature for tracking: adaptive tracking via evolutionary feature subset
NASA Astrophysics Data System (ADS)
Liu, Qiaoyuan; Wang, Yuru; Yin, Minghao; Ren, Jinchang; Li, Ruizhi
2017-11-01
Although various visual tracking algorithms have been proposed in the last 2-3 decades, it remains a challenging problem for effective tracking with fast motion, deformation, occlusion, etc. Under complex tracking conditions, most tracking models are not discriminative and adaptive enough. When the combined feature vectors are inputted to the visual models, this may lead to redundancy causing low efficiency and ambiguity causing poor performance. An effective tracking algorithm is proposed to decontaminate features for each video sequence adaptively, where the visual modeling is treated as an optimization problem from the perspective of evolution. Every feature vector is compared to a biological individual and then decontaminated via classical evolutionary algorithms. With the optimized subsets of features, the "curse of dimensionality" has been avoided while the accuracy of the visual model has been improved. The proposed algorithm has been tested on several publicly available datasets with various tracking challenges and benchmarked with a number of state-of-the-art approaches. The comprehensive experiments have demonstrated the efficacy of the proposed methodology.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...
2014-06-30
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in ordermore » to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.« less
Inverse problems in heterogeneous and fractured media using peridynamics
Turner, Daniel Z.; van Bloemen Waanders, Bart G.; Parks, Michael L.
2015-12-10
The following work presents an adjoint-based methodology for solving inverse problems in heterogeneous and fractured media using state-based peridynamics. We show that the inner product involving the peridynamic operators is self-adjoint. The proposed method is illustrated for several numerical examples with constant and spatially varying material parameters as well as in the context of fractures. We also present a framework for obtaining material parameters by integrating digital image correlation (DIC) with inverse analysis. This framework is demonstrated by evaluating the bulk and shear moduli for a sample of nuclear graphite using digital photographs taken during the experiment. The resulting measuredmore » values correspond well with other results reported in the literature. Lastly, we show that this framework can be used to determine the load state given observed measurements of a crack opening. Furthermore, this type of analysis has many applications in characterizing subsurface stress-state conditions given fracture patterns in cores of geologic material.« less