Problem solving using soft systems methodology.
Land, L
This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.
Tracking problem solving by multivariate pattern analysis and Hidden Markov Model algorithms.
Anderson, John R
2012-03-01
Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy in determining both which problem-solving step the students are taking and whether they are performing that step correctly. The second "model discovery" application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and that these substates are associated with different levels of fluency in algebra problem solving. Copyright © 2011 Elsevier Ltd. All rights reserved.
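To make the pairing of multivariate patterns with hidden Markov states concrete, here is a minimal sketch in Python using hmmlearn; the feature matrix, the three-state choice, and the BIC-style model comparison are illustrative assumptions, not the paper's actual data or model.

```python
# Minimal sketch: decode problem-solving states from multivariate activity
# with a Gaussian HMM, and compare state counts by BIC. The features and the
# 3-state choice are illustrative assumptions, not the paper's data/model.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
n_scans, n_features = 200, 10
X = rng.normal(size=(n_scans, n_features))   # stand-in for fMRI patterns

model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
model.fit(X)

states = model.predict(X)        # most likely state per scan (Viterbi path)
loglik = model.score(X)          # data log-likelihood for model comparison
k = model.n_components
n_params = (k - 1) + k * (k - 1) + 2 * k * n_features  # starts, transitions, means+vars
bic = n_params * np.log(n_scans) - 2 * loglik          # lower is better
print(states[:20], round(bic, 1))
```

Refitting with different `n_components` and comparing BIC values is one simple way to carry out the "model discovery" step of asking how many substates a problem-solving step involves.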
NASA Technical Reports Server (NTRS)
Hermann, Robert
1997-01-01
The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work between the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form {dx/dt = f(x, u)}; and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves discretizations of partial differential equation (PDE) systems of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as discretizations of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.
Introducing soft systems methodology plus (SSM+): why we need it and what it can contribute.
Braithwaite, Jeffrey; Hindle, Don; Iedema, Rick; Westbrook, Johanna I
2002-01-01
There are many complicated and seemingly intractable problems in the health care sector. Past ways to address them have involved political responses, economic restructuring, biomedical and scientific studies, and managerialist or business-oriented tools. Few methods have enabled us to develop a systematic response to problems. Our version of soft systems methodology, SSM+, seems to improve problem solving processes by providing an iterative, staged framework that emphasises collaborative learning and systems redesign involving both technical and cultural fixes.
Conjugate gradient based projection - A new explicit methodology for frictional contact
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Li, Maocheng; Sha, Desong
1993-01-01
With special attention towards applicability to parallel computation or vectorization, a new and effective explicit approach involving a conjugate gradient based projection methodology is proposed in this study for linear complementarity formulations of contact problems with Coulomb friction. The overall objective is to provide an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementarity formulations stems from an established search direction which is projected onto a feasible region determined by the non-negative constraint condition; this direction is then applied within the Fletcher-Reeves conjugate gradient method, resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics and fast computational speed, and is relatively simple to implement for contact problems involving Coulomb friction.
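A minimal sketch of the core idea follows: Fletcher-Reeves conjugate gradient directions combined with projection onto the non-negative feasible region of a linear complementarity problem. The active-set handling and restart rule here are simplifications for illustration, not the paper's exact algorithm.

```python
# Sketch of a conjugate gradient based projection solve for the symmetric
# linear complementarity problem  z >= 0,  w = M z + q >= 0,  z'w = 0,
# posed as  min 0.5 z'Mz + q'z  s.t. z >= 0.  Active-set handling and the
# restart rule are simplified relative to the paper's scheme.
import numpy as np

def projected_cg_lcp(M, q, tol=1e-10, max_iter=200):
    z = np.zeros(len(q))
    d, gp_dot_old, active_old = None, None, None
    for _ in range(max_iter):
        g = M @ z + q                         # objective gradient
        active = (z <= 0.0) & (g > 0.0)       # bounds to hold fixed
        gp = np.where(active, 0.0, g)         # projected gradient
        gp_dot = gp @ gp
        if gp_dot < tol ** 2:                 # complementarity satisfied
            break
        if d is None or np.any(active != active_old):
            d = -gp                           # restart on active-set change
        else:
            d = -gp + (gp_dot / gp_dot_old) * d   # Fletcher-Reeves direction
        alpha = -(gp @ d) / (d @ (M @ d))     # exact step for the quadratic
        z = np.maximum(z + alpha * d, 0.0)    # project back onto z >= 0
        gp_dot_old, active_old = gp_dot, active
    return z

M = np.array([[2.0, 0.5],
              [0.5, 1.0]])
q = np.array([-1.0, 1.0])
print(projected_cg_lcp(M, q))                 # approx [0.5, 0.0]
```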
NASA Technical Reports Server (NTRS)
Nakajima, Yukio; Padovan, Joe
1987-01-01
In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving problems involving contact-impact type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered. These include the rolling/sliding impact of tires with road obstructions.
General Methodology for Designing Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.
2012-01-01
A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
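As an illustration of the "system of nonlinear equations" form such problems can take, the following sketch shoots for a two-body transfer: choose the departure velocity so the propagated arc hits a target position after a fixed flight time. The normalized units, numbers, and solver choice are assumptions for illustration, not part of the NASA tool.

```python
# Illustrative reduction of trajectory design to a nonlinear system: find
# the departure velocity v0 so a two-body arc reaches a target position
# after time tf. Units are normalized and all numbers are invented.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

mu = 1.0                          # gravitational parameter (normalized)
r0 = np.array([1.0, 0.0])         # departure position
r_target = np.array([0.0, 1.55])  # arrival position
tf = 2.0                          # fixed time of flight

def two_body(t, s):
    r, v = s[:2], s[2:]
    return np.concatenate([v, -mu * r / np.linalg.norm(r) ** 3])

def miss(v0):                     # terminal position error to drive to zero
    sol = solve_ivp(two_body, (0.0, tf), np.concatenate([r0, v0]),
                    rtol=1e-10, atol=1e-10)
    return sol.y[:2, -1] - r_target

v0 = fsolve(miss, np.array([0.0, 1.0]))   # initial guess: circular speed
print(v0, miss(v0))                        # residual should be ~0
```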
School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.
ERIC Educational Resources Information Center
Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others
1998-01-01
Focuses on methodological issues involved in choosing instruments to monitor behavior once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physician cooperation. Emerging school-based methodologies are discussed…
The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.
ERIC Educational Resources Information Center
Filinov, Nikolay B.; Ruchkina, Svetlana
2002-01-01
The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…
ERIC Educational Resources Information Center
Fetterman, David M.
1981-01-01
Shows how evaluation design and federal involvement in Youth Employment Demonstration Projects unintentionally cause a negative appraisal. Indicates the problem stems from interaction of the contract research corporation, the educational research establishment, and the federal bureaucracy, rather than a specific methodology or bureaucratic…
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
Design problem solving is a complex activity involving a number of subtasks and a number of alternative methods potentially available for each…
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
Control and stabilization of decentralized systems
NASA Technical Reports Server (NTRS)
Byrnes, Christopher I.; Gilliam, David; Martin, Clyde F.
1989-01-01
Proceeding from the problem posed by the need to stabilize the motion of two helicopters maneuvering a single load, a methodology is developed for the stabilization of classes of decentralized systems based on a more algebraic approach, which involves the external symmetries of decentralized systems. Stabilizing local-feedback laws are derived for any class of decentralized systems having a semisimple algebra of symmetries; the helicopter twin-lift problem, as well as certain problems involving the stabilization of discretizations of distributed parameter problems, have just such algebras of symmetries.
ERIC Educational Resources Information Center
Jakovljevic, Maria; Ankiewicz, Piet; De Swardt, Estelle; Gross, Elna
2004-01-01
Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
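For contrast with the evolutionary approach described above, a toy version of the computationally expensive nested procedure can be sketched as follows; the leader/follower objectives are invented for illustration.

```python
# Toy nested procedure for a bilevel problem -- the expensive baseline the
# evolutionary method is compared against: every upper-level evaluation
# requires solving the lower level to optimality first. Objectives invented.
from scipy.optimize import minimize, minimize_scalar

def lower_level_optimum(x):
    # follower: min_y (y - x)^2 + 0.1 y^2, given the leader's decision x
    return minimize_scalar(lambda y: (y - x) ** 2 + 0.1 * y ** 2).x

def upper_level_objective(xs):
    x = xs[0]
    y = lower_level_optimum(x)    # inner solve on every outer evaluation
    return (x - 1.0) ** 2 + (y - 1.0) ** 2

res = minimize(upper_level_objective, x0=[0.0], method="Nelder-Mead")
print(res.x, res.fun)
```

The cost multiplies quickly: an inner solve per outer evaluation, per objective, per generation, which is why a hybrid evolutionary-cum-local-search alternative is attractive.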
Action research methodology in clinical pharmacy: how to involve and change.
Nørgaard, Lotte Stig; Sørensen, Ellen Westh
2016-06-01
Introduction The focus in clinical pharmacy practice is, and has for the last 30-35 years been, on changing the role of pharmacy staff towards service orientation and patient counselling. One way of doing this is by involving staff in the change process, and, as a researcher, taking part in that process by establishing partnerships with staff. Drawing on the authors' extensive action research (AR)-based experience, recommendations and comments on how to conduct an AR study are described, and one of their AR-based studies illustrates the methodology and the research methods used. Methodology AR is defined as an approach to research which is based on a problem-solving relationship between researchers and clients, and which aims both at solving a problem and at collaboratively generating new knowledge. Research questions relevant in AR studies are: What was the working process in this change-oriented study? What learning and/or changes took place? What challenges/pitfalls had to be overcome? What were the influences/consequences for the involved parties? When to use If you want to implement new services and want to involve staff and others in the process, an AR methodology is very suitable. The basic advantages of AR-based studies are grounded in their participatory and democratic basis and their starting point in problems experienced in practice. Limitations One limitation of AR studies is that no single participant in a project steering group decides alone. Furthermore, the collective process makes the decision-making procedures relatively complex.
Atwood's Machine as a Tool to Introduce Variable Mass Systems
ERIC Educational Resources Information Center
de Sousa, Celia A.
2012-01-01
This article discusses an instructional strategy which explores possible similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the…
Intentionality, degree of damage, and moral judgments.
Berg-Cross, L G
1975-12-01
153 first graders were given Piagetian moral judgment problems with a new simplified methodology as well as the usual story-pair paradigm. The new methodology involved making quantitative judgments about single stories and examined the influence of level of intentionality and degree of damage upon absolute punishment ratings. Contrary to results obtained with a story-pair methodology, it was found that with single stories even 6-year-old children responded to the level of intention in the stories as well as the quantity and quality of damage involved. This suggested that Piaget's methodology may be forcing children to employ a simplifying strategy while under other conditions they are able to perform the mental operations necessary to make complex moral judgments.
How to Arrive at Good Research Questions?
ERIC Educational Resources Information Center
Gafoor, K. Abdul
2008-01-01
Identifying an area of research, a topic, deciding on a problem, and formulating it into a researchable question are very difficult stages in the whole research process, at least for beginners. Few books on research methodology elaborate the various processes involved in problem selection and clarification. Viewing research and problem selection as…
Case study of a problem-based learning course of physics in a telecommunications engineering degree
NASA Astrophysics Data System (ADS)
Macho-Stadler, Erica; Elejalde-García, María Jesús
2013-08-01
Active learning methods can be appropriate in engineering, as their methodology promotes meta-cognition, independent learning and problem-solving skills. Problem-based learning is the educational process by which problem-solving activities and instructor's guidance facilitate learning. Its key characteristic involves posing a 'concrete problem' to initiate the learning process, generally implemented by small groups of students. Many universities have developed and used active methodologies successfully in the teaching-learning process. During the past few years, the University of the Basque Country has promoted the use of active methodologies through several teacher training programmes. In this paper, we describe and analyse the results of the educational experience using the problem-based learning (PBL) method in a physics course for undergraduates enrolled in the technical telecommunications engineering degree programme. From an instructor's perspective, PBL strengths include better student attitude in class and increased instructor-student and student-student interactions. The students emphasised developing teamwork and communication skills in a good learning atmosphere as positive aspects.
ERIC Educational Resources Information Center
Tarone, Elaine
1979-01-01
Explores the validity of Labov's (1969) "Observer Paradox," and the five axioms describing the problems involved in linguistic research, for interlanguage research. Methodological remedies are suggested. (AM)
Future Research Needs in Learning Disabilities.
ERIC Educational Resources Information Center
Senf, Gerald M.
This paper deals with future research needs and problems in learning disabilities, and is divided into the following two broad categories: (1) supporting conditions, which involve necessary prerequisites to the research effort; and (2) procedural considerations, which deal with methodological concerns. First, the problems posed by supporting…
Medical Problem-Solving: A Critique of the Literature.
ERIC Educational Resources Information Center
McGuire, Christine H.
1985-01-01
Prescriptive decision analysis of medical problem-solving has been based on decision theory that involves calculation and manipulation of complex probability and utility values to arrive at optimal decisions that will maximize patient benefits. The studies offer a methodology for improving clinical judgment. (Author/MLW)
ERIC Educational Resources Information Center
Yakubova, Gulnoza; Hughes, Elizabeth M.; Hornberger, Erin
2015-01-01
The purpose of this study was to determine the effectiveness of a point-of-view video modeling intervention to teach mathematics problem-solving when working on word problems involving subtracting mixed fractions with uncommon denominators. Using a multiple-probe across students design of single-case methodology, three high school students with…
A Systems Analysis Role Play Case: We Sell Stuff, Inc.
ERIC Educational Resources Information Center
Mitri, Michel; Cole, Carey
2007-01-01
Most systems development projects incorporate some sort of life cycle approach in their development. Whether the development methodology involves a traditional life cycle, prototyping, rapid application development, or some other approach, the first step usually involves a system investigation, which includes problem identification, feasibility…
Prototyping a Microcomputer-Based Online Library Catalog. Occasional Papers Number 177.
ERIC Educational Resources Information Center
Lazinger, Susan S.; Shoval, Peretz
This report examines and evaluates the application of prototyping methodology in the design of a microcomputer-based online library catalog. The methodology for carrying out the research involves a five-part examination of the problem on both the theoretical and applied levels, each of which is discussed in a separate section as follows: (1) a…
Information Needs of the Ceramic Industry; A Users-Need Study.
ERIC Educational Resources Information Center
Janning, Edward A.; And Others
This report examines the problems in the flow of scientific and technological information in the Ceramic Industry. The research methodology used involved a panel of experts which defined the functions performed by ceramists and their corresponding information needs, listed sources of information available to ceramists, and defined problems and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, R.A.
1980-12-01
This comparison study involves a preliminary verification of finite element calculations. The methodology of the comparison study consists of solving four example problems with both the SPECTROM finite element program and the MARC-CDC general purpose finite element program. The results show close agreement for all example problems.
Finding common ground in large carnivore conservation: mapping contending perspectives
Mattson, D.J.; Byrd, K.L.; Rutherford, M.B.; Brown, S.R.; Clark, T.W.
2006-01-01
Reducing current conflict over large carnivore conservation and designing effective strategies that enjoy broad public support depend on a better understanding of the values, beliefs, and demands of those who are involved or affected. We conducted a workshop attended by diverse participants involved in conservation of large carnivores in the northern U.S. Rocky Mountains, and used Q methodology to elucidate participant perspectives regarding "problems" and "solutions". Q methodology employs qualitative and quantitative techniques to reveal the subjectivity in any situation. We identified four general perspectives for both problems and solutions, three of which (Carnivore Advocates, Devolution Advocates, and Process Reformers) were shared by participants across domains. Agency Empathizers (problems) and Economic Pragmatists (solutions) were not clearly linked. Carnivore and Devolution Advocates expressed diametrically opposed perspectives that legitimized different sources of policy-relevant information ("science" for Carnivore Advocates and "local knowledge" for Devolution Advocates). Despite differences, we identified potential common ground focused on respectful, persuasive, and creative processes that would build understanding and tolerance. © 2006 Elsevier Ltd. All rights reserved.
A decision model for planetary missions
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.; Brigadier, W. L.
1976-01-01
Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and cost as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.
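The flavor of such a decision model can be sketched with a small Monte Carlo expected-utility calculation; the utility form, failure probabilities, and alternatives below are invented for illustration and are not taken from the paper.

```python
# Sketch of the decision-model idea: Monte Carlo over uncertain science value
# and cost, with alternatives ranked by expected utility. The utility form
# and all numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

def expected_utility(p_fail, val_mean, val_sd, cost_mean, cost_sd, n=100_000):
    fail = rng.random(n) < p_fail                 # mission failure draws
    value = np.where(fail, 0.0, rng.normal(val_mean, val_sd, n))
    cost = rng.normal(cost_mean, cost_sd, n)
    return np.mean(value - 0.5 * cost)            # assumed linear trade-off

alternatives = {                    # (p_fail, value mean, sd, cost mean, sd)
    "ballistic flyby": (0.05, 10.0, 2.0, 4.0, 0.5),
    "rendezvous":      (0.15, 18.0, 3.0, 9.0, 1.0),
}
for name, params in alternatives.items():
    print(f"{name}: {expected_utility(*params):.2f}")
```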
ERIC Educational Resources Information Center
MacBride, Owen
This survey of studies of medical school costs was made in order to evaluate and compare the methodologies and findings of those studies. The survey covered studies of one or more medical schools that either produced figures for average annual per-student cost of education and/or discussed the methodologies and problems involved in producing such…
SDS-Polyacrylamide Electrophoresis and Western Blotting Applied to the Study of Asthma.
García-Solaesa, Virginia; Abad, Sara Ciria
2016-01-01
Western blotting is used to analyze proteins after they have been separated by electrophoresis and subsequently electro-transferred to a membrane. Once immobilized, a specific protein can be identified through its reaction with a labeled antibody or antigen. It is a methodology commonly used in biomedical research, such as asthma studies, to assess the pathways of inflammatory mediators involved in the disease. Here, we describe an example of western blotting used to determine the factors involved in asthma. In this chapter, the methodology of western blotting is reviewed, paying attention to potential problems and giving interesting recommendations.
NASA Astrophysics Data System (ADS)
Pawar, Sumedh; Sharma, Atul
2018-01-01
This work presents a mathematical model and solution methodology for a multiphysics engineering problem involving arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver, ANSYS FLUENT 13.0.0, is used in this work. Arc formation involves strongly coupled gas dynamics and electrodynamics, simulated by solution of the coupled Navier-Stokes equations, Maxwell's equations and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated with an excellent agreement with the published results. The developed mathematical model and the user defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation, in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.
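A plausible minimal statement of the coupled system being solved is the standard local-thermodynamic-equilibrium arc model below; the exact source terms and radiation treatment used in the paper may differ.

```latex
% Minimal LTE arc model (illustrative form; the paper's exact terms may differ)
\begin{align}
  \nabla\cdot(\rho\,\mathbf{u}) &= 0 \\
  \rho\,(\mathbf{u}\cdot\nabla)\,\mathbf{u} &= -\nabla p + \nabla\cdot\boldsymbol{\tau}
      + \mathbf{J}\times\mathbf{B} && \text{(Lorentz force)} \\
  \rho c_p\,(\mathbf{u}\cdot\nabla)\,T &= \nabla\cdot(k\,\nabla T)
      + \frac{\lvert\mathbf{J}\rvert^{2}}{\sigma} - S_{\mathrm{rad}}
      && \text{(Joule heating, radiation loss)} \\
  \nabla\cdot(\sigma\,\nabla\phi) &= 0, \quad
  \mathbf{J} = -\sigma\,\nabla\phi, \quad
  \nabla\times\mathbf{B} = \mu_0\,\mathbf{J}
\end{align}
```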
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
Yakubova, Gulnoza; Hughes, Elizabeth M; Hornberger, Erin
2015-09-01
The purpose of this study was to determine the effectiveness of a point-of-view video modeling intervention to teach mathematics problem-solving when working on word problems involving subtracting mixed fractions with uncommon denominators. Using a multiple-probe across students design of single-case methodology, three high school students with ASD completed the study. All three students demonstrated greater accuracy in solving fraction word problems and maintained accuracy levels at a 1-week follow-up.
ERIC Educational Resources Information Center
Li, Yanyan; Huang, Zhinan; Jiang, Menglu; Chang, Ting-Wen
2016-01-01
Incorporating scientific fundamentals via engineering through a design-based methodology has proven to be highly effective for STEM education. Engineering design can be instantiated for learning as it involves mental and physical stimulation and develops practical skills, especially in solving problems. Lego bricks, as a set of toys based on design…
Optimal use of human and machine resources for Space Station assembly operations
NASA Technical Reports Server (NTRS)
Parrish, Joseph C.
1988-01-01
This paper investigates the issues involved in determining the best mix of human and machine resources for assembly of the Space Station. It presents the current Station assembly sequence, along with descriptions of the available assembly resources. A number of methodologies for optimizing the human/machine tradeoff problem have been developed, but the Space Station assembly offers some unique issues that have not yet been addressed. These include a strong constraint on available EVA time for early flights and a phased deployment of assembly resources over time. A methodology for incorporating the previously developed decision methods to the special case of the Space Station is presented. This methodology emphasizes an application of multiple qualitative and quantitative techniques, including simulation and decision analysis, for producing an objective, robust solution to the tradeoff problem.
NASA Technical Reports Server (NTRS)
Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.
1989-01-01
A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none would satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.
NASA Astrophysics Data System (ADS)
Çakır, Süleyman
2017-10-01
In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select the input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. In an effort to exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed in Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure to handle input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
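For reference, the crisp Shannon's entropy weighting step that the paper generalizes to imprecise (interval) data looks like the following sketch; the decision matrix is invented for illustration.

```python
# Crisp Shannon's entropy weighting -- the step the paper extends to
# imprecise (interval) data before the DEA stage. Decision matrix invented.
import numpy as np

X = np.array([[12.0, 0.8, 30.0],   # rows: firms, columns: candidate variables
              [15.0, 0.6, 45.0],
              [10.0, 0.9, 25.0]])

P = X / X.sum(axis=0)                          # normalize each column
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)     # Shannon entropy per variable
weights = (1.0 - entropy) / (1.0 - entropy).sum()  # more dispersion, more weight
print(weights.round(3))
```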
ERIC Educational Resources Information Center
Fatokun, J. O.; Fatokun, K. V. F.
2013-01-01
In this paper, we present the concept of problem-based learning as a tool for learning Mathematics and Chemistry, and in fact all sciences, using life situations or simulated scenarios. The methodology involves some level of brainstorming. Here, active learning takes place and knowledge is gained by students either way through a collaborative…
Expert System Development Methodology (ESDM)
NASA Technical Reports Server (NTRS)
Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.
1990-01-01
The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.
Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Diskin, Boris
2012-01-01
A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.
Environmental exposure effects on composite materials for commercial aircraft
NASA Technical Reports Server (NTRS)
Hoffman, D. J.
1978-01-01
Activities reported include completion of the program design tasks, resolution of a high fiber volume problem and resumption of specimen fabrication, fixture fabrication, and progress on the analysis methodology and definition of the typical aircraft environment. Program design activities, including test specimens, specimen holding fixtures, flap-track fairing tailcones, and ground exposure racks, were completed. The problem experienced in obtaining acceptable fiber volume fraction results on two of the selected graphite epoxy material systems was resolved with an alteration to the bagging procedure called out in BAC 5562. The revised bagging procedure, involving lower numbers of bleeder plies, produces acceptable results. All required laminates for the contract have now been laid up and cured. Progress in the area of analysis methodology has centered on defining the environment that a commercial transport aircraft undergoes. The selected methodology is analogous to fatigue life assessment.
Robust optimization modelling with applications to industry and environmental problems
NASA Astrophysics Data System (ADS)
Chaerani, Diah; Dewanto, Stanley P.; Lesmana, Eman
2017-10-01
Robust Optimization (RO) modeling is one of the existing methodologies for handling data uncertainty in optimization problems. The main challenge in the RO methodology is how and when we can reformulate the robust counterpart of an uncertain problem as a computationally tractable optimization problem, or at least approximate the robust counterpart by a tractable problem. By definition, the robust counterpart depends strongly on how we choose the uncertainty set; as a consequence, we can meet this challenge only if this set is chosen in a suitable way. The development of RO has grown fast; in 2004, a new approach to RO called Adjustable Robust Optimization (ARO) was introduced to handle uncertain problems in which some decision variables must be decided as "wait and see" variables, in contrast with classic RO, which models all decision variables as "here and now". In ARO, the uncertain problem can be considered as a multistage decision problem, and the decision variables involved become wait-and-see variables. In this paper we present applications of both RO and ARO. We present all results briefly to underline the importance of RO and ARO in many real-life problems.
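The tractability point can be made concrete with the textbook example of a robust counterpart (a standard result, not a formula specific to this paper): an uncertain linear constraint under box uncertainty,

```latex
% Box-uncertain linear constraint and its tractable robust counterpart
a^{\top}x \le b \quad \forall\, a \in \mathcal{U},
\qquad
\mathcal{U} = \{\hat a + \delta : \lvert\delta_i\rvert \le \rho_i \ \forall i\}
\;\;\Longrightarrow\;\;
\hat a^{\top}x + \sum_i \rho_i \lvert x_i\rvert \le b .
```

The reformulation is representable by linear constraints after splitting each |x_i|, so the robust counterpart stays tractable; other choices of uncertainty set (e.g., ellipsoidal) lead instead to conic reformulations.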
Pedagogy and/or technology: Making difference in improving students' problem solving skills
NASA Astrophysics Data System (ADS)
Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.
2013-01-01
Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.
An open initiative involving cross-disciplinary contributors of computer-assisted structure elucidation (CASE), including methodology specialists, software and database developers and the editorial board of Magnetic Resonance in Chemistry, is addressing the old problem of reporti...
Doing Research That Makes a Difference
ERIC Educational Resources Information Center
Bensimon, Estela Mara; Polkinghorne, Donald E.; Bauman, Georgia L.; Vallejo, Edlyn
2004-01-01
This article describes an alternative methodology for conducting research that is intended to bring about institutional change. This process involves developing deeper awareness among faculty members, administrators, or counselors, of a problem that exists in their local context. In some instances these individuals may be unaware that the problem…
Symmetry, Contingency, Complexity: Accommodating Uncertainty in Public Relations Theory.
ERIC Educational Resources Information Center
Murphy, Priscilla
2000-01-01
Explores the potential of complexity theory as a unifying theory in public relations, where scholars have recently raised problems involving flux, uncertainty, adaptiveness, and loss of control. Describes specific complexity-based methodologies and their potential for public relations studies. Offers an account of complexity theory, its…
New Approaches and Applications for Monte Carlo Perturbation Theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aufiero, Manuele; Bidaud, Adrien; Kotlyar, Dan
2017-02-01
This paper presents some of the recent and new advancements in the extension of Monte Carlo Perturbation Theory methodologies and applications. In particular, the problems discussed involve burnup calculations, perturbation calculations based on continuous-energy functions, and Monte Carlo Perturbation Theory in loosely coupled systems.
Evaluation in health: participatory methodology and involvement of municipal managers
de Almeida, Cristiane Andrea Locatelli; Tanaka, Oswaldo Yoshimi
2016-01-01
ABSTRACT OBJECTIVE To analyze the scope and limits of the use of a participatory evaluation methodology with municipal health managers and administrators. METHODS Qualitative research with health policymakers and managers of the Comissão Intergestores Regional (CIR – Regional Interagency Commission) of a health region of the state of Sao Paulo in Brazil. Representatives from seven member cities participated in seven workshops facilitated by the researchers, with the aim of assessing a specific problem of the care line, which would be used as a tracer of the system's integrality. The analysis of the collected empirical material was based on the hermeneutic-dialectic methodology and aimed at the evaluation of the applied participatory methodology, according to its capacity to promote an assessment process capable of being used as a support for municipal management. RESULTS With the participatory approach to evaluation, we were able to promote in-depth discussions with the group, especially related to the construction of integral care and to the inclusion of the user's perspective in decision-making, linked to the search for solutions to managers' concrete problems. Joint exploration opened up the possibility of using data from electronic information systems, as well as information coming directly from the users of the services, to enhance discussions and negotiations between partners. The participants were skeptical of the potential for replicating this type of evaluation without the direct monitoring of the academy, given the difficulty of organizing the process in everyday work already taken up by emergencies and political issues. CONCLUSIONS Evaluations of programs and services carried out within the Regional Interagency Commission, starting from local interest and facilitating the involvement of its members through participatory methodologies, can contribute to the construction of integral care. To the extent that the act of evaluating stays invested with greater significance for the local actors, their involvement with evaluations at the federal level can also be stimulated. PMID:27509011
Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide
2017-03-01
Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.
Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching
NASA Astrophysics Data System (ADS)
Shen, Kaiming; Yu, Wei
2018-05-01
This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application to continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP; but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
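For context, the quadratic transform introduced in Part I, on which this Part II builds, replaces a ratio objective with a surrogate that is easier to optimize in alternation:

```latex
% Quadratic transform (Part I): a ratio becomes an equivalent surrogate
\max_{x}\;\frac{A(x)}{B(x)}
\quad\Longleftrightarrow\quad
\max_{x,\,y}\; 2y\sqrt{A(x)} - y^{2}B(x),
\qquad
y^{\star} = \frac{\sqrt{A(x)}}{B(x)} .
```

Here A(x) ≥ 0 plays the role of signal power and B(x) > 0 the interference-plus-noise power; alternating updates of the auxiliary variable y and the design variables x decouple the ratio, which is the decoupling idea this Part II extends to problems with discrete scheduling variables.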
Some human factors issues in the development and evaluation of cockpit alerting and warning systems
NASA Technical Reports Server (NTRS)
Randle, R. J., Jr.; Larsen, W. E.; Williams, D. H.
1980-01-01
A set of general guidelines for evaluating a newly developed cockpit alerting and warning system in terms of human factors issues is provided. Although the discussion centers on a general methodology, it is applied specifically to the issues involved in alerting systems. An overall statement of the current operational problem is presented. Human factors problems with reference to existing alerting and warning systems are described. The methodology for proceeding through system development to system test is discussed. The differences between traditional human factors laboratory evaluations and those required for the evaluation of complex man-machine systems under development are emphasized. Performance evaluation of the alerting and warning subsystem using a hypothetical sample system is explained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilmanov, Anvar, E-mail: agilmano@umn.edu; Le, Trung Bao, E-mail: lebao002@umn.edu; Sotiropoulos, Fotis, E-mail: fotis@umn.edu
We present a new numerical methodology for simulating fluid–structure interaction (FSI) problems involving thin flexible bodies in an incompressible fluid. The FSI algorithm uses the Dirichlet–Neumann partitioning technique. The curvilinear immersed boundary method (CURVIB) is coupled with a rotation-free finite element (FE) model for thin shells, enabling the efficient simulation of FSI problems with arbitrarily large deformation. Turbulent flow problems are handled using large-eddy simulation with the dynamic Smagorinsky model in conjunction with a wall model to reconstruct boundary conditions near immersed boundaries. The CURVIB and FE solvers are coupled together on the flexible solid–fluid interfaces, where the structural nodal positions, displacements, velocities and loads are calculated and exchanged between the two solvers. Loose and strong coupling FSI schemes are employed, enhanced by the Aitken acceleration technique to ensure robust coupling and fast convergence, especially for low mass ratio problems. The coupled CURVIB-FE-FSI method is validated by applying it to simulate two FSI problems involving thin flexible structures: 1) vortex-induced vibrations of a cantilever mounted in the wake of a square cylinder at different mass ratios and at low Reynolds number; and 2) the more challenging high Reynolds number problem involving the oscillation of an inverted elastic flag. For both cases the computed results are in excellent agreement with previous numerical simulations and/or experimental measurements. Grid convergence studies are carried out for both the cantilever and inverted flag problems, and demonstrate the convergence of the CURVIB-FE-FSI method. Finally, the capability of the new methodology in simulations of complex cardiovascular flows is demonstrated by applying it to simulate the FSI of a tri-leaflet, prosthetic heart valve in an anatomic aorta and under physiologic pulsatile conditions.
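The Aitken acceleration mentioned here is, in its standard partitioned-FSI form (the paper's implementation details may differ), a dynamic relaxation of the interface displacement update:

```latex
% Standard Aitken dynamic relaxation of the partitioned interface update
d_{k+1} = d_k + \omega_k r_k,
\qquad r_k = \tilde d_k - d_k,
\qquad
\omega_k = -\,\omega_{k-1}\,
  \frac{r_{k-1}^{\top}\,(r_k - r_{k-1})}{\lVert r_k - r_{k-1}\rVert^{2}} ,
```

where d_k is the interface displacement, d̃_k the value returned by the structural solver, and r_k the interface residual; the adaptive relaxation factor ω_k is what stabilizes the fixed-point iteration for low mass ratio problems.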
Atwood's machine as a tool to introduce variable mass systems
NASA Astrophysics Data System (ADS)
de Sousa, Célia A.
2012-03-01
This article discusses an instructional strategy which explores possible similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the ability needed to apply relevant concepts in situations not previously encountered. The pedagogical advantages are relevant for both secondary and high school students, showing that, through adequate examples, the question of the validity of Newton's second law may even be introduced to introductory level students.
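The underlying physics point (a standard result, not specific to this article) is that F = ma presumes constant mass; for a chain or rope whose moving part gains mass at rate dm/dt from material arriving with velocity u,

```latex
% Variable-mass form of Newton's second law (Meshchersky equation)
\mathbf{F}_{\mathrm{ext}}
  = m\,\frac{d\mathbf{v}}{dt} - (\mathbf{u}-\mathbf{v})\,\frac{dm}{dt},
```

which reduces to F = d(mv)/dt only when the accreted material starts from rest (u = 0), the case relevant for a chain being set in motion link by link.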
[Problems of world outlook and methodology of science integration in biological studies].
Khododova, Iu D
1981-01-01
Problems of world outlook and the methodology of natural-science knowledge are considered, based on an analysis of tendencies in the development of the membrane theory of cell processes and on the use of principles of biological membrane functioning in solving scientific and applied problems from different branches of chemistry and biology. The notion of scientific knowledge integration is defined as the interpenetration of approaches, methods and ideas of different branches of knowledge and the enrichment of their content on this basis, resulting in the augmentation of knowledge in each field taken separately. These processes are accompanied by the appearance of new branches of knowledge (sciences "at the junction") and their subsequent differentiation. The analysis of some gnoseological situations shows that the integration of sciences contributes to the coordination and partial agreement of the thinking styles of different specialists and places demands on the scientist's personality, in particular requiring high professional mobility. Problems of the organization of scientific activity are considered, which draw the social sciences into the integration processes. The role of philosophy in the integration processes is emphasized.
Chai, Xun; Gao, Feng; Pan, Yang; Qi, Chenkun; Xu, Yilin
2015-04-22
Coordinate identification between vision systems and robots is quite a challenging issue in the field of intelligent robotic applications, involving steps such as perceiving the immediate environment, building the terrain map and planning locomotion automatically. It is now well established that current identification methods have non-negligible limitations, such as difficult feature matching, the requirement of external tools and the intervention of multiple people. In this paper, we propose a novel methodology to identify the geometric parameters of 3D vision systems mounted on robots without involving other people or additional equipment. In particular, our method focuses on legged robots, which have complex body structures and excellent locomotion ability compared to their wheeled/tracked counterparts. The parameters can be identified simply by moving the robot on relatively flat ground. Concretely, an estimation approach is provided to calculate the ground plane. In addition, the relationship between the robot and the ground is modeled. The parameters are obtained by formulating the identification problem as an optimization problem. The methodology is integrated on a legged robot called "Octopus", which can traverse rough terrains with high stability after obtaining the identification parameters of its mounted vision system using the proposed method. Diverse experiments in different environments demonstrate that our novel method is accurate and robust.
Report on the Directory of Library Collections in Ethnocultural Organizations in Canada: Ontario.
ERIC Educational Resources Information Center
Aliferis, Isabella; Wierucki, Karen
This report discusses the methodology, problems, follow-up procedures, recommendations, and benefits involved in the compilation of a directory which will identify library collections held by ethnocultural clubs, federations, organizations, community centers, and language schools in Canada. Appendices contain samples of a form letter,…
From Acceptance to Rejection: Food Contamination in the Classroom.
ERIC Educational Resources Information Center
Rajecki, D. W.
1989-01-01
Describes a classroom exercise to explain design and measurement principles in methodology and statistics courses. This demonstration, which involves measuring a shift from food acceptance to food rejection, produces meaningful data sets. The realism of the exercise gives students a view of problems that emerge in research. (KO)
Costing Educational Wastage: A Pilot Simulation Study. Current Surveys and Research in Statistics.
ERIC Educational Resources Information Center
Berstecher, D.
This pilot simulation study examines the important methodological problems involved in costing educational wastage, focusing specifically on the cost implications of educational wastage in primary education. Purpose of the study is to provide a clearer picture of the underlying rationale and interrelated consequences of reducing educational…
Ethics in the Information Exploitation and Manipulation Age
ERIC Educational Resources Information Center
Snow, Richard; Snow, Mary
2007-01-01
Purpose: The purpose of this paper is to elaborate the need to educate and encourage students to seek an ethical realm in which the researcher not only accurately analyses and documents a problem, but also actually advocates involvement to mitigate negative impacts. Design/methodology/approach: Geographic information systems (GIS) applications are…
Composite Indices of Development and Poverty: An Application to MDGs
ERIC Educational Resources Information Center
De Muro, Pasquale; Mazziotta, Matteo; Pareto, Adriano
2011-01-01
The measurement of development or poverty as multidimensional phenomena is very difficult because there are several theoretical, methodological and empirical problems involved. The literature of composite indicators offers a wide variety of aggregation methods, all with their pros and cons. In this paper, we propose a new, alternative composite…
Boeninger, Daria K.; Masyn, Katherine E.; Conger, Rand D.
2012-01-01
Although studies have established associations between parenting characteristics and adolescent suicidality, the strength of the evidence for these links remains unclear, largely because of methodological limitations, including lack of accounting for possible child effects on parenting. This study addresses these issues by using autoregressive cross-lag models with data on 802 adolescents and their parents across 5 years. Observed parenting behaviors predicted change in adolescent suicidal problems across one-year intervals even after controlling for adolescents’ effects on parenting. Nurturant-involved parenting continued to demonstrate salutary effects after controlling for adolescent and parent internalizing psychopathology: over time, observed nurturant-involved parenting reduced the likelihood of adolescent suicidal problems. This study increases the empirical support implicating parenting behaviors in the developmental course of adolescent suicidality. PMID:24244079
Methodology for astronaut reconditioning research.
Beard, David J; Cook, Jonathan A
2017-01-01
Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identifying which aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to terrestrial research on small populations, such as those with rare diseases or in various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure the feasibility of research protocols. There is room for creative and hybrid methodology, but careful systematic observation is likely to be more achievable and fruitful than complex trial-based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts, with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherently small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.
Palese, A; Mecugni, S; Barbieri, M; Bonocore, M; Buscaroli, A; Buscaroli, A; Caparnoni, M; Colognese, S; Costi, D; Di Vaio, S; Lapi, L; Lionte, G; Nasi, A M; Pellicciari, C; Quartieri, M; Ricci, R; Saguatti, K; Saragoni, M; Tarantola, S; Torri, E; Vaccari, S; Volpi, P; Vinceti, M
2010-01-01
An innovative teaching strategy, focused on a problem-based rather than theoretical approach and aiming to facilitate the learning of research methodology by advanced nursing students, has been introduced. Through a qualitative evaluation of the diaries kept by the student nurses involved, the advantages and disadvantages of this innovative approach have been evaluated. This paper reports a synthesis of the teaching strategy and its impact on research methodology competences as perceived by the participating students.
Where Does Latin "Sum" Come From?
ERIC Educational Resources Information Center
Nyman, Martti A.
1977-01-01
The derivation of Latin "sum", "es(s)", "est" from Indo-European "esmi", "est", "esti" involves methodological problems. It is claimed here that the development of "sum" from "esmi" is related to the origin of the variation "est ~ st" (< "esti"). The study is primarily concerned with this process, but chronological suggestions are also made. (CHK)
Possibilities of Particle Finite Element Methods in Industrial Forming Processes
NASA Astrophysics Data System (ADS)
Oliver, J.; Cante, J. C.; Weyler, R.; Hernandez, J.
2007-04-01
The work investigates the possibilities offered by the particle finite element method (PFEM) in the simulation of forming problems involving large deformations, multiple contacts, and the generation of new boundaries. A description of the most distinguishing aspects of the PFEM, and its application to the simulation of representative forming processes, illustrates the proposed methodology.
Dusting the Archives of Childhood: Child Welfare Records as Historical Sources
ERIC Educational Resources Information Center
Vehkalahti, Kaisa
2016-01-01
Using administrative sources in the history of education and childhood involves a range of methodological and ethical considerations. This article discusses these problems, as well as the role of archives and archival policies in preserving history and shaping our understanding of past childhoods. Using Finnish child welfare archives from the…
On-Line Representation of a Clinical Case and the Development of Expertise.
ERIC Educational Resources Information Center
Boshuizen, Henny P. A.; And Others
Designed to examine the structural differences in the representation of medical problems in subjects with varying degrees of medical expertise, this study uses an online, thinking-aloud technique to investigate the validity of Feltovich and Barrows' model of expert medical knowledge and illness scripts. Study methodology involved asking one…
ERIC Educational Resources Information Center
Newby, Michael; Marcoulides, Laura D.
2008-01-01
Purpose: The purpose of this paper is to model the relationship between student performance, student attitudes, and computer laboratory environments. Design/methodology/approach: Data were collected from 234 college students enrolled in courses that involved the use of a computer to solve problems and provided the laboratory experience by means of…
Cognitive Activity-based Design Methodology for Novice Visual Communication Designers
ERIC Educational Resources Information Center
Kim, Hyunjung; Lee, Hyunju
2016-01-01
The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…
ERIC Educational Resources Information Center
Böckler, Nils; Roth, Viktoria; Stetten, Lina; Zick, Andreas
2014-01-01
The authors begin their commentary by saying that looking at the phenotypical characteristics of a school shooting, which focus on the perpetrators' experiences in school contexts seems to be overdue. In spite of methodological problems, the studies involved in the review seem to paint a clear picture with regard to social ostracism and harassment…
Modeling Multibody Stage Separation Dynamics Using Constraint Force Equation Methodology
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos M.; Toniolo, Matthew D.; Karlgaard, Christopher D.; Pamadi, Bandu N.
2011-01-01
This paper discusses the application of the constraint force equation methodology and its implementation for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint, the second involves two rigid bodies connected by a universal joint, and the third is that of Mach 7 separation of the X-43A vehicle. For the first two cases, the solutions obtained using the constraint force equation method compare well with those obtained using industry-standard benchmark codes. For the X-43A case, the constraint force equation solutions show reasonable agreement with the flight-test data. Use of the constraint force equation method facilitates the analysis of stage separation in end-to-end simulations of launch vehicle trajectories.
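For orientation, the sketch below gives the generic Lagrange-multiplier form of constrained multibody equations of motion on which constraint-force methods of this kind are built; it is a textbook illustration in LaTeX, not the paper's specific constraint force equation derivation.

    % Hedged sketch: generic constrained multibody dynamics, not the CFE
    % formulation of the paper itself.
    \begin{align}
      M(q)\,\ddot{q} &= F(q,\dot{q},t) + \Phi_q^{\mathsf T}(q)\,\lambda, \\
      \Phi(q,t) &= 0,
    \end{align}
    % \Phi collects the joint constraints (e.g., a fixed or universal
    % joint), \Phi_q = \partial\Phi/\partial q is the constraint Jacobian,
    % and \Phi_q^{\mathsf T}\lambda are the constraint forces acting
    % between the bodies until separation.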
Chai, Xun; Gao, Feng; Pan, Yang; Qi, Chenkun; Xu, Yilin
2015-01-01
Coordinate identification between vision systems and robots is quite a challenging issue in the field of intelligent robotic applications, involving steps such as perceiving the immediate environment, building the terrain map and planning the locomotion automatically. It is now well established that current identification methods have non-negligible limitations such as a difficult feature matching, the requirement of external tools and the intervention of multiple people. In this paper, we propose a novel methodology to identify the geometric parameters of 3D vision systems mounted on robots without involving other people or additional equipment. In particular, our method focuses on legged robots which have complex body structures and excellent locomotion ability compared to their wheeled/tracked counterparts. The parameters can be identified only by moving robots on a relatively flat ground. Concretely, an estimation approach is provided to calculate the ground plane. In addition, the relationship between the robot and the ground is modeled. The parameters are obtained by formulating the identification problem as an optimization problem. The methodology is integrated on a legged robot called “Octopus”, which can traverse through rough terrains with high stability after obtaining the identification parameters of its mounted vision system using the proposed method. Diverse experiments in different environments demonstrate our novel method is accurate and robust. PMID:25912350
Learning communication strategies for distributed artificial intelligence
NASA Astrophysics Data System (ADS)
Kinney, Michael; Tsatsoulis, Costas
1992-08-01
We present a methodology that allows collections of intelligent systems to automatically learn communication strategies, so that they can exchange information and coordinate their problem-solving activity. In our methodology communication between agents is determined by the agents themselves, which weigh the progress of their individual problem-solving activities against the communication needs of the surrounding agents. Through learning, communication lines between agents may be established or disconnected and communication frequencies modified, and the system can also react to dynamic changes in the environment that may force agents to cease to exist or to be added. We have established dynamic, quantitative measures of the usefulness of a fact, the cost of a fact, the work load of an agent, and the selfishness of an agent (a measure indicating an agent's preference between transmitting information and performing individual problem solving), and we use these values to adapt the communication between intelligent agents. In this paper we present the theoretical foundations of our work together with experimental results and performance statistics of networks of agents involved in cooperative problem-solving activities.
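As an illustration of the kind of decision rule these measures imply, the sketch below shows one plausible way an agent could weigh the usefulness of a fact against its cost, moderated by selfishness and work load. The quantities and the threshold formula are invented stand-ins, not the paper's definitions.

    # Hedged sketch of a transmit/keep decision; the formulas are
    # illustrative, not those of the paper.
    from dataclasses import dataclass

    @dataclass
    class Agent:
        work_load: float      # fraction of time spent on local problem solving
        selfishness: float    # 0 = always transmit, 1 = strongly prefer local work

        def should_transmit(self, usefulness: float, cost: float) -> bool:
            """Transmit a fact when its net value to neighbours outweighs
            the local cost, discounted by the agent's selfishness."""
            net_value = usefulness - cost
            threshold = self.selfishness * self.work_load
            return net_value > threshold

    agent = Agent(work_load=0.6, selfishness=0.3)
    print(agent.should_transmit(usefulness=0.5, cost=0.1))  # True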
The colloquial approach: An active learning technique
NASA Astrophysics Data System (ADS)
Arce, Pedro
1994-09-01
This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.
ERIC Educational Resources Information Center
Wolpert, Miranda; Humphrey, Neil; Belsky, Jay; Deighton, Jessica
2013-01-01
The Targeted Mental Health in Schools (TaMHS) programme was a nationwide initiative that funded mental health provision in schools for pupils at risk of or already experiencing mental health problems. The implementation, impact and experience of this programme was evaluated using quantitative and qualitative methodology involving three main…
ERIC Educational Resources Information Center
Grueneich, Royal; Trabasso, Tom
This review of research involving children's moral judgment of literature indicates that such research has been plagued by serious methodological problems stemming largely from the fact that the stimulus materials used to assess children's comprehension and evaluations have tended to be poorly constructed. It contends that this forces children to…
ERIC Educational Resources Information Center
Emerson, Eric; Felce, David; Stancliffe, Roger J.
2013-01-01
This article examines two methodological issues regarding ways of obtaining and analyzing outcome data for people with intellectual disabilities: (a) self-report and proxy-report data and (b) analysis of population-based data sets. Some people with intellectual disabilities have difficulties with self-reporting due to problems of understanding and…
High-performance parallel analysis of coupled problems for aircraft propulsion
NASA Technical Reports Server (NTRS)
Felippa, C. A.; Farhat, C.; Lanteri, S.; Gumaste, U.; Ronaghi, M.
1994-01-01
Applications of high-performance parallel computation are described for the analysis of complete jet engines, considered as a multi-discipline coupled problem. The coupled problem involves the interaction of structures with gas dynamics, heat conduction, and heat transfer in aircraft engines. The methodology issues addressed include: consistent discrete formulation of coupled problems with emphasis on coupling phenomena; effects of partitioning strategies, augmentation, and temporal solution procedures; sensitivity of the response to problem parameters; and methods for interfacing multiscale discretizations in different single fields. The computer implementation issues addressed include: parallel treatment of coupled systems; domain decomposition and mesh partitioning strategies; data representation in object-oriented form and its mapping to a hardware-driven representation; and tradeoff studies between partitioning schemes and fully coupled treatment.
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
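A minimal sketch of the rule-based switching idea follows, using SciPy optimizers as stand-ins for the NLP techniques; the paper's actual selection rules and solver suite are not reproduced.

    # Hedged sketch: try one optimization technique, switch to another on
    # failure to converge, in the spirit of the hybrid system above.
    import numpy as np
    from scipy.optimize import minimize

    def rosenbrock(x):
        return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

    def hybrid_solve(fun, x0, methods=("Nelder-Mead", "BFGS", "SLSQP")):
        """Apply each technique in turn, warm-starting from the last iterate."""
        x = np.asarray(x0, dtype=float)
        for method in methods:
            res = minimize(fun, x, method=method)
            x = res.x
            if res.success:
                return res   # converged: no further switching needed
        return res

    print(hybrid_solve(rosenbrock, [-1.2, 1.0]).x)  # approaches [1, 1]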
Development of Six Sigma methodology for CNC milling process improvements
NASA Astrophysics Data System (ADS)
Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab
2017-10-01
Quality and productivity play an important role in any organization, especially in manufacturing sectors seeking the greater profit that leads to a company's success. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves identifying problems in the production of the "Khufi" product and proposing an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem identified relates to the function of the product: the parts cannot be assembled properly because the dimensions of the product are out of specification. Six Sigma was used as the methodology to study and improve the problems identified. Six Sigma is a highly statistical, data-driven approach to solving complex business problems. It uses a methodical five-phase approach of define, measure, analyze, improve, and control (DMAIC) to help understand the process and the variables that affect it so that the process can be optimized. Finally, the root cause of and solution for the "Khufi" production problem were identified and implemented, and the resulting product successfully met the fitting specification.
Fudin, R
1999-08-01
Analyses of procedures in Lloyd H. Silverman's subliminal psychodynamic activation experiments identify problems and questions. Given the information provided, none of his experiments can be replicated, and none of his positive results were found under luminance conditions he reckoned in 1983 were typical of such outcomes. Furthermore, there is no evidence in any of his experiments that all stimuli were presented completely within the fovea, a condition critical to the production of positive findings (Silverman & Geisler, 1986). These considerations and the fact that no experiment using Silverman's procedures can yield unambiguous positive results (Fudin, 1986) underscore the need to start anew research in this area. Such research should be undertaken with a greater appreciation of methodological issues involved in exposing and encoding subliminal stimuli than that found in all but a few experiments on subliminal psychodynamic activation.
Outline of cost-benefit analysis and a case study
NASA Technical Reports Server (NTRS)
Kellizy, A.
1978-01-01
The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
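The computational core of such a study is ordinary discounting arithmetic. A minimal sketch follows, with hypothetical cash flows and discount rate; the report's actual program (in its appendix) is not reproduced here.

    # Hedged sketch of the basic cost-benefit computation: net present
    # value of competing alternatives. All numbers are invented.
    def npv(cash_flows, rate):
        """Net present value of yearly cash flows, year 0 first."""
        return sum(cf / (1 + rate)**t for t, cf in enumerate(cash_flows))

    # Hypothetical alternatives: upfront cost followed by yearly benefits.
    alt_a = [-1000, 300, 300, 300, 300, 300]
    alt_b = [-1500, 450, 450, 450, 450, 450]
    for name, flows in (("A", alt_a), ("B", alt_b)):
        print(name, round(npv(flows, rate=0.07), 2))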
Palesh, Oxana; Peppone, Luke; Innominato, Pasquale F; Janelsins, Michelle; Jeong, Monica; Sprod, Lisa; Savard, Josee; Rotatori, Max; Kesler, Shelli; Telli, Melinda; Mustian, Karen
2012-01-01
Sleep problems are highly prevalent in cancer patients undergoing chemotherapy. This article reviews existing evidence on the etiology, associated symptoms, and management of sleep problems associated with chemotherapy treatment during cancer. It also discusses limitations and methodological issues of current research. The existing literature suggests that subjectively and objectively measured sleep problems are highest during the chemotherapy phase of cancer treatment. One possible mechanism reviewed here involves the rise in circulating proinflammatory cytokines and the associated disruption of circadian rhythm in the development and maintenance of sleep dysregulation in cancer patients during chemotherapy. Various approaches to the management of sleep problems during chemotherapy are discussed, with behavioral intervention showing promise. Exercise, including yoga, also appears to be effective and safe, at least for subclinical levels of sleep problems in cancer patients. Numerous challenges are associated with conducting research on sleep in cancer patients during chemotherapy treatments, and they are discussed in this review. Dedicated intervention trials, methodologically sound and sufficiently powered, are needed to test current and novel treatments of sleep problems in cancer patients receiving chemotherapy. Optimal management of sleep problems in patients with cancer receiving treatment may improve not only the well-being of patients, but also their prognosis, given the emerging experimental and clinical evidence suggesting that sleep disruption might adversely impact treatment and recovery from cancer. PMID:23486503
A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base
NASA Technical Reports Server (NTRS)
Kautzmann, Frank N., III
1988-01-01
Expert systems that support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes is being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. Such a study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it will involve models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert system. The broadest possible category of interacting expert systems is described along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.
Methodology for senior-proof guidelines: A practice example from the Netherlands.
van Munster, Barbara C; Portielje, Johanna E A; Maier, Andrea B; Arends, Arend J; de Beer, Johannes J A
2018-02-01
Evidence-based guidelines constitute a foundation for medical decision making. It is often unclear whether recommendations in general guidelines also apply to older people. This study aimed to develop a methodology to increase the focus on older people in the development of guidelines. The methodology distinguishes 4 groups of older people: (1) relatively healthy older people; (2) older people with 1 additional specific (interfering) comorbid condition; (3) older people with multimorbidity; and (4) vulnerable older people. The level of focus on older people required may be determined by the prevalence of the disease or condition, level of suffering, social relevance, and the expectation that a guideline may improve the quality of care. A specialist in geriatric medicine may be involved in the guideline process via participation, provision of feedback on drafts, or involvement in the analysis of problem areas. Regarding the patient perspective, it is advised to involve organisations for older people or informal carers in the inventory of problem areas, and additionally to perform literature research of patient values on the subject. If the guideline focuses on older people, then the relative importance of the various outcome measures for this target group needs to be explicitly stated. Search strategies for all the 4 groups are suggested. For clinical studies that focus on the treatment of diseases that frequently occur in older people, a check should be made regarding whether these studies produce the required evidence. This can be achieved by verifying if there is sufficient representation of older people in the studies and determining if there is a separate reporting of results applying to this age group. © 2017 John Wiley & Sons, Ltd.
Fluid moments of the nonlinear Landau collision operator
Hirvijoki, E.; Lingam, M.; Pfefferle, D.; ...
2016-08-09
An important problem in plasma physics is the lack of an accurate and complete description of Coulomb collisions in associated fluid models. To shed light on the problem, this Letter introduces an integral identity involving the multivariate Hermite tensor polynomials and presents a method for computing exact expressions for the fluid moments of the nonlinear Landau collision operator. In conclusion, the proposed methodology provides a systematic and rigorous means of extending the validity of fluid models that have an underlying inverse-square force particle dynamics to arbitrary collisionality and flow.
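For background, the following LaTeX sketch shows the textbook Hermite expansion of a distribution function about a Maxwellian, the kind of structure underlying such fluid-moment computations; the notation is illustrative and is not the Letter's integral identity.

    % Hedged illustration only: Grad-style Hermite expansion about a
    % Maxwellian f_M with mean flow u and thermal speed v_th.
    f(\mathbf{v}) \;=\; f_M(\mathbf{v}) \sum_{n \ge 0} \frac{1}{n!}\,
      a^{(n)}_{i_1 \cdots i_n}\,
      H^{(n)}_{i_1 \cdots i_n}\!\Big(\tfrac{\mathbf{v}-\mathbf{u}}{v_{\mathrm{th}}}\Big)
    % The tensor coefficients a^{(n)} are velocity-space moments of f,
    % recovering density, momentum, and temperature at low order; the
    % Letter's identity concerns moments of the collision operator
    % acting on such expansions.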
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
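A toy sketch of the probabilistic branching that gives each scenario a likelihood estimate follows; the branch events and probabilities are invented for illustration and are unrelated to the study's actual runway incursion model.

    # Hedged sketch: enumerate event-ordering scenarios with likelihoods,
    # in the spirit of probabilistic branching in scenario trees.
    def expand(scenario, prob, branch_points, out):
        if not branch_points:
            out.append((scenario, prob))
            return
        event, outcomes = branch_points[0]
        for outcome, p in outcomes.items():
            expand(scenario + [(event, outcome)], prob * p,
                   branch_points[1:], out)

    # Hypothetical branch points and probabilities.
    branch_points = [
        ("tower_instruction", {"correct": 0.99, "erroneous": 0.01}),
        ("pilot_response",    {"complies": 0.95, "deviates": 0.05}),
    ]
    scenarios = []
    expand([], 1.0, branch_points, scenarios)
    for s, p in scenarios:
        print(f"{p:.4f}", s)   # each scenario carries its likelihood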
On multi-site damage identification using single-site training data
NASA Astrophysics Data System (ADS)
Barthorpe, R. J.; Manson, G.; Worden, K.
2017-11-01
This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged-state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology advances the state of the art in multi-site damage identification, where existing approaches require either: (1) full damaged-state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site model predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure, a Piper Tomahawk trainer aircraft wing, with its performance compared to classifiers trained using the full damaged-state dataset.
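A minimal sketch of the "sequence of binary decisions" idea using scikit-learn Support Vector Classifiers follows; the features, data, and score-combination rule are synthetic stand-ins for the paper's damage-sensitive features and classifier-combination methods.

    # Hedged sketch: one binary SVM per single-site damage case, combined
    # into a multi-site locator. Data are synthetic.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_sites, n_per = 3, 50
    X = np.vstack([rng.normal(loc=i, scale=0.5, size=(n_per, 4))
                   for i in range(n_sites)])
    y = np.repeat(np.arange(n_sites), n_per)

    # Train each classifier only on "site i damaged vs. not" labels.
    clfs = [SVC(probability=True).fit(X, (y == i).astype(int))
            for i in range(n_sites)]

    def locate(x):
        """Combine binary scores; a multi-site pattern may fire several."""
        scores = np.array([c.predict_proba(x.reshape(1, -1))[0, 1]
                           for c in clfs])
        return np.where(scores > 0.5)[0]

    print(locate(X[0]), locate(X[75]))  # recovers sites 0 and 1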
ERIC Educational Resources Information Center
Moore, Gwendolyn B.; And Others
The Privacy Act of 1974 places restrictions on the Federal, state and local agencies' use of the Social Security account number as an identifier. For some agencies, compliance will involve changes in implementation of retrieval algorithms. This report describes methodology applicable to these changes in the more general context of the problem of…
ERIC Educational Resources Information Center
Rivera-Navarro, Jesús; Cubo, Esther; Almazán, Javier
2014-01-01
This article analyzes the perceptions of Spanish health professionals, children with Tourette's Syndrome (TS) and their parents about social, school and family problems related to the disorder. A qualitative research methodology was used involving Focus Groups (FGs) made up of children with TS (× 2 FGs), parents/caregivers of persons with TS (× 2…
Arolt, V; Rothermundt, M; Peters, M; Leonard, B
2002-01-01
There is convincing evidence that cytokines are involved in the physiology and pathophysiology of brain function and interact with different neurotransmitter and neuroendocrine pathways. The possible involvement of the immune system in the neurobiological mechanisms that underlie psychiatric disorders has attracted increasing attention in recent years. Thus in the last decade, numerous clinical studies have demonstrated dysregulated immune functions in patients with psychiatric disorders. Such findings formed the basis of the 7th Expert Meeting on Psychiatry and Immunology in Muenster, Germany, where a consensus symposium was held to consider the strengths and weaknesses of current research in psychoneuroimmunology. Following a general overview of the field, the following topics were discussed: (1) methodological problems in laboratory procedures and recruitment of clinical samples; (2) the importance of pre-clinical research and animal models in psychiatric research; (3) the problem of statistical vs biological relevance. It was concluded that, despite a fruitful proliferation of research activities throughout the last decade, the continuous elaboration of methodological standards including the implementation of hypothesis-driven research represents a task that is likely to prove crucial for the future development of immunology research in clinical psychiatry.
Semicompeting risks in aging research: methods, issues and needs
Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen
2015-01-01
A semicompeting risks problem involves two types of events: a nonterminal and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of the multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (L models). These advances notwithstanding, the semicompeting risks methodology has not penetrated biomedical research, in general, and gerontological research, in particular. Some possible reasons for this lack of uptake are: the methods are relatively new and sophisticated, the conceptual problems associated with potential failure time models are difficult to overcome, a paucity of expository articles aimed at educating practitioners, and the non-availability of readily usable software. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of the semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of the semicompeting risks methodology by the broader biomedical research community. PMID:24729136
Innovations in design and technology. The story of hip arthroplasty.
Amstutz, H C
2000-09-01
The current study reviews the early history of surgeon-initiated trial-and-error development in hip joint arthroplasty and the subsequent methodological evolution to proper criteria for hypothesis testing using bioengineers and other research scientists. The interplay and relationships of industry, universities, scientific organizations, and the Food and Drug Administration with respect to device development in hip arthroplasty are reviewed. The ethics of and responsibilities to involved parties are outlined, citing the history of many contemporary developments. Examples are provided from the evolution and introduction of unsuccessful innovations, and the problems inherent in the current methodology of the Food and Drug Administration approval process using the 510(k), Investigational Device Exemption, and Premarket Approval protocols. The pros and cons of randomized trials for devices are outlined, with the conclusion that they are not appropriate for device introduction. The proper, rational methodology for the introduction of new devices is a phased-in clinical trial process after pertinent bench testing. Finally, the ethical dilemmas created by managed care are addressed. Industry involvements of the surgeon-spokesmen are cited.
The Development of a "Neighborhood in Solidarity" in Switzerland.
Zwygart, Marion; Plattet, Alain; Ammor, Sarah
2017-01-01
This article presents a case study based on the "Neighborhood in Solidarity" (NS) methodology to illustrate its application in a locality of 8,000 inhabitants in Switzerland. This specific project is proposed to exemplify the global aim of the NS methodology. That aim is to increase the integration of elderly persons in societies in order to improve their quality of life. The case study demonstrates the enhancement of the capacity of the older people to remain actively engaged in their neighborhood. The article focuses on the creation of an autonomous community of empowered older people who can resolve their own problems after a 5-year project. The construction of the local community is presented throughout the six steps of the methodology: (1) preliminary analysis, (2) diagnostic, (3) construction, (4) project design, (5) project implementation, and (6) empowerment and with three degrees of involvement (community, participative, and integrative involvement). Performance and output indicators, quality indicators, and social determinants of health assess the development of the local project. The impacts of the projects which are illustrated in this specific example motivated this publication to inspire practitioners from other countries.
Atmospheric Tracer Inverse Modeling Using Markov Chain Monte Carlo (MCMC)
NASA Astrophysics Data System (ADS)
Kasibhatla, P.
2004-12-01
In recent years, there has been an increasing emphasis on the use of Bayesian statistical estimation techniques to characterize the temporal and spatial variability of atmospheric trace gas sources and sinks. The applications have been varied in terms of the particular species of interest, as well as in terms of the spatial and temporal resolution of the estimated fluxes. However, one common characteristic has been the use of relatively simple statistical models for describing the measurement and chemical transport model error statistics and prior source statistics. For example, multivariate normal probability distribution functions (pdfs) are commonly used to model these quantities, and inverse source estimates are derived for fixed values of pdf parameters. While the advantage of this approach is that closed-form analytical solutions for the a posteriori pdfs of interest are available, it is worth exploring Bayesian analysis approaches which allow for a more general treatment of error and prior source statistics. Here, we present an application of the Markov Chain Monte Carlo (MCMC) methodology to an atmospheric tracer inversion problem to demonstrate how more general statistical models for errors can be incorporated into the analysis in a relatively straightforward manner. The MCMC approach to Bayesian analysis, which has found wide application in a variety of fields, is a statistical simulation approach that involves computing moments of interest of the a posteriori pdf by efficiently sampling this pdf. The specific inverse problem that we focus on is the annual mean CO2 source/sink estimation problem considered by the TransCom3 project. TransCom3 was a collaborative effort involving various modeling groups and followed a common modeling and analysis protocol. As such, this problem provides a convenient case study to demonstrate the applicability of the MCMC methodology to atmospheric tracer source/sink estimation problems.
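A minimal Metropolis-Hastings sketch for a toy linear source-estimation problem of the form y = Hs + noise follows; the transport matrix, data, and priors are synthetic and unrelated to the TransCom3 setup.

    # Hedged sketch: random-walk Metropolis sampling of the posterior of
    # a small linear inverse problem. All numbers are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    H = rng.normal(size=(20, 3))              # synthetic transport Jacobian
    s_true = np.array([2.0, -1.0, 0.5])
    y = H @ s_true + rng.normal(scale=0.1, size=20)

    def log_post(s, sigma=0.1, prior_sd=10.0):
        """Gaussian likelihood plus a broad Gaussian prior on sources."""
        return (-0.5 * np.sum((y - H @ s)**2) / sigma**2
                - 0.5 * np.sum(s**2) / prior_sd**2)

    s = np.zeros(3)
    samples = []
    for _ in range(20000):
        prop = s + rng.normal(scale=0.05, size=3)   # random-walk proposal
        if np.log(rng.random()) < log_post(prop) - log_post(s):
            s = prop                                # accept
        samples.append(s)

    print(np.mean(samples[5000:], axis=0))  # posterior mean ~ s_true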
Common Methodological Problems in Research on the Addictions.
ERIC Educational Resources Information Center
Nathan, Peter E.; Lansky, David
1978-01-01
Identifies common problems in research on the addictions and offers suggestions for remediating these methodological problems. The addictions considered include alcoholism and drug dependencies. Problems considered are those arising from inadequate, incomplete, or biased reviews of relevant literatures and methodological shortcomings of subject…
The problem drinker's lived experience of suffering: an exploration using hermeneutic phenomenology.
Smith, B A
1998-01-01
Research in the area of problem drinking has traditionally relied on quantitative methodologies which view the problem from the researcher's perspective. The purpose of this hermeneutic-phenomenological study was to describe and understand the problem drinker's lived experience of suffering using a philosophy and research approach which preserves the uniqueness of the experience from the sufferer's point of view. The method involved conducting in-depth interviews with a sample of six problem drinkers. Interviews were analysed using an interpretive process, which aimed at generating a deeper understanding of the topic by facilitating a fusion of the world views of both participant and researcher. A reflexive journal recorded the involvement of the self of the researcher throughout the research process. Suffering was viewed as a spiralling vicious circle of physical, psychological, social and spiritual distress. Symptoms of physical dependence, shame and guilt emerged strongly as being both sequelae of heavy drinking and cues to further drinking bouts. Evoking memories of previous suffering through telling one's story was found to be an empowering and motivating force. The results have relevance to specialist and generic workers, who are urged to pay greater attention to the social, psychological and spiritual care of problem drinkers.
Artistic image analysis using graph-based learning approaches.
Carneiro, Gustavo
2013-08-01
We introduce a new methodology for the problem of artistic image analysis, which among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing the similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation that is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to a more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.
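For orientation, the sketch below shows textbook label propagation on a small similarity graph; the paper's inverted label propagation formulation and its artistic-influence graph are not reproduced.

    # Hedged sketch: standard label propagation on a toy similarity graph.
    import numpy as np

    W = np.array([[0.0, 0.9, 0.1, 0.0],   # symmetric similarities (toy)
                  [0.9, 0.0, 0.2, 0.0],
                  [0.1, 0.2, 0.0, 0.8],
                  [0.0, 0.0, 0.8, 0.0]])
    P = np.diag(1.0 / W.sum(axis=1)) @ W   # row-stochastic propagation matrix

    Y = np.array([[1, 0], [0, 0], [0, 0], [0, 1.0]])  # two labeled nodes
    F = Y.copy()
    alpha = 0.8
    for _ in range(100):
        F = alpha * P @ F + (1 - alpha) * Y  # propagate, softly clamp seeds

    print(F.argmax(axis=1))  # predicted class per image node: [0 0 1 1]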
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
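The post-processing step at the heart of the approach, pooling candidate solutions and filtering out dominated ones, can be sketched as follows (for minimization); the paper's theoretical conditions on how many k-best solutions suffice are not checked here.

    # Hedged sketch: non-dominated filtering over candidates pooled from
    # the k-best lists of each single-objective problem. Values invented.
    def dominates(a, b):
        """a dominates b if it is no worse in all objectives, better in one."""
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    def pareto_front(candidates):
        return [c for c in candidates
                if not any(dominates(o, c) for o in candidates if o != c)]

    # Hypothetical (obj1, obj2) values of k-best solutions per objective.
    k_best_obj1 = [(1, 9), (2, 7), (3, 8)]
    k_best_obj2 = [(8, 1), (6, 2), (7, 3)]
    print(pareto_front(k_best_obj1 + k_best_obj2))
    # -> [(1, 9), (2, 7), (8, 1), (6, 2)]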
Verification of a Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos; Toniolo, Matthew D.; Karlgaard, Christopher; Pamadi, Bandu N.
2008-01-01
This paper discusses the verification of the Constraint Force Equation (CFE) methodology and its implementation in the Program to Optimize Simulated Trajectories II (POST2) for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint; the second case involves two rigid bodies connected with a universal joint; and the third test case is that of Mach 7 separation of the Hyper-X vehicle. For the first two cases, the POST2/CFE solutions compared well with those obtained using industry standard benchmark codes, namely AUTOLEV and ADAMS. For the Hyper-X case, the POST2/CFE solutions were in reasonable agreement with the flight test data. The CFE implementation in POST2 facilitates the analysis and simulation of stage separation as an integral part of POST2 for seamless end-to-end simulations of launch vehicle trajectories.
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems should ideally spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain, and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. Method comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. This paper also shows that the proposed method turned out to have better overall accuracy than the other two methods. Besides the experimental results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over classic autocalibration methodologies because it impacts the design process of intelligent sensors, autocalibration methodologies, and their associated factors, like time and cost.
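A minimal sketch of ANN-based autocalibration follows, using a small scikit-learn multilayer perceptron: a synthetic sensor with offset, gain variation, and nonlinearity is inverted from a handful of calibration points. The network topology and training setup are illustrative, not those studied in the paper.

    # Hedged sketch: learn the inverse sensor response from calibration
    # points. The sensor model below is invented.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    true_temp = np.linspace(0, 100, 21)                  # calibration points
    raw = 0.8 * true_temp + 5 + 0.002 * true_temp**2     # offset, gain, nonlinearity

    net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=20000, random_state=0)
    net.fit(raw.reshape(-1, 1), true_temp)               # raw reading -> true value

    reading = 0.8 * 50 + 5 + 0.002 * 50**2               # raw output at 50 degrees
    print(net.predict([[reading]]))                      # ~50 after calibration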
ERIC Educational Resources Information Center
Jelicic, Helena; Phelps, Erin; Lerner, Richard M.
2010-01-01
The study of adolescent development rests on methodologically appropriate collection and interpretation of longitudinal data. While all longitudinal studies of adolescent development involve missing data, the methods to treat missingness that have been recommended most often focus on missing data from cross-sectional studies. The problems of…
A time-parallel approach to strong-constraint four-dimensional variational data assimilation
NASA Astrophysics Data System (ADS)
Rao, Vishwas; Sandu, Adrian
2016-05-01
A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple sub-intervals, which allows parallelization of cost function and gradient computations. The solutions to the continuity equations across interval boundaries are added as constraints. The augmented Lagrangian approach leads to a different formulation of the variational data assimilation problem than the weakly constrained 4D-Var. A combination of serial and parallel 4D-Vars to increase performance is also explored. The methodology is illustrated on data assimilation problems involving the Lorenz-96 and the shallow water models.
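In LaTeX, the construction can be sketched as follows; the symbols (sub-interval costs J_i, continuity mismatches c_i, multipliers lambda_i, penalty rho, sub-interval models M_i) are illustrative notation, not the authors' exact formulation.

    % Hedged sketch of an augmented Lagrangian over sub-intervals.
    \mathcal{L}_{\rho} \;=\; \sum_i J_i(x_i)
      \;+\; \sum_i \lambda_i^{\mathsf T} c_i
      \;+\; \frac{\rho}{2} \sum_i \lVert c_i \rVert^2,
    \qquad
    c_i \;=\; x_{i+1}(t_i) \;-\; \mathcal{M}_i\big(x_i(t_i)\big)
    % \mathcal{M}_i propagates the state of sub-interval i to its right
    % boundary; driving each c_i to zero restores continuity, while the
    % J_i and their gradients can be evaluated in parallel.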
Ethical problems in clinical psychopharmacology.
Müller-Oerlinghausen, B
1978-10-01
The present article originates from some intriguing problems which the author, working as a clinical pharmacologist and psychiatrist, was faced with during clinical investigations. Practical difficulties appearing at first glance to be of a rather methodological nature often reveal themselves as ethical questions. The investigation of psychotropic drugs in normal volunteers as well as in psychiatric patients is taken as a model to exemplify certain fundamental ethical aspects of medical research. It is emphasized that the "solution" of ethical problems cannot be achieved by referring to a given code of norms, which themselves depend on certain historical circumstances, but rather by recognizing and reasoning about the conflicts which result from various moral maxims. Clinical psychopharmacology should not only be conscious of its methodological shortcomings and future goals but also accept the justification of discussions about the ethical and legal questions involved in its dealings and take an active part in these debates. With regard to the relationship between patient and investigator, "solidarity" [23], instead of ongoing paternalism or legal formalism, appears to be a realistic goal. This is also true in the area of psychopharmacological research.
A decentralized square root information filter/smoother
NASA Technical Reports Server (NTRS)
Bierman, G. J.; Belzer, M. R.
1985-01-01
A number of developments have recently led to considerable interest in the decentralization of linear least squares estimators. The developments are partly related to the impending emergence of VLSI technology, the realization of parallel processing, and the need for algorithmic ways to speed the solution of dynamically decoupled, high-dimensional estimation problems. A new method is presented for combining Square Root Information Filter (SRIF) estimates obtained from independent data sets. The new method involves an orthogonal transformation, and an information matrix filter 'homework' problem discussed by Schweppe (1973) is generalized. The employed SRIF orthogonal transformation methodology has been described by Bierman (1977).
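A sketch of the orthogonal-transformation combination step follows: two independent SRIF information arrays [R | z] (with R x approximately equal to z) are stacked and re-triangularized by QR factorization. The problem data are synthetic, and this is a generic information-array combination, not the report's specific algorithm.

    # Hedged sketch: fuse two independent SRIF estimates via QR.
    import numpy as np

    rng = np.random.default_rng(2)
    x_true = np.array([1.0, -2.0])

    def info_array(n_meas):
        """Build a local SRIF pair (R, z) from a local data set."""
        A = rng.normal(size=(n_meas, 2))
        y = A @ x_true + rng.normal(scale=0.01, size=n_meas)
        _, R = np.linalg.qr(np.column_stack([A, y]))
        return R[:2, :2], R[:2, 2]

    R1, z1 = info_array(5)   # estimator 1
    R2, z2 = info_array(5)   # estimator 2

    stacked = np.vstack([np.column_stack([R1, z1]),
                         np.column_stack([R2, z2])])
    _, T = np.linalg.qr(stacked)          # orthogonal combination step
    R, z = T[:2, :2], T[:2, 2]
    print(np.linalg.solve(R, z))          # fused estimate ~ x_true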
Program of policy studies in science and technology
NASA Technical Reports Server (NTRS)
Mayo, L. H.
1973-01-01
The application of an interdisciplinary, problem-oriented capability to the performance of total social impact evaluations is discussed. The consequences of introducing new configurations, technological or otherwise, into future social environments are presented. The primary characteristics of the program are summarized: (1) emphasis on interdisciplinary, problem-oriented analysis; (2) development of intra- and inter-institutional arrangements for the purpose of analyzing social problems, evaluating existing programs, and assessing the social impacts of prospective policies, programs, and other public actions; (3) focus on methodological approaches to the projection of alternative future social environments, the identification of the effects of the introduction of new policies, programs, or other actions into the social system, and the evaluation of the social impacts of such effects; (4) availability of analytical resources for advisory and research tasks, and provision for use of program facilities as a neutral forum for the discussion of public issues involving the impact of advancing technology on social value-institutional processes.
Continuation of probability density functions using a generalized Lyapunov approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baars, S., E-mail: s.baars@rug.nl; Viebahn, J.P., E-mail: viebahn@cwi.nl; Mulder, T.E., E-mail: t.e.mulder@uu.nl
Techniques from numerical bifurcation theory are very useful for studying transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small-noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
Cyran, Krzysztof A.
2018-01-01
This work considers the problem of utilizing electroencephalographic signals in systems designed for monitoring and enhancing the performance of aircraft pilots. Systems with such capabilities are generally referred to as cognitive cockpits. This article describes the potential carried by such systems, especially in terms of increasing flight safety, and presents the neuropsychological background of the problem. The research focused mainly on discriminating between states of brain activity related to idle but focused anticipation of a visual cue and the reaction to it. In particular, the problem of selecting a proper classification algorithm for such tasks is examined. For that purpose an experiment involving 10 subjects was planned and conducted. Experimental electroencephalographic data were acquired using an Emotiv EPOC+ headset. The proposed methodology involved the use of a popular method in biomedical signal processing, the Common Spatial Pattern, the extraction of bandpower features, and an extensive test of different classification algorithms: Linear Discriminant Analysis, k-nearest neighbors, Support Vector Machines with linear and radial basis function kernels, Random Forests, and Artificial Neural Networks. PMID:29849544
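A minimal sketch of such a pipeline follows: CSP spatial filters obtained from a generalized eigenproblem, log-variance (bandpower-like) features, and an LDA classifier. The EEG trials are synthetic stand-ins; the study's actual preprocessing, feature bands, and evaluation protocol are not reproduced.

    # Hedged sketch: CSP + log-variance features + LDA on synthetic data.
    import numpy as np
    from scipy.linalg import eigh
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(3)
    def trials(scale, n=40, ch=8, t=128):          # synthetic two-class EEG
        return rng.normal(size=(n, ch, t)) * scale.reshape(1, ch, 1)

    X1 = trials(np.linspace(1.0, 2.0, 8))          # class 1: "anticipation"
    X2 = trials(np.linspace(2.0, 1.0, 8))          # class 2: "reaction"

    C1 = np.mean([x @ x.T / x.shape[1] for x in X1], axis=0)
    C2 = np.mean([x @ x.T / x.shape[1] for x in X2], axis=0)
    _, Wcsp = eigh(C1, C1 + C2)                    # CSP generalized eigenvectors
    W = Wcsp[:, [0, 1, -2, -1]].T                  # most discriminative filters

    def feats(X):
        return np.array([np.log(np.var(W @ x, axis=1)) for x in X])

    Xf = np.vstack([feats(X1), feats(X2)])
    y = np.array([0] * len(X1) + [1] * len(X2))
    print(LinearDiscriminantAnalysis().fit(Xf, y).score(Xf, y))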
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Automated control of hierarchical systems using value-driven methods
NASA Technical Reports Server (NTRS)
Pugh, George E.; Burke, Thomas E.
1990-01-01
An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach to the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.
Large scale nonlinear programming for the optimization of spacecraft trajectories
NASA Astrophysics Data System (ADS)
Arrieta-Camacho, Juan Jose
Despite the availability of high-fidelity mathematical models, the computation of accurate optimal spacecraft trajectories has never been an easy task. While simplified models of spacecraft motion can provide useful estimates of energy requirements, sizing, and cost, the actual launch window and maneuver scheduling must rely on more accurate representations. We propose an alternative for the computation of optimal transfers that uses an accurate representation of the spacecraft dynamics. Like other methodologies for trajectory optimization, this alternative is able to consider all major disturbances. In contrast, it can handle equality and inequality constraints explicitly throughout the trajectory; it requires neither the derivation of costate equations nor the identification of the constrained arcs. The alternative consists of two steps: (1) discretizing the dynamic model using high-order collocation at Radau points, which offers numerical advantages, and (2) solving the resulting Nonlinear Programming (NLP) problem using an interior point method, which does not suffer from the performance bottleneck associated with identifying the active set, as required by sequential quadratic programming methods; in this way the methodology exploits the availability of sound numerical methods and next-generation NLP solvers. In practice the methodology is versatile; it can be applied to a variety of aerospace problems such as homing, guidance, and aircraft collision avoidance, and it is particularly well suited for low-thrust spacecraft trajectory optimization. Examples are presented that consider the optimization of a low-thrust orbit transfer subject to the main disturbances due to Earth's gravity field together with lunar and solar attraction. Another example considers the optimization of a multiple asteroid rendezvous problem. In both cases, the ability of the proposed methodology to consider non-standard objective functions and constraints is illustrated. Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such research. The collocation scheme and nonlinear programming algorithm presented in this work complement other existing methodologies by providing reliable and efficient numerical methods able to handle large-scale, nonlinear dynamic models.
Identification of asteroid dynamical families
NASA Technical Reports Server (NTRS)
Valsecchi, G. B.; Carusi, A.; Knezevic, Z.; Kresak, L.; Williams, J. G.
1989-01-01
Problems involved in the identification of asteroid dynamical families are discussed, and some methodological guidelines are presented. Asteroid family classifications are reviewed, and differences in the existing classifications are examined with special attention given to the effects of observational selection on the classification of family membership. The paper also discusses various theories of secular perturbations, including the classical linear theory, the theory of Williams (1969), and the higher order/degree theory of Yuasa (1973).
Fu, Hongyun; VanLandingham, Mark J
2012-05-01
Although the existing literature on immigrant mental health is extensive, major substantive and methodological gaps remain. Substantively, there is little population-based research that focuses on the mental health consequences of migration for Vietnamese Americans. More generally, although a wide range of mental health problems among immigrants has been identified, the potential causal or mediating mechanisms underlying these problems remain elusive. This latter substantive shortcoming is related to a key methodological challenge involving the potentially confounding effects of selection on migration-related outcomes. This article addresses these challenges by employing a "natural experiment" design, involving comparisons among three population-based samples of Vietnamese immigrants, never-leavers, and returnees (N=709). Data were collected in Ho Chi Minh City and in New Orleans between 2003 and 2005. The study investigates the long-term impact of international migration on Vietnamese mental health, and the potential mediating effects of social networks and physical health on these migration-related outcomes. The results reveal both mental health advantages and disadvantages among Vietnamese immigrants relative to the two groups of Vietnamese nationals. Selection can be ruled out for some of these differences, and both social networks and physical health are found to play important explanatory roles.
Probabilistic analysis of structures involving random stress-strain behavior
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Thacker, B. H.; Harren, S. V.
1991-01-01
The present methodology for analysis of structures with random stress-strain behavior characterizes the uniaxial stress-strain curve in terms of (1) elastic modulus, (2) engineering stress at initial yield, (3) initial plastic-hardening slope, (4) engineering stress at the point of ultimate load, and (5) engineering strain at the point of ultimate load. The methodology is incorporated into the Numerical Evaluation of Stochastic Structures Under Stress code for probabilistic structural analysis. The illustrative problem of a thick cylinder under internal pressure, where both the internal pressure and the stress-strain curve are random, is addressed by means of the code. The response value is the cumulative distribution function of the equivalent plastic strain at the inner radius.
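The Monte Carlo idea behind such probabilistic analyses can be sketched as follows: sample the random curve parameters and the random pressure, push each sample through a response model, and build the distribution of the response. The bilinear response model below is deliberately simplified (it uses three of the five curve parameters and an invented stress relation), so all numbers are illustrative only.

    # Hedged sketch: Monte Carlo propagation of random stress-strain
    # parameters through a toy thick-cylinder response model.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 10_000
    E   = rng.normal(200e9, 10e9, n)   # elastic modulus [Pa]
    s_y = rng.normal(250e6, 15e6, n)   # engineering stress at initial yield [Pa]
    H   = rng.normal(2e9, 0.2e9, n)    # initial plastic-hardening slope [Pa]
    p   = rng.normal(180e6, 20e6, n)   # internal pressure [Pa]

    # Invented simplification: take the governing stress at the inner
    # radius as ~2*p, with bilinear elastic-plastic strain response.
    stress = 2.0 * p
    eps = stress / E + np.maximum(stress - s_y, 0.0) / H

    print("strain quantiles:", np.quantile(eps, [0.5, 0.9, 0.99]))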
LES, DNS, and RANS for the Analysis of High-Speed Turbulent Reacting Flows
NASA Technical Reports Server (NTRS)
Colucci, P. J.; Jaberi, F. A.; Givi, P.
1996-01-01
A filtered density function (FDF) method suitable for chemically reactive flows is developed in the context of large eddy simulation. The advantage of the FDF methodology is its inherent ability to resolve subgrid-scale (SGS) scalar correlations that otherwise have to be modeled. Because of the lack of robust models to accurately predict these correlations in turbulent reactive flows, simulations involving turbulent combustion are often met with a degree of skepticism. The FDF methodology avoids the closure problem associated with these terms and treats the reaction in an exact manner. The scalar FDF approach is particularly attractive since it can be coupled with existing hydrodynamic computational fluid dynamics (CFD) codes.
[Conceptual and methodological issues involved in the research field of diagnostic reasoning].
Di Persia, Francisco N
2016-05-01
The psychopathological field is crossed by dilemmas that call into question its methodological, conceptual and philosophical filiations. From the early works of Ey and Jaspers to the recent work of Berrios, the position of psychopathology within medicine in general, and psychiatry in particular, has been in question, especially whether it should follow the principles of natural science or occupy an autonomous position between them. This debate has produced two opposing positions reflecting two different models of psychopathology: the biomedical model and the socio-constructionist model. This work reviews the scope and difficulties of each model along two central axes: diagnostic reasoning and the conceptual problem of mental illness. Then, as a synthesis of the analysis, central concepts of each model are identified that could allow the development of a hybrid model in psychopathology; among them, the comprehensive framework employed in symptom recognition and the social component that characterizes it are highlighted. In conclusion, these concepts are proposed as central aspects for the conceptual and methodological clarification of the research field of diagnostic reasoning in psychopathology.
Psychopathology in pediatric epilepsy: role of antiepileptic drugs.
Caplan, Rochelle
2012-01-01
Children with epilepsy are usually treated with antiepileptic drugs (AEDs). Some AEDs adversely affect behavior in susceptible children. Since psychiatric comorbidity is prevalent in pediatric epilepsy, this paper attempts to disentangle these AED side effects from the psychopathology associated with this illness. It first outlines the clinical and methodological problems involved in determining whether AEDs contribute to the behavior and emotional problems of children with epilepsy. It then presents research evidence for and against the role AEDs play in the psychopathology of children with epilepsy, and outlines how future studies might investigate this problem. A brief description of how to clinically separate AED effects from the complex illness-related and psychosocial factors that contribute to the behavior difficulties of children with epilepsy concludes the paper.
NASA Astrophysics Data System (ADS)
Gimenez, Juan M.; González, Leo M.
2015-03-01
In this paper, a new generation of the particle method known as the Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This variant, previously described in the literature as PFEM-2, is able to use larger time steps than other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems, and the main topic of this paper is a deeper validation of this methodology for a wider range of flows. To accomplish this task, different improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are reported, with the intention of presenting a general methodology, not restricted to a particular range of parameters, and capable of using large time steps. The results of the different free-surface problems solved, which include the Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared to well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.
Object oriented development of engineering software using CLIPS
NASA Technical Reports Server (NTRS)
Yoon, C. John
1991-01-01
Engineering applications involve numeric complexity and the manipulation of large amounts of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software has become larger and more complex, management of resources such as data, rather than numeric complexity, has become the major software design problem. Object-oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object-oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The object-oriented features of the CLIPS Object Oriented Language (COOL) are more versatile than those of C++. A software design methodology appropriate for engineering software, based on combined object-oriented and procedural approaches and implemented in CLIPS, is outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.
Argumentation: A Methodology to Facilitate Critical Thinking.
Makhene, Agnes
2017-06-20
Caring is a demanding nursing activity that engages the complex nature of the human being and requires complex decision-making and problem solving through the critical thinking process. Critical thinking must therefore be facilitated in education generally, and in nursing education particularly, in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory, descriptive and contextual design was used. A purposive sampling method was used to draw the sample, and the Miles and Huberman methodology of qualitative analysis was used to analyse the data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were applied. Following data analysis, the findings were integrated with the literature, culminating in the formulation of guidelines for using argumentation as a methodology to facilitate critical thinking.
Conceptualising the lack of health insurance coverage.
Davis, J B
2000-01-01
This paper examines the lack of health insurance coverage in the US as a public policy issue. It first compares the problem of health insurance coverage to the problem of unemployment to show that, in terms of the number of individuals affected, lack of health insurance is a problem comparable in importance to unemployment. Second, the paper discusses the methodology involved in measuring health insurance coverage, and argues that the current method of estimating the uninsured underestimates the extent to which individuals go without health insurance. Third, the paper briefly introduces Amartya Sen's functioning and capabilities framework to suggest a way of representing the extent to which individuals are uninsured. Fourth, the paper sketches a means of operationalizing the Sen representation of the uninsured in terms of the disability-adjusted life year (DALY) measure.
High-Order Moving Overlapping Grid Methodology in a Spectral Element Method
NASA Astrophysics Data System (ADS)
Merrill, Brandon E.
A moving overlapping mesh methodology that achieves spectral accuracy in space and up to second-order accuracy in time is developed for the solution of unsteady incompressible flow equations in three-dimensional domains. The targeted applications are in the aerospace and mechanical engineering domains and involve problems in turbomachinery, rotary aircraft, wind turbines and others. The methodology is built within the dual-session communication framework initially developed for stationary overlapping meshes. The methodology employs semi-implicit spectral element discretization of equations in each subdomain and explicit treatment of subdomain interfaces with spectrally-accurate spatial interpolation and high-order accurate temporal extrapolation, and requires few, if any, iterations, yet maintains the global accuracy and stability of the underlying flow solver. Mesh movement is enabled through the Arbitrary Lagrangian-Eulerian formulation of the governing equations, which allows for the prescription of arbitrary velocity values at discrete mesh points. The stationary and moving overlapping mesh methodologies are thoroughly validated using two- and three-dimensional benchmark problems in laminar and turbulent flows. The spatial and temporal global convergence, for both methods, is documented and is in agreement with the nominal order of accuracy of the underlying solver. The stationary overlapping mesh methodology was further validated to assess the influence of long integration times and inflow-outflow global boundary conditions on performance. In a benchmark of fully-developed turbulent pipe flow, the turbulence statistics are validated against the available data. Moving overlapping mesh simulations are validated on the problems of a two-dimensional oscillating cylinder and a three-dimensional rotating sphere. The aerodynamic forces acting on these moving rigid bodies are determined, and all results are compared with published data. Scaling tests, with both methodologies, show near-linear strong scaling, even for moderately large processor counts. The moving overlapping mesh methodology is utilized to investigate the effect of an upstream turbulent wake on a three-dimensional oscillating NACA0012 extruded airfoil. A direct numerical simulation (DNS) at Reynolds number 44,000 is performed for steady inflow incident upon the airfoil oscillating between angles of attack of 5.6° and 25° with reduced frequency k=0.16. Results are contrasted with a subsequent DNS of the same oscillating airfoil in a turbulent wake generated by a stationary upstream cylinder.
Breaking the Change Barrier: A 40 Year Analysis of Air Force Pilot Retention Solutions
national defense. A problem/solution research methodology using the organizational management theory of path dependence explored the implications of the...exodus is to start the incentive process earlier in the career and prior to the final decision to separate. Path dependent analysis indicates all prior... incentive options and personal involvement in the overall process. The Air Force can annually budget and forecast incentive requirements and personnel
Applications of polymeric smart materials to environmental problems.
Gray, H N; Bergbreiter, D E
1997-01-01
New methods for the reduction and remediation of hazardous wastes like carcinogenic organic solvents, toxic materials, and nuclear contamination are vital to environmental health. Procedures for effective waste reduction, detection, and removal are important components of any such methods. Toward this end, polymeric smart materials are finding useful applications. Polymer-bound smart catalysts are useful in waste minimization, catalyst recovery, and catalyst reuse. Polymeric smart coatings have been developed that are capable of both detecting and removing hazardous nuclear contaminants. Such applications of smart materials involving catalysis chemistry, sensor chemistry, and chemistry relevant to decontamination methodology are especially applicable to environmental problems.
Acoustic scattering on spheroidal shapes near boundaries
NASA Astrophysics Data System (ADS)
Miloh, Touvia
2016-11-01
A new expression for the Lamé product of prolate spheroidal wave functions is presented in terms of a distribution of multipoles along the axis of the spheroid between its foci (generalizing a corresponding theorem for spheroidal harmonics). Such an "ultimate" singularity system can be effectively used for solving various linear boundary-value problems governed by the Helmholtz equation involving prolate spheroidal bodies near planar or other boundaries. The general methodology is formally demonstrated for the axisymmetric acoustic scattering problem of a rigid (hard) spheroid placed near a hard/soft wall or inside a cylindrical duct under an axial incidence of a plane acoustic wave.
Zhukovsky, K
2014-01-01
We present a general method of operational nature to analyze and obtain solutions for a variety of equations of mathematical physics and related mathematical problems. We construct inverse differential operators and produce operational identities, involving inverse derivatives and families of generalised orthogonal polynomials, such as the Hermite and Laguerre polynomial families. We develop the methodology of inverse and exponential operators, employing them for the study of partial differential equations. Advantages of the operational technique, combined with the use of integral transforms, generating functions with exponentials and their integrals, for solving a wide class of partial differential equations related to heat, wave, and transport problems, are demonstrated.
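As a concrete instance of the exponential-operator technique, the heat problem u_t = u_xx with u(x, 0) = f(x) has the formal solution u = exp(t d^2/dx^2) f(x), realized by the Gauss-Weierstrass integral. A short SymPy check, with f(x) = x^2 chosen purely for illustration:

```python
import sympy as sp

x, xi = sp.symbols('x xi', real=True)
t = sp.symbols('t', positive=True)
f = xi**2                         # sample initial profile f(x) = x^2

# Gauss-Weierstrass kernel realizing exp(t d^2/dx^2) acting on f
kernel = sp.exp(-(x - xi)**2 / (4*t)) / sp.sqrt(4*sp.pi*t)
u = sp.simplify(sp.integrate(kernel * f, (xi, -sp.oo, sp.oo)))

# u should come out as x**2 + 2*t, which indeed solves u_t = u_xx
assert sp.simplify(sp.diff(u, t) - sp.diff(u, x, 2)) == 0
print(u)
```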
NASA Technical Reports Server (NTRS)
Fymat, A. L.; Kalaba, R. E.
1977-01-01
The original problem of anisotropic scattering in an atmosphere illuminated by a unidirectional source is replaced by an analogous formulation in which the incident light is omnidirectional. A radiative-transfer equation for the omnidirectional case is obtained in which the direction of illumination plays no role and the source-function analog, Sobolev's (1972) source function Phi^m, contains only a single integral term. For radiation incident on the top or the bottom of the atmosphere, this equation involves the functions b^m and h^m, respectively, with m corresponding to the order of the harmonic component of the scattered radiation field; these two functions are shown to reduce to a single function through simple reciprocity relations. The transfer problem is then reformulated for the function a^m, in which case the source-function analog (Sobolev's function D^m) involves the incident direction.
Development of a case tool to support decision based software development
NASA Technical Reports Server (NTRS)
Wild, Christian J.
1993-01-01
A summary of the accomplishments of the research over the past year is presented. Achievements include: demonstrated DHC, a prototype supporting the decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completed and submitted a paper describing the DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, a facility necessary for better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in solving the reengineering problem; and developed a primary chart of the problems involved in the reengineering process.
Neger, Emily N; Prinz, Ronald J
2015-07-01
Parental substance abuse is a serious problem affecting the well-being of children and families. The co-occurrence of parental substance abuse and problematic parenting is recognized as a major public health concern. This review focuses on 21 outcome studies that tested dual treatment of substance abuse and parenting. A summary of theoretical conceptualizations of the connections between substance abuse and parenting provides a backdrop for the review. Outcomes of the dual treatment studies were generally positive with respect to reduction of parental substance use and improvement of parenting. Research in this area varied in methodological rigor and needs to overcome challenges regarding design issues, sampling frame, and complexities inherent in such a high-risk population. This area of work can be strengthened by randomized controlled trials, use of mixed-methods outcome measures, consideration of parent involvement with child protective services, involvement of significant others in treatment, provision of concrete supports for treatment attendance and facilitative public policies. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification
NASA Astrophysics Data System (ADS)
Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.
MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance but not without drawbacks already discussed by the authors. On the other hand, a preliminary application of Genetic Algorithms (GA) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
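A minimal sketch of the genetic-algorithm ingredients, fitting one noisy Lorentzian peak (the Voigt line shape degenerates to a Lorentzian when its Gaussian width vanishes). The fitness here is a plain negative mean-squared error; the paper's adaptively defined, constrained fitness is richer, and all numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def lorentzian(x, amp, x0, w):
    return amp * w**2 / ((x - x0)**2 + w**2)

x = np.linspace(0, 10, 200)
signal = lorentzian(x, 1.0, 4.2, 0.35) + rng.normal(0, 0.05, x.size)
bounds = np.array([[0.1, 2.0], [0.0, 10.0], [0.05, 1.0]])  # amp, x0, w

def fitness(pop):
    model = lorentzian(x[None, :], *pop.T[:, :, None])
    return -np.mean((model - signal)**2, axis=1)

pop = rng.uniform(bounds[:, 0], bounds[:, 1], (60, 3))
for gen in range(100):
    f = fitness(pop)
    idx = rng.integers(0, 60, (60, 2))                   # binary tournament
    parents = pop[np.where(f[idx[:, 0]] > f[idx[:, 1]], idx[:, 0], idx[:, 1])]
    alpha = rng.random((60, 1))
    pop = alpha * parents + (1 - alpha) * parents[::-1]  # blend crossover
    pop += rng.normal(0, 0.02, pop.shape)                # mutation
    pop = np.clip(pop, bounds[:, 0], bounds[:, 1])       # constraint handling

print(pop[np.argmax(fitness(pop))])   # should approach (1.0, 4.2, 0.35)
```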
NASA Technical Reports Server (NTRS)
Taylor, Arthur C., III; Hou, Gene W.
1993-01-01
In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form), together with the well-known spatially split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to ill-conditioning of the coefficient matrix; this problem can be overcome when the equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.
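The incremental (delta) form is easy to state in code: the residual is always evaluated with the exact operator, while only a cheap approximate factorization is inverted for the correction, so the converged answer is independent of the approximation. Below, a diagonal matrix stands in for the spatially split approximate factorization, and the system is a small well-conditioned toy.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
A = 4 * np.eye(n) + 0.5 * rng.normal(size=(n, n)) / np.sqrt(n)
b = rng.normal(size=n)

M = np.diag(np.diag(A))         # stand-in for the approximate factorization
x = np.zeros(n)
for k in range(50):
    r = b - A @ x               # residual of the exact equations
    if np.linalg.norm(r) < 1e-12:
        break
    x += np.linalg.solve(M, r)  # incremental correction: M dx = r

print(np.linalg.norm(b - A @ x))   # converges to machine-level residual
```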
An overall index of environmental quality in coal mining areas and energy facilities.
Vatalis, Konstantinos I; Kaliampakos, Demetrios C
2006-12-01
An approach to measuring environmental quality and trends in coal mining and industrial areas was attempted in this work. For this purpose, the establishment of a reference scale characterizing the status of environmental quality is proposed, by developing an Environmental Quality Index (EQI). The methodology involves three main components: social research, the opinion of environmental experts, and the combination of new or existing indices. A survey of public opinion was carried out to identify the main environmental problems in the region of interest. Environmental experts were then surveyed, and the weights of specific environmental problems were obtained through a fuzzy Delphi method and pairwise comparison. Each environmental problem was quantified using new or existing indices (subindices) from the relevant literature. The EQI comprises a combination of the subindices with their respective weights. The methodology was applied to a heavily industrialized coal basin in northwestern Macedonia, Greece. The results show that the new index may be used as a reliable tool for evaluating environmental quality in different areas. In addition, the study of EQI trends on an interannual basis can provide useful information on the efficiency of environmental policies already implemented by the responsible authorities.
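The weighting-and-aggregation step can be sketched by reading the experts' pairwise-comparison data with the standard principal-eigenvector (AHP-style) method and combining subindices linearly; the paper's fuzzy Delphi treatment is richer, and every number below is invented.

```python
import numpy as np

# Pairwise-comparison matrix over three environmental problems
# (air quality, water quality, land disturbance); values are hypothetical
C = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(C)
w = np.abs(vecs[:, np.argmax(vals.real)].real)
w /= w.sum()                          # weights from the principal eigenvector

sub = np.array([0.62, 0.80, 0.45])    # subindex scores on a 0-1 scale (toy data)
EQI = float(w @ sub)                  # weighted combination of subindices
print(w.round(3), round(EQI, 3))
```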
The SIMRAND methodology - Simulation of Research and Development Projects
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1984-01-01
In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
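The simulation phase can be illustrated with a toy Monte Carlo comparison of candidate task sets under uncertain costs; SIMRAND's actual evaluation phase ranks alternatives by utility, and the distributions and goal below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Candidate task sets with hypothetical triangular cost distributions per task
task_sets = {"A": [(1.0, 1.5, 3.0), (0.5, 0.8, 1.2)],
             "B": [(0.8, 2.0, 2.5)]}
goal = 3.0   # project cost goal (toy number)

for name, tasks in task_sets.items():
    cost = sum(rng.triangular(lo, mode, hi, 100_000) for lo, mode, hi in tasks)
    print(name, f"P(cost <= goal) = {np.mean(cost <= goal):.3f}")
```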
Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1989-09-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Litchfield, J.W.; Watts, R.L.; Gurwell, W.E.
A materials assessment methodology for identifying specific critical material requirements that could hinder the implementation of solar energy has been developed and demonstrated. The methodology involves an initial screening process, followed by a more detailed materials assessment. The detailed assessment considers such materials concerns and constraints as: process and production constraints, reserve and resource limitations, lack of alternative supply sources, geopolitical problems, environmental and energy concerns, time constraints, and economic constraints. Data for 55 bulk and 53 raw materials are currently available in the database. These materials are required in the example photovoltaic systems. One photovoltaic system and thirteen photovoltaic cells, ten solar heating and cooling systems, and two agricultural and industrial process heat systems have been characterized to define their engineering and bulk material requirements.
Cognition of an expert tackling an unfamiliar conceptual physics problem
NASA Astrophysics Data System (ADS)
Schuster, David; Undreiu, Adriana
2009-11-01
We have investigated and analyzed the cognition of an expert tackling a qualitative conceptual physics problem of an unfamiliar type. Our goal was to elucidate the detailed cognitive processes and knowledge elements involved, irrespective of final solution form, and consider implications for instruction. The basic but non-trivial problem was to find qualitatively the direction of acceleration of a pendulum bob at various stages of its motion, a problem originally studied by Reif and Allen. Methodology included interviews, introspection, retrospection and self-reported metacognition. Multiple facets of cognition were revealed, with different reasoning strategies used at different stages and for different points on the path. An account is given of the zigzag thinking paths and interplay of reasoning modes and schema elements involved. We interpret the cognitive processes in terms of theoretical concepts that emerged, namely: case-based, principle-based, experiential-intuitive and practical-heuristic reasoning; knowledge elements and schemata; activation; metacognition and epistemic framing. The complexity of cognition revealed in this case study contrasts with the tidy principle-based solutions we present to students. The pervasive role of schemata, case-based reasoning, practical heuristic strategies, and their interplay with physics principles is noteworthy, since these aspects of cognition are generally neither recognized nor taught. The schema/reasoning-mode perspective has direct application in science teaching, learning and problem-solving.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ustinov, E. A., E-mail: eustinov@mail.wplus.net
This paper presents a refined technique to describe two-dimensional phase transitions in dense fluids adsorbed on a crystalline surface. Prediction of the parameters of 2D liquid–solid equilibrium is known to be an extremely challenging problem, mainly owing to the small difference in the thermodynamic functions of the coexisting phases and the limited accuracy of numerical experiments at high density; this has seriously limited various attempts to circumvent the problem. To improve this situation, a new methodology based on the kinetic Monte Carlo method was applied. The methodology involves analysis of equilibrium gas–liquid and gas–solid systems undergoing an external potential, which allows gradual shifting of the parameters of the phase coexistence. The interrelation of the chemical potential and tangential pressure for each system is then treated with the Gibbs–Duhem equation to obtain the point of intersection corresponding to liquid–solid equilibrium coexistence. The methodology is demonstrated on the krypton–graphite system below and above the 2D critical temperature. Using experimental data on the liquid–solid and the commensurate–incommensurate transitions in the krypton monolayer derived from adsorption isotherms, the Kr–graphite Lennard–Jones parameters have been corrected, resulting in a higher periodic potential modulation.
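The final step, locating the intersection of the two chemical-potential curves, reduces to root finding on interpolated (tangential pressure, chemical potential) data from the two simulation series; the curves below are invented stand-ins for the kinetic Monte Carlo output.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.optimize import brentq

P = np.linspace(1.0, 5.0, 9)            # tangential pressure samples
mu_liquid = -10.0 + 1.20 * np.log(P)    # toy Gibbs-Duhem-integrated curves
mu_solid  = -10.3 + 1.45 * np.log(P)

diff = interp1d(P, mu_liquid - mu_solid, kind='cubic')
P_coex = brentq(diff, P[0], P[-1])      # where mu_liquid(P) == mu_solid(P)
print(P_coex)                           # coexistence pressure of the toy model
```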
Appropriate methodologies for empirical bioethics: it's all relative.
Ives, Jonathan; Draper, Heather
2009-05-01
In this article we distinguish between philosophical bioethics (PB), descriptive policy-oriented bioethics (DPOB) and normative policy-oriented bioethics (NPOB). We argue that finding an appropriate methodology for combining empirical data and moral theory depends on the aims of the research endeavour, and that, for the most part, this combination is only required for NPOB. After briefly discussing the debate around the is/ought problem, and suggesting that the two sides of this debate misunderstand one another (i.e. one side treats it as a conceptual problem, whilst the other treats it as an empirical claim), we outline and defend a methodological approach to NPOB based on work we have carried out on a project exploring the normative foundations of paternal rights and responsibilities. We suggest that, given the prominent role already played by moral intuition in moral theory, one appropriate way to integrate empirical data and philosophical bioethics is to utilize empirically gathered lay intuition as the foundation for ethical reasoning in NPOB. The method we propose involves a modification of a long-established tradition of non-intervention in qualitative data gathering, combined with a form of reflective equilibrium where the demands of theory and data are given equal weight and a pragmatic compromise is reached.
Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) requirements to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility for the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.
Initial design and evaluation of automatic restructurable flight control system concepts
NASA Technical Reports Server (NTRS)
Weiss, J. L.; Looze, D. P.; Eterno, J. S.; Grunberg, D. B.
1986-01-01
Results of efforts to develop automatic control design procedures for restructurable aircraft control systems are presented. The restructurable aircraft control problem involves designing a fault-tolerant control system which can accommodate a wide variety of unanticipated aircraft failures. Under NASA sponsorship, many of the technologies which make such a system possible were developed and tested. Future work will focus on developing a methodology for integrating these technologies and on demonstration of a complete system.
2016-01-05
is we made progress, ex post facto, in terms of methodology development towards what DARPA recently called for, namely "improve[ing] our understanding...for their bases of activity. Prior quantitative research, by the P.I. and by others, has shown that tribalism is an important incubator of Islamist...undergraduate students partook in the research, and were involved in conceptualization of problems, quantitative and qualitative research, coding of
NASA Technical Reports Server (NTRS)
Shkarayev, S.; Krashantisa, R.; Tessler, A.
2004-01-01
An important and challenging technology aimed at the next generation of aerospace vehicles is that of structural health monitoring. The key problem is to determine accurately, reliably, and in real time the applied loads, stresses, and displacements experienced in flight, with such data establishing an information database for structural health monitoring. The present effort is aimed at developing a finite element-based methodology involving an inverse formulation that employs measured surface strains to recover the applied loads, stresses, and displacements in an aerospace vehicle in real time. The computational procedure uses a standard finite element model (i.e., "direct analysis") of a given airframe, with the subsequent application of the inverse interpolation approach. The inverse interpolation formulation is based on a parametric approximation of the loading and is further constructed through a least-squares minimization of calculated and measured strains. This procedure results in the governing system of linear algebraic equations, providing the unknown coefficients that accurately define the load approximation. Numerical simulations are carried out for problems involving various levels of structural approximation. These include plate-loading examples and an aircraft wing box. Accuracy and computational efficiency of the proposed method are discussed in detail. The experimental validation of the methodology by way of structural testing of an aircraft wing is also discussed.
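The least-squares core of the inverse interpolation can be shown directly: the strain responses of unit-amplitude load shapes form the columns of a matrix (random placeholders below, where a real application would use direct finite element results), and the load coefficients follow from a single linear solve.

```python
import numpy as np

rng = np.random.default_rng(4)

S = rng.normal(size=(120, 4))      # 120 strain gauges x 4 load parameters
p_true = np.array([2.0, -1.0, 0.5, 3.0])
eps_meas = S @ p_true + rng.normal(0, 0.01, 120)   # measured strains + noise

p_hat, *_ = np.linalg.lstsq(S, eps_meas, rcond=None)
print(p_hat)   # close to p_true; stresses and displacements follow from p_hat
```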
Evaluation of a primary school drug drama project: methodological issues and key findings.
Starkey, F; Orme, J
2001-10-01
This paper describes the impact evaluation of a primary school drug drama project developed by a health promotion service and a theatre's education department in England. The project targeted 10-11 year olds in 41 schools with an interactive drama production and workshop day on attitudes, choices, decisions and risks of alcohol, tobacco and illegal drug use. Parents were also involved in parents' evenings and watching children's performances. The research consisted of both process evaluation, consultation with pupils, teachers, parents, actors and health promotion staff on the project itself, and impact evaluation which looked at potential changes in children's knowledge, attitudes and decision-making skills. This paper reports findings of the impact evaluation, from six of the schools participating in the project. The impact evaluation consisted of pre- and post-project testing using a 'draw and write' and a problem-solving exercise. These findings suggest that the project had a significant impact on the children's knowledge of names of specific illegal drugs, and on their awareness that alcohol and cigarettes were also drugs, and secondly encouraged the children to think in less stereotypical terms about drugs and drug users. The problem-solving exercise, involving decision-making scenarios, showed small but positive trends between pre- and post-project solutions in more than half of the response categories. Methodological difficulties relating to evaluating such a project are discussed.
Valapour, Maryam; Paulson, Kristin M; Hilde, Alisha
2013-04-01
Publication is one of the primary rewards in the academic research community and is the first step in the dissemination of a new discovery that could lead to recognition and opportunity. Because of this, the publication of research can serve as a tacit endorsement of the methodology behind the science. This becomes a problem when vulnerable populations that are incapable of giving legitimate informed consent, such as prisoners, are used in research. The problem is especially critical in the field of transplant research, in which unverified consent can enable research that exploits the vulnerabilities of prisoners, especially those awaiting execution. Because the doctrine of informed consent is central to the protection of vulnerable populations, we have performed a historical analysis of the standards of informed consent in codes of international human subject protections to form the foundation for our limit and ban recommendations: (1) limit the publication of transplant research involving prisoners in general and (2) ban the publication of transplant research involving executed prisoners in particular. Copyright © 2013 American Association for the Study of Liver Diseases.
Evolutionary fuzzy modeling human diagnostic decisions.
Peña-Reyes, Carlos Andrés
2004-05-01
Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process, while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems both of high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool-called COBRA-for aiding radiologists in mammography interpretation.
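For readers unfamiliar with the fuzzy half of the combination, a one-rule toy classifier is sketched below; the feature, membership parameters and rule base are invented here, whereas Fuzzy CoCo evolves them cooperatively.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# IF size IS small THEN benign (0); IF size IS large THEN malignant (1)
x = 4.5                                   # hypothetical cell-size score
mu_small, mu_large = tri(x, 0, 1, 5), tri(x, 4, 10, 16)
score = (mu_small * 0.0 + mu_large * 1.0) / (mu_small + mu_large)
print(f"malignancy score: {score:.2f}")   # weighted rule aggregation
```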
Prat, P; Aulinas, M; Turon, C; Comas, J; Poch, M
2009-01-01
Current management of sanitation infrastructures (sewer systems, wastewater treatment plants, receiving waters, bypasses, deposits, etc.) is not fulfilling the objectives of up-to-date legislation: to achieve a good ecological and chemical status of water bodies through integrated management. This has made it necessary to develop new methodologies that help decision makers improve management in order to achieve that status. Decision Support Systems (DSS) based on the Multi-Agent System (MAS) paradigm are promising tools to improve integrated management. When all the different agents involved interact, important new knowledge emerges. This knowledge can be used to build better DSS and improve wastewater infrastructure management, achieving the objectives set by legislation. The paper describes a methodology to acquire this knowledge through a Role Playing Game (RPG). It begins with an introduction to the wastewater problem, a definition of RPG, and the relation between RPG and MAS. It then explains how the RPG was built, with two examples of game sessions and their results. The paper finishes with a discussion of the uses of this methodology and future work.
Benchmark Problems Used to Assess Computational Aeroacoustics Codes
NASA Technical Reports Server (NTRS)
Dahl, Milo D.; Envia, Edmane
2005-01-01
The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront in developing and promoting the development of CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.
Semi-Supervised Learning of Lift Optimization of Multi-Element Three-Segment Variable Camber Airfoil
NASA Technical Reports Server (NTRS)
Kaul, Upender K.; Nguyen, Nhan T.
2017-01-01
This chapter describes a new intelligent platform for learning optimal designs of morphing wings based on Variable Camber Continuous Trailing Edge Flaps (VCCTEF) in conjunction with a leading edge flap called the Variable Camber Krueger (VCK). The new platform consists of a Computational Fluid Dynamics (CFD) methodology coupled with a semi-supervised learning methodology. The CFD component of the intelligent platform comprises a full Navier-Stokes solution capability (NASA OVERFLOW solver with the Spalart-Allmaras turbulence model) that computes flow over a tri-element inboard NASA Generic Transport Model (GTM) wing section. Various VCCTEF/VCK settings and configurations were considered to explore optimal design for high-lift flight during take-off and landing. To determine the globally optimal design of such a system, an extremely large set of CFD simulations is needed. This is not feasible to achieve in practice. To alleviate this problem, recourse was taken to a semi-supervised learning (SSL) methodology, which is based on manifold regularization techniques. A reasonable space of CFD solutions was populated and then the SSL methodology was used to fit this manifold in its entirety, including the gaps in the manifold where there were no CFD solutions available. The SSL methodology in conjunction with an elastodynamic solver (FiDDLE) was demonstrated in an earlier study involving structural health monitoring. These CFD-SSL methodologies define the new intelligent platform that forms the basis for our search for optimal design of wings. Although the present platform can be used in various other design and operational problems in engineering, this chapter focuses on the high-lift study of the VCK-VCCTEF system. The top few candidate design configurations were identified by solving the CFD problem in a small subset of the design space. The SSL component was trained on the design space, and was then used in a predictive mode to populate a selected set of test points outside of the given design space. The new design test space thus populated was evaluated by using the CFD component to determine the error between the SSL predictions and the true (CFD) solutions, which was found to be small. This demonstrates the proposed CFD-SSL methodologies for isolating the best design of the VCK-VCCTEF system, and it holds promise for quantitatively identifying the best designs of flight systems in general.
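A compact sketch of the manifold-regularization idea behind the SSL component: predictions on unlabeled design points are obtained by penalizing roughness along a k-nearest-neighbor graph built over the whole design space. This is a generic Laplacian-regularized least-squares formulation, not NASA's implementation, and the data are synthetic.

```python
import numpy as np

def laplacian_rls(X, y, labeled, k=5, lam=0.1):
    """Solve (J + lam*L) f = J y, where L is a k-NN graph Laplacian and
    J masks the labeled points; f predicts on all points, labeled or not."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)
    W = np.zeros((n, n))
    nbr = np.argsort(d2, axis=1)[:, 1:k+1]
    for i in range(n):
        W[i, nbr[i]] = np.exp(-d2[i, nbr[i]])
    W = np.maximum(W, W.T)                      # symmetrize the graph
    L = np.diag(W.sum(1)) - W                   # graph Laplacian
    J = np.diag(labeled.astype(float))
    return np.linalg.solve(J + lam * L + 1e-9 * np.eye(n), J @ y)

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 2))                    # toy "design space"
y = np.where(np.arange(40) < 8, X[:, 0], 0.0)   # only 8 points carry labels
f = laplacian_rls(X, y, np.arange(40) < 8)
print(f[:5])
```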
Prada, Sergio I
2017-12-01
The Medicaid Drug Utilization Review (DUR) program is a 2-phase process conducted by Medicaid state agencies. The first phase is a prospective DUR and involves electronically monitoring prescription drug claims to identify prescription-related problems, such as therapeutic duplication, contraindications, incorrect dosage, or incorrect duration of treatment. The second phase is a retrospective DUR and involves ongoing and periodic examinations of claims data to identify patterns of fraud, abuse, underutilization, drug-drug interaction, or medically unnecessary care, implementing corrective actions when needed. The Centers for Medicare & Medicaid Services requires each state to measure prescription drug cost-savings generated from its DUR programs on an annual basis, but it provides no guidance or unified methodology for doing so. This article describes and synthesizes the methodologies used by states to measure cost-savings using their Medicaid retrospective DUR programs in federal fiscal years 2014 and 2015. For each state, the cost-savings methodologies included in the Medicaid DUR 2014 and 2015 reports were downloaded from Medicaid's website. The reports were then reviewed and synthesized. Methods described by the states were classified according to research designs often described in evaluation textbooks. In 2014, the prescription drug cost-savings estimation methodology most often used for the Medicaid retrospective DUR program was a simple pre-post intervention method without a comparison group (12 states). In 2015, the most common methodology was a pre-post intervention method with a comparison group (14 states). Comparisons of savings attributed to the program among states are still unreliable because no common methodology is available for measuring cost-savings. There is great variation among states in the methods used to measure prescription drug utilization cost-savings. This analysis suggests that there is still room for improvement in methodology transparency, which is important because a lack of transparency hinders states from learning from one another. Ultimately, the federal government needs to evaluate and improve its DUR program.
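The two designs named above differ by one subtraction; a toy difference-in-differences calculation (all dollar figures invented) makes the distinction concrete.

```python
# Average drug cost per member, before and after a retrospective DUR action
pre_t, post_t = 125.0, 104.0   # intervention group (hypothetical)
pre_c, post_c = 122.0, 115.0   # comparison group (hypothetical)

savings_prepost = pre_t - post_t                   # 2014-style estimate
savings_did = (pre_t - post_t) - (pre_c - post_c)  # 2015-style estimate
print(savings_prepost, savings_did)   # 21.0 vs 14.0: the comparison group
                                      # nets out the secular cost trend
```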
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
Finite-time H∞ control for linear continuous system with norm-bounded disturbance
NASA Astrophysics Data System (ADS)
Meng, Qingyi; Shen, Yanjun
2009-04-01
In this paper, the definition of finite-time H∞ control is presented. The system under consideration is subject to time-varying norm-bounded exogenous disturbance. The main aim of this paper is the design of a state feedback controller which ensures that the closed-loop system is finite-time bounded (FTB) and reduces the effect of the disturbance input on the controlled output to a prescribed level. A sufficient condition is presented for the solvability of this problem, which can be reduced to a feasibility problem involving linear matrix inequalities (LMIs). A detailed solution method is proposed for the restricted linear matrix inequalities. Finally, examples are given to show the validity of the methodology.
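For the simplest member of this LMI family, feasibility can be checked without a semidefinite-programming solver: the Lyapunov inequality A^T P + P A < 0 with P > 0 is feasible exactly when the corresponding Lyapunov equation returns a positive definite P. A SciPy sketch with a toy system (the paper's finite-time bounded conditions are richer LMIs and do call for an LMI solver):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])           # hypothetical closed-loop matrix

# Solve A^T P + P A = -Q for Q = I; P > 0 certifies LMI feasibility
P = solve_continuous_lyapunov(A.T, -np.eye(2))
print(np.linalg.eigvalsh(P))           # all positive -> feasible -> stable
```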
Object Transportation by Two Mobile Robots with Hand Carts.
Sakuyama, Takuya; Figueroa Heredia, Jorge David; Ogata, Taiki; Hara, Tatsunori; Ota, Jun
2014-01-01
This paper proposes a methodology by which two small mobile robots can grasp, lift, and transport large objects using hand carts. The specific problems involve generating robot actions and determining the hand cart positions to achieve the stable loading of objects onto the carts. These problems are solved using nonlinear optimization, and we propose an algorithm for generating robot actions. The proposed method was verified through simulations and experiments using actual devices in a real environment. The proposed method could reduce the number of robots required to transport large objects by 50-60%. In addition, we demonstrated the efficacy of this task in real environments where errors occur in robot sensing and movement.
NASA Technical Reports Server (NTRS)
Lakin, W. D.
1981-01-01
The use of integrating matrices in solving differential equations associated with rotating beam configurations is examined. In vibration problems, by expressing the equations of motion of the beam in matrix notation, utilizing the integrating matrix as an operator, and applying the boundary conditions, the spatial dependence is removed from the governing partial differential equations and the resulting ordinary differential equations can be cast into standard eigenvalue form. Integrating matrices are derived based on two dimensional rectangular grids with arbitrary grid spacings allowed in one direction. The derivation of higher dimensional integrating matrices is the initial step in the generalization of the integrating matrix methodology to vibration and stability problems involving plates and shells.
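A one-dimensional trapezoidal integrating matrix shows the operator idea: integration becomes a matrix-vector product that can be applied repeatedly and moved through matrix equations of motion. The paper derives higher-order, two-dimensional versions; this is the simplest instance.

```python
import numpy as np

def integrating_matrix(x):
    """Trapezoidal integrating matrix: (I @ f)[k] ~ integral of f over [x[0], x[k]]."""
    n = len(x)
    I = np.zeros((n, n))
    for k in range(1, n):
        h = np.diff(x[:k+1])
        I[k, 0] = h[0] / 2
        I[k, 1:k] = (h[:-1] + h[1:]) / 2
        I[k, k] = h[-1] / 2
    return I

x = np.linspace(0.0, 1.0, 101)
I = integrating_matrix(x)
print(np.max(np.abs(I @ (3 * x**2) - x**3)))   # ~5e-5: O(h^2) trapezoidal error
```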
Philosophical Aspects of Space Science
NASA Astrophysics Data System (ADS)
Poghosyan, Gevorg
2015-07-01
Modern astronomy and physics are closely related to philosophy. If in the past philosophy was largely confined to interpreting the results obtained by the natural sciences, at present it has become a full participant in the scientific research process. Philosophy is currently involved not only in the methodological problems of the natural sciences but also in the formulation of their general conclusions. In many cases, philosophical considerations allow a choice to be made between different physical hypotheses and assumptions. A unified approach to solving the problems of philosophy and the natural sciences becomes more important as the physical and philosophical aspects are often intertwined, forming a mold that defines our knowledge at today's leading edge.
Accurate quantum chemical calculations
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.
1989-01-01
An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.
The Movable Type Method Applied to Protein-Ligand Binding.
Zheng, Zheng; Ucisik, Melek N; Merz, Kenneth M
2013-12-10
Accurately computing the free energy for biological processes like protein folding or protein-ligand association remains a challenging problem. Both describing the complex intermolecular forces involved and sampling the requisite configuration space make understanding these processes innately difficult. Herein, we address the sampling problem using a novel methodology we term "movable type". Conceptually it can be understood by analogy with the evolution of printing and, hence, the name movable type. For example, a common approach to the study of protein-ligand complexation involves taking a database of intact drug-like molecules and exhaustively docking them into a binding pocket. This is reminiscent of early woodblock printing, where each page had to be laboriously created prior to printing a book. However, printing evolved to an approach where a database of symbols (letters, numerals, etc.) was created and then assembled using a movable type system, which allowed for the creation of all possible combinations of symbols on a given page, thereby revolutionizing the dissemination of knowledge. Our movable type (MT) method involves the identification of all atom pairs seen in protein-ligand complexes and the creation of two databases: one with their associated pairwise distance-dependent energies and another with the probabilities of how these pairs can combine in terms of bonds, angles, dihedrals, and non-bonded interactions. Combining these two databases, coupled with the principles of statistical mechanics, allows us to accurately estimate binding free energies as well as the pose of a ligand in a receptor. This method, by its mathematical construction, samples all of the configuration space of a selected region (here, the protein active site) in one shot, without resorting to brute-force sampling schemes involving Monte Carlo, genetic algorithms, or molecular dynamics simulations, making the methodology extremely efficient. Importantly, this method explores the free energy surface, eliminating the need to estimate the enthalpy and entropy components individually. Finally, low free energy structures can be obtained via a free energy minimization procedure, yielding all low free energy poses on a given free energy surface. Besides revolutionizing the protein-ligand docking and scoring problem, this approach can be utilized in a wide range of applications in computational biology which involve the computation of free energies for systems with extensive phase spaces, including protein folding, protein-protein docking, and protein design.
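The statistical-mechanics step, turning a pairwise distance-dependent energy table into a free-energy contribution by Boltzmann weighting over distance bins, can be sketched for a single atom-pair type; the Lennard-Jones-style table below is an invented stand-in for the knowledge-based tables described above.

```python
import numpy as np

kT = 0.593                           # kcal/mol at ~298 K
r = np.linspace(2.0, 8.0, 601)       # distance bins (Angstrom)
E = 4 * 0.2 * ((3.5 / r)**12 - (3.5 / r)**6)   # toy pair-energy table

Z = np.sum(np.exp(-E / kT)) * (r[1] - r[0])    # partition sum over the bins
F = -kT * np.log(Z)                  # pairwise free-energy contribution
print(F)
```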
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but this appears to be one of the major areas of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique provides a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.
Is manipulation of color effective in study of the global precedence effect?
Vidal-López, Joaquín; Romera-Vivancos, Juan Antonio
2009-04-01
This article evaluates the use of color manipulation in studying the global precedence effect and the possible involvement of the magnocellular processing system. The analysis shows that the color variations used in three studies produced changes in the global precedence effect, but findings based on this technique present some methodological problems and have little theoretical support from the magnocellular processing-system perspective. For this reason, more research is required to develop knowledge about the origin of these variations in global precedence.
Recent Advances in Modeling Hugoniots with Cheetah
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glaesemann, K R; Fried, L E
2005-07-26
We describe improvements to the Cheetah thermochemical-kinetics code's equilibrium solver to enable it to find a wider range of thermodynamic states. Cheetah supports a wide range of elements, condensed detonation products, and gas phase reactions. Therefore, Cheetah can be applied to a wide range of shock problems involving both energetic and non-energetic materials. An improved equation of state is also introduced. New experimental validations of Cheetah's equation of state methodology have been performed, including both reacted and unreacted Hugoniots.
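For orientation, a Hugoniot is the locus of end states reachable by a single shock from a given initial state; in standard notation it satisfies the Rankine-Hugoniot energy jump condition (a textbook relation stated here for context, not taken from the abstract):

\[ e - e_0 = \tfrac{1}{2}\,(p + p_0)\,(v_0 - v) \]

where \(e\) is specific internal energy, \(p\) pressure, \(v\) specific volume, and the subscript 0 denotes the unshocked state; an equation of state \(e(p,v)\), such as the one Cheetah provides, closes the relation.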
A perspectivist review of supermetallicity studies. II
NASA Astrophysics Data System (ADS)
Taylor, B. J.
A summary of indirect deductions is provided, taking into account a high-dispersion analysis of Delta Pav conducted by Rodgers (1969), a study of three K giant - F dwarf binaries performed by Deming and Butler (1979), and investigations involving the Hyades giants. Attention is given to an analysis of explanations, the analyses reported by Peterson (1976), the most recent results, future work on the VSL Giants, a summary of deficiencies in the methodology of supermetallicity, and the present state of the M67 problem.
Recent Advances in Modeling Hugoniots with Cheetah
NASA Astrophysics Data System (ADS)
Glaesemann, K. R.; Fried, L. E.
2006-07-01
We describe improvements to the Cheetah thermochemical-kinetics code's equilibrium solver to enable it to find a wider range of thermodynamic states. Cheetah supports a wide range of elements, condensed detonation products, and gas phase reactions. Therefore, Cheetah can be applied to a wide range of shock problems involving both energetic and non-energetic materials. An improved equation of state is also introduced. New experimental validations of Cheetah's equation of state methodology have been performed, including both reacted and unreacted Hugoniots.
The Role of Metaphysical Naturalism in Science
NASA Astrophysics Data System (ADS)
Mahner, Martin
2012-10-01
This paper defends the view that metaphysical naturalism is a constitutive ontological principle of science in that the general empirical methods of science, such as observation, measurement and experiment, and thus the very production of empirical evidence, presuppose a no-supernature principle. It examines the consequences of metaphysical naturalism for the testability of supernatural claims, and it argues that explanations involving supernatural entities are pseudo-explanatory due to the many semantic and ontological problems of supernatural concepts. The paper also addresses the controversy about metaphysical versus methodological naturalism.
[Problem-posing as a nutritional education strategy with obese teenagers].
Rodrigues, Erika Marafon; Boog, Maria Cristina Faber
2006-05-01
Obesity is a public health issue with relevant social determinants in its etiology and where interventions with teenagers encounter complex biopsychological conditions. This study evaluated intervention in nutritional education through a problem-posing approach with 22 obese teenagers, treated collectively and individually for eight months. Speech acts were collected through the use of word cards, observer recording, and tape-recording. The study adopted a qualitative methodology, and the approach involved content analysis. Problem-posing facilitated changes in eating behavior, triggering reflections on nutritional practices, family circumstances, social stigma, interaction with health professionals, and religion. Teenagers under individual care posed problems more effectively in relation to eating, while those under collective care posed problems in relation to family and psychological issues, with effective qualitative eating changes in both groups. The intervention helped teenagers understand their life history and determinants of eating behaviors, spontaneously implementing eating changes and making them aware of possibilities for maintaining the new practices and autonomously exercising their role as protagonists in their own health care.
Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T H; Seppelt, Ralf
2010-12-01
This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation, and problems with model integration require further attention. Furthermore, appropriate and methodologically sound stakeholder interaction, and the definition of 'what end-users really need and want', have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning among the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.
NASA Astrophysics Data System (ADS)
Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T. H.; Seppelt, Ralf
2010-12-01
This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation, and problems with model integration require further attention. Furthermore, appropriate and methodologically sound stakeholder interaction, and the definition of 'what end-users really need and want', have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning among the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.
Faux-Pas Test: A Proposal of a Standardized Short Version.
Fernández-Modamio, Mar; Arrieta-Rodríguez, Marta; Bengochea-Seco, Rosario; Santacoloma-Cabero, Iciar; Gómez de Tojeiro-Roce, Juan; García-Polavieja, Bárbara; González-Fraile, Eduardo; Martín-Carrasco, Manuel; Griffin, Kim; Gil-Sanz, David
2018-06-26
Previous research on theory of mind suggests that people with schizophrenia have difficulties with complex mentalization tasks that involve the integration of cognitive and affective mental states. One of the tools most commonly used to assess theory of mind is the Faux-Pas Test. However, it presents two main methodological problems: 1) the lack of a standard scoring system; and 2) the different versions are not comparable due to a lack of information on the stories used. These methodological problems make it difficult to draw conclusions about performance on this test by people with schizophrenia. The aim of this study was to develop a reduced version of the Faux-Pas Test with adequate psychometric properties. The test was administered to control and clinical groups. Inter-rater and test-retest reliability were analyzed for each story in order to select the set of 10 stories included in the final reduced version. The shortened version showed good psychometric properties for controls and patients: test-retest reliability of 0.97 and 0.78, inter-rater reliability of 0.95 and 0.87, and Cronbach's alpha of 0.82 and 0.72.
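As a side note on the internal-consistency figures reported here, Cronbach's alpha can be computed directly from a subjects-by-items score matrix; the sketch below uses invented data (the scoring scale and group size are illustrative assumptions, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_subjects, n_items) matrix of item scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Toy data: 6 respondents x 10 stories, hypothetical 0-2 scoring
rng = np.random.default_rng(0)
print(cronbach_alpha(rng.integers(0, 3, size=(6, 10))))
```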
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology applicable to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on the application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach, and the numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
Modified teaching approach for an enhanced medical physics graduate education experience
Rutel, IB
2011-01-01
Lecture-based teaching promotes a passive interaction with students. Opportunities to modify this format are available to enhance the overall learning experience for both students and instructors. A description of a discussion-based learning format is presented as it applies to a graduate curriculum with technical (formal mathematical derivation) topics. The presented hybrid method involves several techniques, including problem-based learning, modeling, and online lectures, eliminating didactic lectures. The results from an end-of-course evaluation show that the students appear to prefer the modified format over the more traditional methodology of “lecture only” contact time. These results are motivation for further refinement and continued implementation of the described methodology in the current course and potentially other courses within the department's graduate curriculum. PMID:22279505
Using Risk Assessment Methodologies to Meet Management Objectives
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2015-01-01
Corporate and program objectives focus on desired performance and results. Management decisions that affect how to meet these objectives now involve a complex mix of technology, safety issues, operations, process considerations, employee considerations, regulatory requirements, financial concerns and legal issues. Risk assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate or eliminate costly mistakes or catastrophic failures. Using a risk assessment methodology is only a starting point. A risk assessment program provides management with important input in the decision-making process. A pro-active organization looks to the future to avoid problems; a reactive organization can be blindsided by risks that could have been avoided. You get out what you put in: how useful your program is will be up to the individual organization.
Combining EEG and eye movement recording in free viewing: Pitfalls and possibilities.
Nikolaev, Andrey R; Meghanathan, Radha Nila; van Leeuwen, Cees
2016-08-01
Co-registration of EEG and eye movement has promise for investigating perceptual processes in free viewing conditions, provided certain methodological challenges can be addressed. Most of these arise from the self-paced character of eye movements in free viewing conditions. Successive eye movements occur within short time intervals. Their evoked activity is likely to distort the EEG signal during fixation. Due to the non-uniform distribution of fixation durations, these distortions are systematic, survive across-trials averaging, and can become a source of confounding. We illustrate this problem with effects of sequential eye movements on the evoked potentials and time-frequency components of EEG and propose a solution based on matching of eye movement characteristics between experimental conditions. The proposal leads to a discussion of which eye movement characteristics are to be matched, depending on the EEG activity of interest. We also compare segmentation of EEG into saccade-related epochs relative to saccade and fixation onsets and discuss the problem of baseline selection and its solution. Further recommendations are given for implementing EEG-eye movement co-registration in free viewing conditions. By resolving some of the methodological problems involved, we aim to facilitate the transition from the traditional stimulus-response paradigm to the study of visual perception in more naturalistic conditions. Copyright © 2016 Elsevier Inc. All rights reserved.
Self-adaptive MOEA feature selection for classification of bankruptcy prediction data.
Gaspar-Cunha, A; Recio, G; Costa, L; Estébanez, C
2014-01-01
Bankruptcy prediction is a vast area of finance and accounting whose importance lies in its relevance for creditors and investors in evaluating the likelihood that a company will go bankrupt. As companies become complex, they develop sophisticated schemes to hide their real situation. In turn, estimating the credit risk associated with counterparties, or predicting bankruptcy, becomes harder. Evolutionary algorithms have been shown to be an excellent tool for dealing with complex problems in finance and economics where a large number of irrelevant features are involved. This paper provides a methodology for feature selection in the classification of bankruptcy data sets using an evolutionary multiobjective approach that simultaneously minimises the number of features and maximises the classifier quality measure (e.g., accuracy). The proposed methodology makes use of self-adaptation by applying the feature selection algorithm while simultaneously optimising the parameters of the classifier used. The methodology was applied to four different sets of data. The obtained results showed the utility of using the self-adaptation of the classifier.
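The bi-objective trade-off at the heart of this formulation (fewer features versus lower classification error) can be sketched as follows; random search and a nearest-centroid classifier stand in for the paper's self-adaptive MOEA and tuned classifier, and all data here are synthetic:

```python
import numpy as np

def pareto_front(points):
    """Keep points not dominated when minimizing both objectives."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 15))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

def error_rate(mask):
    # toy nearest-centroid classifier on the selected features
    Xi = X[:, mask]
    c0, c1 = Xi[y == 0].mean(0), Xi[y == 1].mean(0)
    pred = np.linalg.norm(Xi - c1, axis=1) < np.linalg.norm(Xi - c0, axis=1)
    return (pred.astype(int) != y).mean()

candidates = []
for _ in range(300):                  # random search stands in for the MOEA
    mask = rng.random(15) < 0.5
    if mask.any():
        candidates.append((int(mask.sum()), error_rate(mask)))
print(sorted(pareto_front(candidates)))   # (n_features, error) trade-off curve
```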
NASA Astrophysics Data System (ADS)
Dodick, Jeff; Argamon, Shlomo; Chase, Paul
2009-08-01
A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective on the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were detected (blood, instruments and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Self-Adaptive MOEA Feature Selection for Classification of Bankruptcy Prediction Data
Gaspar-Cunha, A.; Recio, G.; Costa, L.; Estébanez, C.
2014-01-01
Bankruptcy prediction is a vast area of finance and accounting whose importance lies in its relevance for creditors and investors in evaluating the likelihood that a company will go bankrupt. As companies become complex, they develop sophisticated schemes to hide their real situation. In turn, estimating the credit risk associated with counterparties, or predicting bankruptcy, becomes harder. Evolutionary algorithms have been shown to be an excellent tool for dealing with complex problems in finance and economics where a large number of irrelevant features are involved. This paper provides a methodology for feature selection in the classification of bankruptcy data sets using an evolutionary multiobjective approach that simultaneously minimises the number of features and maximises the classifier quality measure (e.g., accuracy). The proposed methodology makes use of self-adaptation by applying the feature selection algorithm while simultaneously optimising the parameters of the classifier used. The methodology was applied to four different sets of data. The obtained results showed the utility of using the self-adaptation of the classifier. PMID:24707201
ERIC Educational Resources Information Center
Gauthier, Benoit; And Others
1997-01-01
Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)
NASA Astrophysics Data System (ADS)
Arsenault, Louis-Francois; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.
We present a machine learning-based statistical regression approach to the inversion of Fredholm integrals of the first kind by studying an important example for the quantum materials community, the analytical continuation problem of quantum many-body physics. It involves reconstructing the frequency dependence of physical excitation spectra from data obtained at specific points in the complex frequency plane. The approach provides a natural regularization in cases where the inverse of the Fredholm kernel is ill-conditioned and yields robust error metrics. The stability of the forward problem permits the construction of a large database of input-output pairs. Machine learning methods applied to this database generate approximate solutions which are projected onto the subspace of functions satisfying relevant constraints. We show that for low input noise the method performs as well or better than Maximum Entropy (MaxEnt) under standard error metrics, and is substantially more robust to noise. We expect the methodology to be similarly effective for any problem involving a formally ill-conditioned inversion, provided that the forward problem can be efficiently solved. AJM was supported by the Office of Science of the U.S. Department of Energy under Subcontract No. 3F-3138 and LFA by the Columbia University IDS-ROADS project, UR009033-05, which also provided partial support to RN and LH.
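The ill-conditioning that motivates the regression approach is easy to reproduce; the sketch below discretizes a smooth Fredholm kernel of the first kind and applies plain Tikhonov (ridge) regularization as the simplest stand-in for the statistical regularization described in the abstract (the kernel, noise level and grid are arbitrary choices for illustration):

```python
import numpy as np

# Discretized Fredholm integral of the first kind: g = K f + noise.
n = 100
x = np.linspace(0, 1, n)
K = np.exp(-np.subtract.outer(x, x) ** 2 / 0.02)   # smooth => ill-conditioned
f_true = np.sin(3 * np.pi * x)
g = K @ f_true + 1e-3 * np.random.default_rng(2).normal(size=n)

print(np.linalg.cond(K))                            # enormous condition number

# Naive inversion amplifies the noise; Tikhonov regularization damps it.
lam = 1e-3
f_ridge = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)
print(np.abs(f_ridge - f_true).max())               # stable reconstruction
```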
NASA Astrophysics Data System (ADS)
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. The Bayesian inferential methodology for source reconstruction is validated against real dispersion data for two cases involving contaminant dispersion in highly disturbed flows over urban and complex environments where the idealizations of horizontal homogeneity and/or temporal stationarity in the flow cannot be applied to simplify the problem. Furthermore, the methodology is applied to the case of reconstruction of multiple sources.
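A minimal sketch of the Bayesian machinery described here, with a one-dimensional toy forward model standing in for the adjoint source-receptor relationship (the plume shape, priors, noise model and sampler tuning are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D forward model: sensors record a Gaussian-plume-like signal from a
# source with unknown location s and emission rate q.
sensors = np.linspace(0.0, 10.0, 8)
def forward(s, q):
    return q * np.exp(-(sensors - s) ** 2 / 2.0)

true_s, true_q, sigma = 4.2, 3.0, 0.05
data = forward(true_s, true_q) + sigma * rng.normal(size=sensors.size)

def log_post(s, q):
    if not (0 <= s <= 10 and 0 < q < 10):   # flat prior on a box
        return -np.inf
    r = data - forward(s, q)
    return -0.5 * np.dot(r, r) / sigma**2

# Random-walk Metropolis sampling of the posterior over (s, q)
theta = np.array([5.0, 1.0])
lp = log_post(*theta)
chain = []
for _ in range(20000):
    prop = theta + 0.1 * rng.normal(size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
print(np.mean(chain[5000:], axis=0))        # posterior means near (4.2, 3.0)
```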
Reliable and efficient solution of genome-scale models of Metabolism and macromolecular Expression
Ma, Ding; Yang, Laurence; Fleming, Ronan M. T.; ...
2017-01-18
Currently, Constraint-Based Reconstruction and Analysis (COBRA) is the only methodology that permits integrated modeling of Metabolism and macromolecular Expression (ME) at genome-scale. Linear optimization computes steady-state flux solutions to ME models, but flux values are spread over many orders of magnitude. Data values also have greatly varying magnitudes. Furthermore, standard double-precision solvers may return inaccurate solutions or report that no solution exists. Exact simplex solvers based on rational arithmetic require a near-optimal warm start to be practical on large problems (current ME models have 70,000 constraints and variables and will grow larger). We also developed a quadruple-precision version of our linear and nonlinear optimizer MINOS, and a solution procedure (DQQ) involving Double and Quad MINOS that achieves reliability and efficiency for ME models and other challenging problems tested here. DQQ will enable extensive use of large linear and nonlinear models in systems biology and other applications involving multiscale data.
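The generic idea behind mixed-precision strategies like DQQ (solve in the working precision, then refine against a residual computed in higher precision) can be sketched as follows. This is not the DQQ algorithm itself, and numpy's longdouble is platform extended precision rather than true quad; the matrix is synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
A = rng.normal(size=(n, n)) + n * np.eye(n)
A *= np.logspace(-6, 6, n)[:, None]   # rows spanning many orders of magnitude
b = A @ np.ones(n)                    # exact solution is the all-ones vector

x = np.linalg.solve(A, b)             # double-precision solve
for _ in range(3):                    # iterative refinement, residual in
    r = b.astype(np.longdouble) - A.astype(np.longdouble) @ x   # extended precision
    x = x + np.linalg.solve(A, np.asarray(r, dtype=np.float64))
print(np.abs(x - 1).max())            # error shrinks with each refinement pass
```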
Reliable and efficient solution of genome-scale models of Metabolism and macromolecular Expression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Ding; Yang, Laurence; Fleming, Ronan M. T.
Currently, Constraint-Based Reconstruction and Analysis (COBRA) is the only methodology that permits integrated modeling of Metabolism and macromolecular Expression (ME) at genome-scale. Linear optimization computes steady-state flux solutions to ME models, but flux values are spread over many orders of magnitude. Data values also have greatly varying magnitudes. Furthermore, standard double-precision solvers may return inaccurate solutions or report that no solution exists. Exact simplex solvers based on rational arithmetic require a near-optimal warm start to be practical on large problems (current ME models have 70,000 constraints and variables and will grow larger). We also developed a quadruple-precision version of our linear and nonlinear optimizer MINOS, and a solution procedure (DQQ) involving Double and Quad MINOS that achieves reliability and efficiency for ME models and other challenging problems tested here. DQQ will enable extensive use of large linear and nonlinear models in systems biology and other applications involving multiscale data.
Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller
2018-02-01
Given the complexity of factors contributing to alcohol misuse, appropriate epistemologies and methodologies are needed to understand and intervene meaningfully. We aimed to (1) provide an overview of computational modeling methodologies, with an emphasis on system dynamics modeling; (2) explain how community-based system dynamics modeling can forge new directions in alcohol prevention research; and (3) present a primer on how to build alcohol misuse simulation models using system dynamics modeling, with an emphasis on stakeholder involvement, data sources and model validation. Throughout, we use alcohol misuse among college students in the United States as a heuristic example for demonstrating these methodologies. System dynamics modeling employs a top-down aggregate approach to understanding dynamically complex problems. Its three foundational properties (stocks, flows and feedbacks) capture non-linearity, time-delayed effects and other system characteristics. As a methodological choice, system dynamics modeling is amenable to participatory approaches; in particular, community-based system dynamics modeling has been used to build impactful models for addressing dynamically complex problems. The process of community-based system dynamics modeling consists of numerous stages: (1) creating model boundary charts, behavior-over-time graphs and preliminary system dynamics models using group model-building techniques; (2) model formulation; (3) model calibration; (4) model testing and validation; and (5) model simulation using learning-laboratory techniques. Community-based system dynamics modeling can provide powerful tools for policy and intervention decisions that can result ultimately in sustainable changes in research and action in alcohol misuse prevention. © 2017 Society for the Study of Addiction.
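Stocks, flows and feedback can be made concrete with a deliberately small sketch; the single-stock model below (the population size, contact rate and cessation fraction are all hypothetical) shows how a reinforcing exposure feedback and a draining outflow are integrated step by step:

```python
# Toy stock-and-flow model: one stock (students who misuse alcohol) with an
# inflow driven by social exposure (a reinforcing feedback) and a
# constant-fraction outflow (cessation), integrated by Euler steps.
N = 10000.0          # campus population (hypothetical)
misusers = 500.0     # initial stock
contact, quit_frac, dt = 0.3, 0.15, 0.25

trajectory = []
for step in range(int(40 / dt)):
    inflow = contact * misusers * (N - misusers) / N   # feedback via exposure
    outflow = quit_frac * misusers
    misusers += dt * (inflow - outflow)                # stock accumulates net flow
    trajectory.append(misusers)
print(round(trajectory[-1]))   # settles toward the feedback-determined equilibrium
```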
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS® and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
NASA Technical Reports Server (NTRS)
Taylor, Arthur C., III; Hou, Gene W.
1996-01-01
An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
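The 'delta' or correction form is generic enough to sketch in a few lines: an inexpensive approximate operator M (here a simple Jacobi diagonal, standing in for the approximate-factorization operator) is applied to the residual, and the correction is accumulated into the iterate; the matrix below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
A = rng.normal(size=(n, n)) + 4 * n * np.eye(n)   # diagonally dominant test matrix
b = rng.normal(size=n)
M = np.diag(np.diag(A))                           # cheap approximate operator

# Correction ('delta') form: solve M dx = b - A x, then update x += dx.
x = np.zeros(n)
for it in range(50):
    r = b - A @ x                                 # residual of the current iterate
    if np.linalg.norm(r) < 1e-10:
        break
    x += np.linalg.solve(M, r)
print(it, np.linalg.norm(b - A @ x))              # converged residual norm
```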
Structured Uncertainty Bound Determination From Data for Control and Performance Validation
NASA Technical Reports Server (NTRS)
Lim, Kyong B.
2003-01-01
This report attempts to document the broad scope of issues that must be satisfactorily resolved before one can expect to methodically obtain, with a reasonable confidence, a near-optimal robust closed loop performance in physical applications. These include elements of signal processing, noise identification, system identification, model validation, and uncertainty modeling. Based on a recently developed methodology involving a parameterization of all model validating uncertainty sets for a given linear fractional transformation (LFT) structure and noise allowance, a new software, Uncertainty Bound Identification (UBID) toolbox, which conveniently executes model validation tests and determine uncertainty bounds from data, has been designed and is currently available. This toolbox also serves to benchmark the current state-of-the-art in uncertainty bound determination and in turn facilitate benchmarking of robust control technology. To help clarify the methodology and use of the new software, two tutorial examples are provided. The first involves the uncertainty characterization of a flexible structure dynamics, and the second example involves a closed loop performance validation of a ducted fan based on an uncertainty bound from data. These examples, along with other simulation and experimental results, also help describe the many factors and assumptions that determine the degree of success in applying robust control theory to practical problems.
A content validity study of signs, symptoms and diseases/health problems expressed in LIBRAS
Aragão, Jamilly da Silva; de França, Inacia Sátiro Xavier; Coura, Alexsandro Silva; de Sousa, Francisco Stélio; Batista, Joana D'arc Lyra; Magalhães, Isabella Medeiros de Oliveira
2015-01-01
Objectives: to validate the content of signs, symptoms and diseases/health problems expressed in LIBRAS for people with deafness. Method: methodological development study, which involved 36 people with deafness and three LIBRAS specialists. The study was conducted in three stages: investigation of the signs, symptoms and diseases/health problems, referred to by people with deafness, reported in a questionnaire; video recordings of how people with deafness express, through LIBRAS, the signs, symptoms and diseases/health problems; and validation of the contents of the recordings of the expressions by LIBRAS specialists. Data were processed in a spreadsheet and analyzed using univariate tables, with absolute frequencies and percentages. The validation results were analyzed using the Content Validity Index (CVI). Results: 33 expressions in LIBRAS, of signs, symptoms and diseases/health problems, were evaluated, and 28 expressions obtained a satisfactory CVI (1.00). Conclusions: the signs, symptoms and diseases/health problems expressed in LIBRAS presented validity, in the study region, for health professionals, especially nurses, for use in the clinical anamnesis of the nursing consultation for people with deafness. PMID:26625991
A content validity study of signs, symptoms and diseases/health problems expressed in LIBRAS.
Aragão, Jamilly da Silva; de França, Inacia Sátiro Xavier; Coura, Alexsandro Silva; de Sousa, Francisco Stélio; Batista, Joana D'arc Lyra; Magalhães, Isabella Medeiros de Oliveira
2015-01-01
To validate the content of signs, symptoms and diseases/health problems expressed in LIBRAS for people with deafness. Method: Methodological development study, which involved 36 people with deafness and three LIBRAS specialists. The study was conducted in three stages: investigation of the signs, symptoms and diseases/health problems, referred to by people with deafness, reported in a questionnaire; video recordings of how people with deafness express, through LIBRAS, the signs, symptoms and diseases/health problems; and validation of the contents of the recordings of the expressions by LIBRAS specialists. Data were processed in a spreadsheet and analyzed using univariate tables, with absolute frequencies and percentages. The validation results were analyzed using the Content Validity Index (CVI). 33 expressions in LIBRAS, of signs, symptoms and diseases/health problems, were evaluated, and 28 expressions obtained a satisfactory CVI (1.00). The signs, symptoms and diseases/health problems expressed in LIBRAS presented validity, in the study region, for health professionals, especially nurses, for use in the clinical anamnesis of the nursing consultation for people with deafness.
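The Content Validity Index used here has a simple item-level form: the proportion of experts judging an expression valid. A minimal sketch (the rating scale and the agreement threshold are illustrative assumptions; the study's exact rubric is not given in the abstract):

```python
# Item-level Content Validity Index: the fraction of experts rating an
# expression as valid (e.g., 3 or 4 on a 4-point relevance scale).
def item_cvi(ratings, relevant=(3, 4)):
    return sum(r in relevant for r in ratings) / len(ratings)

# Three specialists rating one LIBRAS expression (hypothetical scores)
print(item_cvi([4, 4, 3]))   # 1.00 -> satisfactory, as in the study
```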
Polcin, Douglas L
Communities throughout the U.S. are struggling to find solutions for serious and persistent homelessness. Alcohol and drug problems can be causes and consequences of homelessness, as well as co-occurring problems that complicate efforts to succeed in finding stable housing. Two prominent service models exist: one, known as "Housing First", takes a harm reduction approach, while the other, known as the "linear" model, typically supports a goal of abstinence from alcohol and drugs. Despite their popularity, the research supporting these models suffers from methodological problems and inconsistent findings. One purpose of this paper is to describe systematic reviews of the homelessness services literature, which illustrate weaknesses in research designs and inconsistent conclusions about the effectiveness of current models. Problems among some of the seminal studies on homelessness include poorly defined inclusion and exclusion criteria, inadequate measures of alcohol and drug use, unspecified or poorly implemented comparison conditions, and a lack of procedures documenting adherence to service models. Several recent papers have suggested broader-based approaches for homeless services that integrate alternatives and respond better to consumer needs. Practical considerations for implementing a broader system of services are described, and peer-managed recovery homes are presented as examples of services that address some of the gaps in current approaches. Three issues are identified that need more attention from researchers: 1) improving upon the methodological limitations of current studies, 2) assessing the impact of broader-based, integrated services on outcomes, and 3) assessing approaches to the service needs of homeless persons involved in the criminal justice system.
Polcin, Douglas L.
2016-01-01
Communities throughout the U.S. are struggling to find solutions for serious and persistent homelessness. Alcohol and drug problems can be causes and consequences of homelessness, as well as co-occurring problems that complicate efforts to succeed in finding stable housing. Two prominent service models exist: one, known as “Housing First”, takes a harm reduction approach, while the other, known as the “linear” model, typically supports a goal of abstinence from alcohol and drugs. Despite their popularity, the research supporting these models suffers from methodological problems and inconsistent findings. One purpose of this paper is to describe systematic reviews of the homelessness services literature, which illustrate weaknesses in research designs and inconsistent conclusions about the effectiveness of current models. Problems among some of the seminal studies on homelessness include poorly defined inclusion and exclusion criteria, inadequate measures of alcohol and drug use, unspecified or poorly implemented comparison conditions, and a lack of procedures documenting adherence to service models. Several recent papers have suggested broader-based approaches for homeless services that integrate alternatives and respond better to consumer needs. Practical considerations for implementing a broader system of services are described, and peer-managed recovery homes are presented as examples of services that address some of the gaps in current approaches. Three issues are identified that need more attention from researchers: (1) improving upon the methodological limitations of current studies, (2) assessing the impact of broader-based, integrated services on outcomes, and (3) assessing approaches to the service needs of homeless persons involved in the criminal justice system. PMID:27092027
Diffusion, decolonializing, and participatory action research.
Woodward, William R; Hetley, Richard S
2007-03-01
Miki Takasuna describes knowledge transfer between elite communities of scientists, a process by which ideas become structurally transformed in the host culture. By contrast, a process that we have termed knowledge transfer by deelitization occurs when (a) participatory action researchers work with a community to identify a problem involving oppression or exploitation. Then (b) community members suggest solutions and acquire the tools of analysis and action to pursue social actions. (c) Disadvantaged persons thereby become more aware of their own abilities and resources, and persons with special expertise become more effective. (d) Rather than detachment and value neutrality, this joint process involves advocacy and structural transformation. In the examples of participatory action research documented here, Third World social scientists collaborated with indigenous populations to solve problems of literacy, community-building, land ownership, and political voice. Western social scientists, inspired by these non-Western scientists, then joined in promoting PAR both in the Third World and in Europe and the Americas, e.g., adapting it for solving problems of people with disabilities or disenfranchised women. Emancipatory goals such as these may even help North American psychologists to break free of some methodological chains and to bring about social and political change.
A variational Bayes spatiotemporal model for electromagnetic brain mapping.
Nathoo, F S; Babul, A; Moiseev, A; Virji-Babul, N; Beg, M F
2014-03-01
In this article, we present a new variational Bayes approach for solving the neuroelectromagnetic inverse problem arising in studies involving electroencephalography (EEG) and magnetoencephalography (MEG). This high-dimensional spatiotemporal estimation problem involves the recovery of time-varying neural activity at a large number of locations within the brain, from electromagnetic signals recorded at a relatively small number of external locations on or near the scalp. Framing this problem within the context of spatial variable selection for an underdetermined functional linear model, we propose a spatial mixture formulation where the profile of electrical activity within the brain is represented through location-specific spike-and-slab priors based on a spatial logistic specification. The prior specification accommodates spatial clustering in brain activation, while also allowing for the inclusion of auxiliary information derived from alternative imaging modalities, such as functional magnetic resonance imaging (fMRI). We develop a variational Bayes approach for computing estimates of neural source activity, and incorporate a nonparametric bootstrap for interval estimation. The proposed methodology is compared with several alternative approaches through simulation studies, and is applied to the analysis of a multimodal neuroimaging study examining the neural response to face perception using EEG, MEG, and fMRI. © 2013, The International Biometric Society.
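In generic notation (the paper's exact parameterization may differ), a spatial spike-and-slab prior with a logistic specification takes the form

\[
p(s_i \mid \gamma_i) = \gamma_i\,\mathcal{N}(s_i \mid 0, \sigma^2) + (1 - \gamma_i)\,\delta_0(s_i),
\qquad
\Pr(\gamma_i = 1) = \mathrm{logit}^{-1}\!\big(\alpha_0 + \mathbf{w}^{\top}\mathbf{z}_i\big),
\]

where \(s_i\) is the source amplitude at brain location \(i\), \(\delta_0\) is a point mass at zero (the "spike"), and \(\mathbf{z}_i\) collects spatial covariates (e.g., fMRI-derived activation measures) that shift the prior activation probability, accommodating spatial clustering.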
Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J
2008-04-01
Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time-, cost- and labor-intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates the signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple-response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in the biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology have been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
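The signal-to-noise ratios at the centre of the Taguchi approach reduce to one-line formulas; the sketch below implements the standard larger-is-better and smaller-is-better forms (the replicate yields are invented for illustration):

```python
import numpy as np

# Taguchi signal-to-noise ratios computed over replicated runs y of one
# orthogonal-array design row.
def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y**2))

# e.g., enzyme yields from three replicates of one L9 orthogonal-array run
print(sn_larger_is_better([82.0, 85.5, 80.1]))
```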
Shukla, Nagesh; Keast, John E; Ceglarek, Darek
2014-10-01
The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovery of the relationships among the extracted key features, and (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Hitchings, Julia E.; Spoth, Richard L.
2010-01-01
Conduct problems are strong positive predictors of substance use and problem substance use among teens, whereas predictive associations of depressed mood with these outcomes are mixed. Conduct problems and depressed mood often co-occur, and such co-occurrence may heighten risk for negative outcomes. Thus, this study examined the interaction of conduct problems and depressed mood at age 11 in relation to substance use and problem use at age 18, and possible mediation through peer substance use at age 16. Analyses of multirater longitudinal data collected from 429 rural youths (222 girls) and their families were conducted using a methodology for testing latent variable interactions. The link between the conduct problems × depressed mood interaction and adolescent substance use was negative and statistically significant. Unexpectedly, positive associations of conduct problems with substance use were stronger at lower levels of depressed mood. A significant negative interaction in relation to peer substance use also was observed, and the estimated indirect effect of the interaction on adolescent use through peer use as a mediator was statistically significant. Findings illustrate the complexity of multiproblem youth. PMID:18455886
Bringing the Unidata IDV to the Cloud
NASA Astrophysics Data System (ADS)
Fisher, W. I.; Oxelson Ganter, J.
2015-12-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given its inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who depend upon it and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of cloud computing: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little to no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.
NASA Astrophysics Data System (ADS)
Greca, Ileana M.
2016-03-01
Several international reports promote the use of the inquiry teaching methodology for improvements in science education at elementary school. Nevertheless, research indicates that pre-service elementary teachers have insufficient experience with this methodology and when they try to implement it, the theory they learnt in their university education clashes with the classroom practice they observe, a problem that has also been noted with other innovative methodologies. So, it appears essential for pre-service teachers to conduct supportive reflective practice during their education to integrate theory and practice, which various studies suggest is not usually done. Our study shows how opening up a third discursive space can assist this supportive reflective practice. The third discursive space appears when pre-service teachers are involved in specific activities that allow them to contrast the discourses of theoretical knowledge taught at university with practical knowledge arising from their ideas on science and science teaching and their observations during classroom practice. The case study of three pre-service teachers shows that this strategy was fundamental in helping them to integrate theory and practice, resulting in a better understanding of the inquiry methodology and its application in the classroom.
Eco-analytical Methodology in Environmental Problems Monitoring
NASA Astrophysics Data System (ADS)
Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.
2017-01-01
Among the problems common to all mankind, whose solutions influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists develops quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by environmental problems for science is the formation of eco-analytical reasoning and the monitoring of global problems common to all humanity. The aim is thus to find the optimal trajectory of industrial development, one that prevents irreversible problems in the biosphere that could halt the progress of civilization.
High-performance parallel analysis of coupled problems for aircraft propulsion
NASA Technical Reports Server (NTRS)
Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.
1994-01-01
This research program deals with the application of high-performance computing methods to the analysis of complete jet engines. We initiated this program by applying the two-dimensional parallel aeroelastic codes to the interior gas flow problem of a bypass jet engine. The fluid mesh generation, domain decomposition, and solution capabilities were successfully tested. We then focused attention on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion that results from these structural displacements. This is treated by a new arbitrary Lagrangian-Eulerian (ALE) technique that models the fluid mesh motion as that of a fictitious mass-spring network. New partitioned analysis procedures to treat this coupled three-component problem are developed. These procedures involve delayed corrections and subcycling. Preliminary results on stability, accuracy, and MPP computational efficiency are reported.
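The fictitious mass-spring idea for mesh motion can be sketched in one dimension: interior fluid-mesh nodes relax toward the equilibrium of linear springs whose stiffness is, by the usual convention, inversely proportional to the initial edge length. Everything below is a toy illustration, not the program's ALE implementation:

```python
import numpy as np

# 1-D chain of fluid-mesh nodes between a fixed far-field end and a
# structural boundary that has just displaced; interior nodes relax by
# Jacobi sweeps toward the equilibrium of the fictitious springs.
x0 = np.linspace(0.0, 1.0, 11)       # undeformed mesh coordinates
k = 1.0 / np.diff(x0)                # stiffness ~ 1/edge length (usual choice)
x = x0.copy()
x[-1] += 0.05                        # boundary motion imposed by the structure

for sweep in range(500):
    x_new = x.copy()
    for i in range(1, len(x) - 1):   # balance the two adjacent springs
        x_new[i] = (k[i - 1] * x[i - 1] + k[i] * x[i + 1]) / (k[i - 1] + k[i])
    x = x_new
print(np.round(x - x0, 4))           # displacement diffuses smoothly inward
```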
Transient responses' optimization by means of set-based multi-objective evolution
NASA Astrophysics Data System (ADS)
Avigad, Gideon; Eisenstadt, Erella; Goldvard, Alex; Salomon, Shaul
2012-04-01
In this article, a novel solution to multi-objective problems involving the optimization of transient responses is suggested. It is claimed that the common approach of treating such problems by introducing auxiliary objectives overlooks tradeoffs that should be presented to the decision makers. This means that, if at some time during the responses one of the responses is optimal, it should not be overlooked. An evolutionary multi-objective algorithm is suggested in order to search for these optimal solutions. For this purpose, state-wise domination is utilized, and a new crowding measure for ordered sets is suggested. The approach is tested on both artificial and real-life problems in order to explain the methodology and demonstrate its applicability and importance. The results indicate that, from an engineering point of view, the approach possesses several advantages over existing approaches. Moreover, the applications highlight the importance of set-based evolution.
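One plausible reading of the state-wise domination used in the evolutionary search can be sketched directly; the two synthetic responses and the two objectives below are illustrative, not the article's test problems:

```python
import numpy as np

def statewise_dominates(A, B):
    """A, B: arrays of shape (n_times, n_objectives), both minimized.
    A state-wise dominates B if A is no worse in every objective at every
    sampled instant and strictly better somewhere."""
    A, B = np.asarray(A), np.asarray(B)
    return bool(np.all(A <= B) and np.any(A < B))

# Two transient responses sampled at the same instants, two objectives
# (e.g., tracking error and control effort).
t = np.linspace(0, 1, 50)
A = np.column_stack([np.exp(-5 * t), 0.2 * np.ones_like(t)])
B = np.column_stack([np.exp(-3 * t), 0.2 * np.ones_like(t)])
print(statewise_dominates(A, B))   # True: A's error decays at least as fast everywhere
```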
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1988-06-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 58 refs., 15 figs., 4 tabs.
New Approaches to HSCT Multidisciplinary Design and Optimization
NASA Technical Reports Server (NTRS)
Schrage, Daniel P.; Craig, James I.; Fulton, Robert E.; Mistree, Farrokh
1999-01-01
New approaches to MDO have been developed and demonstrated during this project on a particularly challenging aeronautics problem: HSCT aeroelastic wing design. Tackling this problem required the integration of resources and collaboration from three Georgia Tech laboratories: ASDL, SDL, and PPRL, along with close coordination and participation from industry. Its success can also be attributed to the close interaction and involvement of fellows from the NASA Multidisciplinary Analysis and Optimization (MAO) program, which was running in parallel and provided additional resources to work the very complex, multidisciplinary problem, along with the methods being developed. The development of the Integrated Design Engineering Simulator (IDES) and its initial demonstration is a necessary first step in transitioning the methods and tools developed to larger industrial-sized problems of interest. It also provides a framework for the implementation and demonstration of the methodology. Attachment: Appendix A - List of publications. Appendix B - Year 1 report. Appendix C - Year 2 report. Appendix D - Year 3 report. Appendix E - accompanying CDROM.
Cihan, Abdullah; Birkholzer, Jens; Bianchi, Marco
2014-12-31
Large-scale pressure increases resulting from carbon dioxide (CO2) injection in the subsurface can potentially impact caprock integrity, induce reactivation of critically stressed faults, and drive CO2 or brine through conductive features into shallow groundwater. Pressure management involving the extraction of native fluids from storage formations can be used to minimize pressure increases while maximizing CO2 storage. However, brine extraction requires pumping, transportation, possibly treatment, and disposal of substantial volumes of extracted brackish or saline water, all of which can be technically challenging and expensive. This paper describes a constrained differential evolution (CDE) algorithm for optimal well placement and injection/extraction control with the goal of minimizing brine extraction while achieving predefined pressure constraints. The CDE methodology was tested on a simple optimization problem whose solution can be partially obtained with a gradient-based optimization methodology. The CDE successfully estimated the true global optimum for both the extraction well location and the extraction rate needed for the test problem. A more complex example application of the developed strategy was also presented for a hypothetical CO2 storage scenario in a heterogeneous reservoir containing a critically stressed fault near an injection zone. Through the CDE optimization algorithm coupled to a numerical vertically-averaged reservoir model, we successfully estimated optimal rates and locations for CO2 injection and brine extraction wells while simultaneously satisfying multiple pressure buildup constraints to avoid fault activation and caprock fracturing. The study shows that the CDE methodology is also a very promising tool for solving other optimization problems related to GCS, such as reducing the 'Area of Review', designing monitoring, reducing the risk of leakage, and increasing storage capacity and trapping.
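As background, a minimal constrained differential evolution loop is sketched below using a feasibility-first selection rule; the exact CDE variant, constraint handling, and reservoir-model coupling used in the study may differ, and the objective and constraint functions here are placeholders.

```python
import numpy as np

def cde(obj, cons, lo, hi, pop_size=40, F=0.7, CR=0.9, gens=200, seed=0):
    """Minimal constrained differential evolution (rand/1/bin) sketch with a
    feasibility-first selection rule; the CDE variant in the study may differ.
    obj(x) -> cost to minimise; cons(x) -> violations (values <= 0 feasible)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    def fitness(x):
        violation = np.maximum(cons(x), 0.0).sum()
        return (violation, obj(x))            # compare violation first, then cost
    fit = [fitness(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]  # i not excluded, for brevity
            mask = rng.random(dim) < CR
            trial = np.clip(np.where(mask, a + F * (b - c), pop[i]), lo, hi)
            f_trial = fitness(trial)
            if f_trial < fit[i]:              # tuples compare lexicographically
                pop[i], fit[i] = trial, f_trial
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```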
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; and (ii) studied the application of these decision-making models and methodologies to practical problems.
[The ethical reflection approach in decision-making processes in health institutes].
Gruat, Renaud
2015-12-01
Except in the specific case of end-of-life care, the law says nothing about the way in which health professionals must carry out ethical reflection regarding the treatment of their patients. A problem-solving methodology called the "ethical reflection approach", carried out over several stages, can be used. The decision-making process involves the whole team and draws on the ability of each caregiver to put forward a reasoned argument in the interest of the patient. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
NASA Technical Reports Server (NTRS)
Beggs, John H.
2000-01-01
The upwind leapfrog or Linear Bicharacteristic Scheme (LBS) has previously been extended to treat lossy dielectric and magnetic materials. This paper examines different methodologies for treatment of the electric loss term in the Linear Bicharacteristic Scheme for computational electromagnetics. Several different treatments of the electric loss term using the LBS are explored and compared on one-dimensional model problems involving reflection from lossy dielectric materials on both uniform and nonuniform grids. Results using these LBS implementations are also compared with the FDTD method for convenience.
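For context, the standard FDTD treatment of the electric loss term is a semi-implicit time average of sigma*E, which is the comparison baseline named above. The sketch below shows that reference treatment on a 1D grid; grid sizes, material values, and the source are illustrative, and the LBS updates themselves are not reproduced.

```python
import numpy as np

def fdtd_1d_lossy(nz=200, nt=500, dz=1e-3, eps=8.854e-12, mu=4e-7*np.pi):
    """1D FDTD with the semi-implicit (time-averaged) electric loss term.
    A lossy half-space with an assumed conductivity occupies z > nz/2."""
    sigma = np.zeros(nz)
    sigma[nz // 2:] = 0.05            # illustrative conductivity, S/m
    dt = dz / (2 * 3e8)               # Courant-stable time step
    Ey = np.zeros(nz)
    Hx = np.zeros(nz - 1)
    ca = (1 - sigma * dt / (2 * eps)) / (1 + sigma * dt / (2 * eps))
    cb = (dt / (eps * dz)) / (1 + sigma * dt / (2 * eps))
    for n in range(nt):
        Hx += dt / (mu * dz) * (Ey[1:] - Ey[:-1])
        Ey[1:-1] = ca[1:-1] * Ey[1:-1] + cb[1:-1] * (Hx[1:] - Hx[:-1])
        Ey[10] += np.exp(-((n - 60) / 15.0) ** 2)   # soft Gaussian source
    return Ey
```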
Hierarchical Parallelism in Finite Difference Analysis of Heat Conduction
NASA Technical Reports Server (NTRS)
Padovan, Joseph; Krishna, Lala; Gute, Douglas
1997-01-01
Based on the concept of hierarchical parallelism, this research effort resulted in highly efficient parallel solution strategies for very large scale heat conduction problems. Overall, the method of hierarchical parallelism involves the partitioning of thermal models into several substructured levels wherein an optimal balance among the various associated bandwidths is achieved. The details are described in this report, which is organized into two parts. Part 1 describes the parallel modelling methodology and associated multilevel direct, iterative and mixed solution schemes. Part 2 establishes both the formal and computational properties of the scheme.
NASA Astrophysics Data System (ADS)
Kamat, S. R.; Zula, N. E. N. Md; Rayme, N. S.; Shamsuddin, S.; Husain, K.
2017-06-01
The warehouse is an important entity in manufacturing organizations. Warehouse work usually involves activities associated with ergonomic risk factors, including repetitive and heavy lifting. Aerospace manufacturing workers are prone to musculoskeletal disorder (MSD) problems because of these manual handling activities, and questionnaire responses indicate that workers experience discomfort during manual handling work. Thus, the objectives of this study are to investigate the body postures of workers performing the repetitive and heavy lifting activities that cause MSD problems, to analyze the associated levels of discomfort, and to suggest proper body postures and alternatives to reduce MSD-related problems. The methodology of this study involves interviews, questionnaire distribution, anthropometry measurements, RULA (Rapid Upper Limb Assessment) scoring with CATIA V5 RULA analysis, and the NIOSH lifting index (LI) and recommended weight limit (RWL). Ten workers were selected for the pilot study, and all workers in the warehouse department were involved in the anthropometry measurements. In the first pilot study, the RULA assessment in CATIA V5 gave the highest possible score of 7 for all postures, whereas the score after improvement of the working posture was very low; the weights of the materials handled were found to exceed recommendations. To reduce the risk of MSD through improvement of working posture, the weight limit is also calculated in order to obtain an RWL for each worker. Therefore, proposing a guideline for aerospace workers involved in repetitive movement and excessive lifting will help reduce the risk of developing MSD.
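The NIOSH recommended weight limit and lifting index mentioned above follow a published formula, sketched here in its metric form. The example task parameters are invented, and the frequency and coupling multipliers (FM, CM) must be taken from the NIOSH tables.

```python
def niosh_rwl(H, V, D, A, FM=1.0, CM=1.0):
    """Revised NIOSH lifting equation, metric form.
    H: horizontal distance (cm), V: vertical height of hands (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees);
    FM, CM: frequency and coupling multipliers from the NIOSH tables."""
    LC = 23.0                       # load constant, kg
    HM = min(25.0 / H, 1.0)         # horizontal multiplier
    VM = 1 - 0.003 * abs(V - 75)    # vertical multiplier
    DM = 0.82 + 4.5 / D             # distance multiplier
    AM = 1 - 0.0032 * A             # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl):
    return load_kg / rwl            # LI > 1 suggests elevated MSD risk

# Example: 12 kg load held 40 cm out, lifted from 30 cm to 100 cm, 30 deg twist
rwl = niosh_rwl(H=40, V=30, D=70, A=30)
print(round(rwl, 1), round(lifting_index(12, rwl), 2))
```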
Prada, Sergio I.
2017-01-01
Background The Medicaid Drug Utilization Review (DUR) program is a 2-phase process conducted by Medicaid state agencies. The first phase is a prospective DUR and involves electronically monitoring prescription drug claims to identify prescription-related problems, such as therapeutic duplication, contraindications, incorrect dosage, or duration of treatment. The second phase is a retrospective DUR and involves ongoing and periodic examinations of claims data to identify patterns of fraud, abuse, underutilization, drug–drug interaction, or medically unnecessary care, implementing corrective actions when needed. The Centers for Medicare & Medicaid Services requires each state to measure prescription drug cost-savings generated from its DUR programs on an annual basis, but it provides no guidance or unified methodology for doing so. Objectives To describe and synthesize the methodologies used by states to measure cost-savings using their Medicaid retrospective DUR program in federal fiscal years 2014 and 2015. Method For each state, the cost-savings methodologies included in the Medicaid DUR 2014 and 2015 reports were downloaded from Medicaid's website. The reports were then reviewed and synthesized. Methods described by the states were classified according to research designs often described in evaluation textbooks. Discussion In 2014, the most often used prescription drugs cost-savings estimation methodology for the Medicaid retrospective DUR program was a simple pre-post intervention method, without a comparison group (ie, 12 states). In 2015, the most common methodology used was a pre-post intervention method, with a comparison group (ie, 14 states). Comparisons of savings attributed to the program among states are still unreliable, because of a lack of a common methodology available for measuring cost-savings. Conclusion There is great variation among states in the methods used to measure prescription drug utilization cost-savings. This analysis suggests that there is still room for improvement in terms of methodology transparency, which is important, because lack of transparency hinders states from learning from each other. Ultimately, the federal government needs to evaluate and improve its DUR program. PMID:29403573
Galvez, Gino; Turbin, Mitchel B.; Thielman, Emily J.; Istvan, Joseph A.; Andrews, Judy A.; Henry, James A.
2012-01-01
Objectives Measurement of outcomes has become increasingly important to assess the benefit of audiologic rehabilitation, including hearing aids, in adults. Data from questionnaires, however, are based on retrospective recall of events and experiences, and often can be inaccurate. Questionnaires also do not capture the daily variation that typically occurs in relevant events and experiences. Clinical researchers in a variety of fields have turned to a methodology known as ecological momentary assessment (EMA) to assess quotidian experiences associated with health problems. The objective of this study was to determine the feasibility of using EMA to obtain real-time responses from hearing aid users describing their experiences with challenging hearing situations. Design This study required three phases: (1) develop EMA methodology to assess hearing difficulties experienced by hearing aid users; (2) utilize focus groups to refine the methodology; and (3) test the methodology with 24 hearing aid users. Phase 3 participants carried a personal digital assistant (PDA) 12 hr per day for 2 wk. The PDA alerted participants to respond to questions four times a day. Each assessment started with a question to determine if a hearing problem was experienced since the last alert. If “yes,” then up to 23 questions (depending on contingent response branching) obtained details about the situation. If “no,” then up to 11 questions obtained information that would help to explain why hearing was not a problem. Each participant completed the Hearing Handicap Inventory for the Elderly (HHIE) both before and after the 2-wk EMA testing period to evaluate for “reactivity” (exacerbation of self-perceived hearing problems that could result from the repeated assessments). Results Participants responded to the alerts with a 77% compliance rate, providing a total of 991 completed momentary assessments (mean = 43.1 per participant). A substantial amount of data was obtained with the methodology. Notably, participants reported a “hearing problem situation since the last alert” 37.6% of the time (372 responses). The most common problem situation involved “face-to-face conversation” (53.8% of the time). The next most common problem situation was “telephone conversation” (17.2%) followed by “TV, radio, iPod, etc.” (15.3%), “environmental sounds” (9.7%), and “movies, lecture, etc.” (4.0%). Comparison of pre- and post-EMA mean HHIE scores revealed no significant difference (p>.05), indicating that reactivity did not occur for this group. It should be noted, however, that 37.5% of participants reported a greater sense of awareness regarding their hearing loss and use of hearing aids. Conclusions Results showed participants were compliant, gave positive feedback, and did not demonstrate reactivity based on pre- and post-HHIE scores. We conclude that EMA methodology is feasible with patients who use hearing aids and could potentially inform hearing healthcare (HHC) services. The next step is to develop and evaluate EMA protocols that provide detailed daily patient information to audiologists at each stage of HHC. The advantages of such an approach would be to obtain real-life outcome measures, and to determine within- and between-day variability in outcomes and associated factors. Such information currently is not available from patients who seek and use HHC services. PMID:22531573
Methodological issues in the study of violence against women
Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen
2007-01-01
The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113
NASA Astrophysics Data System (ADS)
Zendejas, Gerardo; Chiasson, Mike
This paper proposes and explores a method to enhance focal actors' abilities to enrol and control the many social and technical components interacting during the initiation, production, and diffusion of innovations. The reassembling and stabilizing of such components is the challenging goal of the focal actors involved in these processes. To address this possibility, a healthcare project involving the initiation, production, and diffusion of an IT-based innovation will be influenced by the researcher, using concepts from actor network theory (ANT) within an action research methodology (ARM). The experiences using this method, and the nature of enrolment and translation during its use, will highlight whether and how ANT can provide a problem-solving method to help assemble the social and technical actants involved in the diffusion of an innovation. Finally, the paper discusses the challenges and benefits of implementing such methods to attain widespread diffusion.
NASA Astrophysics Data System (ADS)
Arsenault, Louis-François; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.
2017-11-01
We present a supervised machine learning approach to the inversion of Fredholm integrals of the first kind as they arise, for example, in the analytic continuation problem of quantum many-body physics. The approach provides a natural regularization for the ill-conditioned inverse of the Fredholm kernel, as well as an efficient and stable treatment of constraints. The key observation is that the stability of the forward problem permits the construction of a large database of outputs for physically meaningful inputs. Applying machine learning to this database generates a regression function of controlled complexity, which returns approximate solutions for previously unseen inputs; the approximate solutions are then projected onto the subspace of functions satisfying relevant constraints. Under standard error metrics the method performs as well or better than the Maximum Entropy method for low input noise and is substantially more robust to increased input noise. We suggest that the methodology will be similarly effective for other problems involving a formally ill-conditioned inversion of an integral operator, provided that the forward problem can be efficiently solved.
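A stripped-down version of the pipeline described above, with an assumed Laplace-type kernel and synthetic Gaussian-mixture spectra standing in for the physical setup; the paper's kernel, grids, and regression model differ.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Forward problem: G = K A with a smooth, ill-conditioned Fredholm kernel.
w = np.linspace(0.01, 10, 80)                 # frequency grid
t = np.linspace(0.0, 5, 40)                   # "imaginary time" grid
K = np.exp(-np.outer(t, w)) * (w[1] - w[0])   # Laplace-type kernel (assumed)

def random_spectrum(rng, n_peaks=3):
    """Physically plausible input: nonnegative, normalised mixture of peaks."""
    A = np.zeros_like(w)
    for _ in range(n_peaks):
        c, s = rng.uniform(0.5, 8), rng.uniform(0.2, 1.0)
        A += rng.uniform(0.2, 1.0) * np.exp(-0.5 * ((w - c) / s) ** 2)
    return A / np.trapz(A, w)

rng = np.random.default_rng(0)
A_train = np.array([random_spectrum(rng) for _ in range(5000)])
G_train = A_train @ K.T + 1e-4 * rng.standard_normal((5000, len(t)))

model = Ridge(alpha=1e-3).fit(G_train, A_train)   # regularised inverse map

A_hat = model.predict(G_train[:1])                # approximate solution
A_hat = np.clip(A_hat, 0, None)                   # project onto nonnegativity
A_hat /= np.trapz(A_hat[0], w)                    # re-impose normalisation
```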
Tierney, Edel; McEvoy, Rachel; O'Reilly-de Brún, Mary; de Brún, Tomas; Okonkwo, Ekaterina; Rooney, Michelle; Dowrick, Chris; Rogers, Anne; MacFarlane, Anne
2016-06-01
There have been recent important advances in conceptualizing and operationalizing involvement in health research and health-care service development. However, problems persist in the field that impact on the scope for meaningful involvement to become a routine - normalized - way of working in primary care. In this review, we focus on current practice to critically interrogate factors known to be relevant for normalization - definition, enrolment, enactment and appraisal. Ours was a multidisciplinary, interagency team, with community representation. We searched EBSCO host for papers from 2007 to 2011 and engaged in an iterative, reflexive approach to sampling, appraising and analysing the literature following the principles of a critical interpretive synthesis approach and using Normalization Process Theory. Twenty-six papers were chosen from 289 papers, as a purposeful sample of work that is reported as service user involvement in the field. Few papers provided a clear working definition of service user involvement. The dominant identified rationale for enrolling service users in primary care projects was linked with policy imperatives for co-governance and emancipatory ideals. The majority of methodologies employed were standard health services research methods that do not qualify as research with service users. This indicates a lack of congruence between the stated aims and methods. Most studies only reported positive outcomes, raising questions about the balance or completeness of the published appraisals. To improve normalization of meaningful involvement in primary care, it is necessary to encourage explicit reporting of definitions, methodological innovation to enhance co-governance and dissemination of research processes and findings. © 2014 The Authors Health Expectations Published by John Wiley & Sons Ltd.
Training effectiveness assessment: Methodological problems and issues
NASA Technical Reports Server (NTRS)
Cross, Kenneth D.
1992-01-01
The U.S. military uses a large number of simulators to train and sustain the flying skills of helicopter pilots. Despite the enormous resources required to purchase, maintain, and use those simulators, little effort has been expended in assessing their training effectiveness. One reason for this is the lack of an evaluation methodology that yields comprehensive and valid data at a practical cost. Some of the methodological problems and issues that arise in assessing simulator training effectiveness, as well as problems with the classical transfer-of-learning paradigm, are discussed.
Ethical and Legal Implications of the Methodological Crisis in Neuroimaging.
Kellmeyer, Philipp
2017-10-01
Currently, many scientific fields such as psychology or biomedicine face a methodological crisis concerning the reproducibility, replicability, and validity of their research. In neuroimaging, similar methodological concerns have taken hold of the field, and researchers are working frantically toward finding solutions for the methodological problems specific to neuroimaging. This article examines some ethical and legal implications of this methodological crisis in neuroimaging. With respect to ethical challenges, the article discusses the impact of flawed methods in neuroimaging research in cognitive and clinical neuroscience, particularly with respect to faulty brain-based models of human cognition, behavior, and personality. Specifically examined is whether such faulty models, when they are applied to neurological or psychiatric diseases, could put patients at risk, and whether this places special obligations on researchers using neuroimaging. In the legal domain, the actual use of neuroimaging as evidence in United States courtrooms is surveyed, followed by an examination of ways that the methodological problems may create challenges for the criminal justice system. Finally, the article reviews and promotes some promising ideas and initiatives from within the neuroimaging community for addressing the methodological problems.
Mexico City Air Quality Research Initiative; Volume 5, Strategic evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-03-01
Members of the Task III (Strategic Evaluation) team were responsible for the development of a methodology to evaluate policies designed to alleviate air pollution in Mexico City. This methodology utilizes information from various reports that examined ways to reduce pollutant emissions, results from models that calculate the improvement in air quality due to a reduction in pollutant emissions, and the opinions of experts as to the requirements and trade-offs that are involved in developing a program to address the air pollution problem in Mexico City. The methodology combines these data to produce comparisons between different approaches to improving Mexico City's air quality. These comparisons take into account not only objective factors such as the air quality improvement or cost of the different approaches, but also subjective factors such as public acceptance or political attractiveness of the different approaches. The end result of the process is a ranking of the different approaches and, more importantly, the process provides insights into the implications of implementing a particular approach or policy.
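A toy version of such a combination step is a weighted-sum ranking over objective and subjective criteria; the strategies, criteria, scores, and weights below are entirely hypothetical and not taken from the report.

```python
# Hypothetical 0-10 scores on objective and subjective criteria.
strategies = {
    "fuel reformulation":  {"air_quality": 7, "cost": 4, "acceptance": 8},
    "vehicle retrofit":    {"air_quality": 6, "cost": 5, "acceptance": 6},
    "demand management":   {"air_quality": 5, "cost": 8, "acceptance": 3},
}
weights = {"air_quality": 0.5, "cost": 0.3, "acceptance": 0.2}  # expert-elicited

ranked = sorted(strategies,
                key=lambda s: sum(weights[c] * strategies[s][c] for c in weights),
                reverse=True)
print(ranked)   # highest weighted score first
```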
VARIABLE SELECTION FOR REGRESSION MODELS WITH MISSING DATA
Garcia, Ramon I.; Ibrahim, Joseph G.; Zhu, Hongtu
2009-01-01
We consider the variable selection problem for a class of statistical models with missing data, including missing covariate and/or response data. We investigate the smoothly clipped absolute deviation penalty (SCAD) and adaptive LASSO and propose a unified model selection and estimation procedure for use in the presence of missing data. We develop a computationally attractive algorithm for simultaneously optimizing the penalized likelihood function and estimating the penalty parameters. Particularly, we propose to use a model selection criterion, called the ICQ statistic, for selecting the penalty parameters. We show that the variable selection procedure based on ICQ automatically and consistently selects the important covariates and leads to efficient estimates with oracle properties. The methodology is very general and can be applied to numerous situations involving missing data, from covariates missing at random in arbitrary regression models to nonignorably missing longitudinal responses and/or covariates. Simulations are given to demonstrate the methodology and examine the finite sample performance of the variable selection procedures. Melanoma data from a cancer clinical trial is presented to illustrate the proposed methodology. PMID:20336190
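For complete data, the adaptive LASSO piece of this procedure can be sketched in a few lines via the standard column-rescaling trick; the missing-data machinery and the ICQ-based penalty selection are not reproduced here, and the pilot estimator assumes n > p.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

def adaptive_lasso(X, y, lam=0.1, gamma=1.0):
    """Adaptive LASSO by rescaling: weight each column by a pilot estimate
    |beta_j|^gamma, run ordinary LASSO on the weighted design, then unscale.
    Columns with large pilot coefficients are penalised less."""
    pilot = LinearRegression().fit(X, y).coef_
    wts = np.abs(pilot) ** gamma + 1e-8     # avoid division-by-zero weights
    Xw = X * wts                            # column scaling == penalty weighting
    fit = Lasso(alpha=lam).fit(Xw, y)
    return fit.coef_ * wts                  # coefficients on the original scale
```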
Flutter, Postflutter, and Control of a Supersonic Wing Section
NASA Technical Reports Server (NTRS)
Marzocca, Piergiovanni; Librescu, Liviu; Silva, Walter A.
2002-01-01
A number of issues related to the flutter and postflutter of two-dimensional supersonic lifting surfaces are addressed. Among them are 1) an investigation of the implications of the nonlinear unsteady aerodynamics and structural nonlinearities for the stable/unstable character of the limit cycle and 2) a study of the implications of incorporating a control capability for both the flutter boundary and the postflutter behavior. To this end, a powerful methodology based on the Lyapunov first quantity is implemented. Such a treatment of the problem enables one to get a better understanding of the various factors involved in the nonlinear aeroelastic problem, including the stable and unstable limit cycle. In addition, it constitutes a first step toward a more general investigation of nonlinear aeroelastic phenomena of three-dimensional lifting surfaces.
Stochastic approach for radionuclides quantification
NASA Astrophysics Data System (ADS)
Clement, A.; Saurel, N.; Perrin, G.
2018-01-01
Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods using empirical calibration with a standard, which quantify the activity of nuclear materials by determining a calibration coefficient, are useless on non-reproducible, complex and singular nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene from the few available data, such as geometry or composition: density, material, screens, geometric shape, matrix composition, and matrix and source distribution. Some of these are strongly dependent on package data knowledge and operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. This method combines a global stochastic approach, which uses, among others, surrogate models available to simulate the gamma attenuation behaviour; a Bayesian approach, which considers conditional probability densities of problem inputs; and Markov Chain Monte Carlo (MCMC) algorithms, which solve the inverse problem, given the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standard sources with different kinds of matrices, compositions, and source configurations in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
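A minimal illustration of the Bayesian/MCMC ingredient, assuming a scalar activity, a Poisson counting likelihood, and a placeholder surrogate model for the gamma attenuation; none of these choices are the CEA's actual models.

```python
import numpy as np

def metropolis_activity(counts, forward, n_iter=20000, step=0.1, seed=1):
    """Random-walk Metropolis sketch for the posterior of a source activity
    given a measured gamma count. forward(a) maps activity to expected
    counts (a stand-in for the attenuation surrogate model)."""
    rng = np.random.default_rng(seed)
    def log_post(a):
        if a <= 0:
            return -np.inf                    # positivity prior
        lam = forward(a)
        return counts * np.log(lam) - lam     # Poisson log-likelihood (+ const)
    a, chain = 1.0, []
    for _ in range(n_iter):
        prop = a + step * rng.standard_normal()   # symmetric proposal
        if np.log(rng.random()) < log_post(prop) - log_post(a):
            a = prop
        chain.append(a)
    return np.array(chain)                    # summarise after burn-in

# e.g.: chain = metropolis_activity(480, lambda a: 100.0 * a * np.exp(-0.5))
```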
A methodology for hard/soft information fusion in the condition monitoring of aircraft
NASA Astrophysics Data System (ADS)
Bernardo, Joseph T.
2013-05-01
Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles. The study performs a critical assessment of concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.
NASA Technical Reports Server (NTRS)
Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives and a new methodology of development, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
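The virtual-pulse scheme itself is not reproduced here; as a simple point of reference for the kind of test problem used, the sketch below integrates a softening/hardening spring model with the standard explicit central-difference method (all parameters are illustrative).

```python
import numpy as np

def central_difference(m, k1, k3, x0, v0, dt, steps):
    """Explicit central-difference integration of m*x'' + k1*x + k3*x**3 = 0,
    a hardening (k3 > 0) or softening (k3 < 0) spring model."""
    x = np.empty(steps + 1)
    x[0] = x0
    a0 = -(k1 * x0 + k3 * x0**3) / m
    x[1] = x0 + dt * v0 + 0.5 * dt**2 * a0      # startup step
    for n in range(1, steps):
        a = -(k1 * x[n] + k3 * x[n]**3) / m
        x[n + 1] = 2 * x[n] - x[n - 1] + dt**2 * a
    return x

# e.g.: trace = central_difference(m=1.0, k1=100.0, k3=10.0,
#                                  x0=0.1, v0=0.0, dt=1e-3, steps=5000)
```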
Neurobiology of suicidal behaviour.
Pjevac, Milica; Pregelj, Peter
2012-10-01
It is known that suicidal behaviour has multiple causes. While triggers can mainly be attributed to environmental factors, predisposition can be associated with early stressors such as childhood adversities on the one hand and genetic predisposition on the other. No convincing animal model of suicide has been produced to date. The study of endophenotypes has been proposed as a good strategy to overcome the methodological difficulties. However, research on suicidal behaviours using endophenotypes entails important methodological problems. Furthermore, the serotoninergic system has been studied in patients with suicidal behaviour primarily because of the involvement of serotonin in impulsive-aggressive behaviour, which has been shown to be a major risk factor for suicidal behaviour. Not only the level of neurotransmitters but also the regulation of neurotrophic factors could be impaired in suicide victims. Multiple lines of evidence, including studies of BDNF levels in blood cells and plasma of suicidal patients, postmortem brain studies in suicidal subjects with or without depression, and genetic association studies linking BDNF to suicide, suggest that suicidal behaviour may be associated with a decrease in BDNF functioning. It seems that specific gene variants regulating the serotoninergic system and other neuronal systems involved in the stress response in particular are associated with suicidal behaviour. Most genetic studies on suicidal behaviour have considered a small set of functional polymorphisms relevant mostly to monoaminergic neurotransmission. However, genes and epigenetic mechanisms involved in the regulation of other factors such as BDNF seem to be even more relevant for further research.
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
CNN based approach for activity recognition using a wrist-worn accelerometer.
Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R
2017-07-01
In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, conventional methods have been dominated by feature engineering, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework, which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data were collected from four different subjects using a single wrist-worn accelerometer sensor. The validation of the proposed model is done under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8%, as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machines.
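A minimal sketch of such a network, assuming 128-sample windows of tri-axial accelerometer data and three movement classes; the layer sizes and window length are assumptions, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class ActivityCNN(nn.Module):
    """Illustrative 1D CNN for 3-class movement recognition from windows
    of tri-axial wrist accelerometer data."""
    def __init__(self, n_classes=3, window=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # two pooling stages halve the window twice: window // 4 samples remain
        self.classifier = nn.Linear(32 * (window // 4), n_classes)

    def forward(self, x):                 # x: (batch, 3, window)
        z = self.features(x)
        return self.classifier(z.flatten(1))

model = ActivityCNN()
logits = model(torch.randn(8, 3, 128))    # 8 windows of 128 samples each
```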
Application and systems software in Ada: Development experiences
NASA Technical Reports Server (NTRS)
Kuschill, Jim
1986-01-01
In its most basic sense software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second more flexible and possibly even easier to use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized so as to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.
Zhu; Dale
2000-10-01
Regional resource use planning relies on key regional stakeholder groups using and having equitable access to appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use so that areas where DSSs could provide effective support can be identified, and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.
A CFD study of complex missile and store configurations in relative motion
NASA Technical Reports Server (NTRS)
Baysal, Oktay
1995-01-01
An investigation was conducted from May 16, 1990 to August 31, 1994 on the development of computational fluid dynamics (CFD) methodologies for complex missiles and the store separation problem. These flowfields involved multiple-component configurations, where at least one of the objects was engaged in relative motion. The two most important issues that had to be addressed were: (1) the unsteadiness of the flowfields (time-accurate and efficient CFD algorithms for the unsteady equations), and (2) the generation of grid systems which would permit multiple and moving bodies in the computational domain (dynamic domain decomposition). The study produced two competing and promising methodologies, and their proof-of-concept cases, which have been reported in the open literature: (1) Unsteady solutions on dynamic, overlapped grids, which may also be perceived as moving, locally-structured grids, and (2) Unsteady solutions on dynamic, unstructured grids.
Ethical issues in cancer screening and prevention.
Plutynski, Anya
2012-06-01
November 2009's announcement of the USPSTF's recommendations for screening for breast cancer raised a firestorm of objections. Chief among them were that the panel had insufficiently valued patients' lives or allowed cost considerations to influence recommendations. The publicity about the recommendations, however, often either simplified the actual content of the recommendations or bypassed significant methodological issues, which a philosophical examination of both the science behind screening recommendations and their import reveals. In this article, I discuss two of the leading ethical considerations at issue in screening recommendations: respect for patient autonomy and beneficence and then turn to the most significant methodological issues raised by cancer screening: the potential biases that may infect a trial of screening effectiveness, the problem of base rates in communicating risk, and the trade-offs involved in a judgment of screening effectiveness. These issues reach more broadly, into the use of "evidence-based" medicine generally, and have important implications for informed consent.
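On the base-rate problem mentioned above, a short worked example (with illustrative numbers, not figures from the article) shows why a positive screen in a low-prevalence population carries a much smaller probability of disease than the test's accuracy suggests.

```python
# Bayes' rule for a screening test: even a fairly accurate test yields
# mostly false positives when the condition is rare.
prevalence  = 0.005   # 0.5% of the screened population has the cancer
sensitivity = 0.90    # P(test positive | disease)
specificity = 0.91    # P(test negative | no disease)

p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_pos
print(f"P(disease | positive test) = {ppv:.1%}")   # about 4.8%
```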
Probabilistic self-organizing maps for continuous data.
Lopez-Rubio, Ezequiel
2010-10-01
The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
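For orientation, the classical SOM training skeleton that these probabilistic variants build on is sketched below; the grid size and learning schedules are arbitrary choices.

```python
import numpy as np

def train_som(data, grid=(10, 10), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Classical SOM training loop; probabilistic SOM variants add an
    explicit density model on top of this skeleton."""
    rng = np.random.default_rng(seed)
    h, w = grid
    d = data.shape[1]
    weights = rng.standard_normal((h, w, d))
    ii, jj = np.mgrid[0:h, 0:w]
    for t in range(iters):
        x = data[rng.integers(len(data))]
        dist = np.linalg.norm(weights - x, axis=2)
        bi, bj = np.unravel_index(np.argmin(dist), (h, w))   # best matching unit
        frac = t / iters
        lr = lr0 * (1 - frac)                                # decaying rate
        sigma = sigma0 * (1 - frac) + 0.5                    # shrinking radius
        nbr = np.exp(-((ii - bi)**2 + (jj - bj)**2) / (2 * sigma**2))
        weights += lr * nbr[..., None] * (x - weights)       # pull toward input
    return weights
```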
An LMI approach for the Integral Sliding Mode and H∞ State Feedback Control Problem
NASA Astrophysics Data System (ADS)
Bezzaoucha, Souad; Henry, David
2015-11-01
This paper deals with the state feedback control problem for linear uncertain systems subject to both matched and unmatched perturbations. The proposed control law is based on the Integral Sliding Mode Control (ISMC) approach to tackle matched perturbations, as well as the H∞ paradigm for robustness against unmatched perturbations. The proposed method parallels the work presented in [1], which addressed the same problem and proposed a solution involving an Algebraic Riccati Equation (ARE)-based formulation. The contribution of this paper is the establishment of a Linear Matrix Inequality (LMI)-based solution, which offers the possibility of considering other types of constraints such as 𝓓-stability constraints (pole assignment-like constraints). The proposed methodology is applied to a pilot three-tank system, and experimental results illustrate its feasibility. Note that real experiments have rarely been considered with SMC in the past, owing to the highly energetic behaviour of the control signal. It is important to point out that the paper does not aim at proposing an LMI formulation of an ARE. This has been done since 1971 [2] and is further discussed in [3], where the link between AREs and ARIs (algebraic Riccati inequalities) is established for the H∞ control problem. The main contribution of this paper is to establish the adequate LMI-based methodology (changes of matrix variables) so that the ARE corresponding to the particular structure of the mixed ISMC/H∞ design proposed in [1] can be re-formulated within the LMI paradigm.
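To make the LMI setting concrete, a minimal feasibility program of the same family (quadratic stability, not the paper's mixed ISMC/H∞ conditions) can be posed and solved with an off-the-shelf SDP layer; the system matrix here is an arbitrary example.

```python
import numpy as np
import cvxpy as cp

# Find P > 0 with A'P + PA < 0: the basic Lyapunov LMI.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])          # example stable system matrix
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6                            # strictness margin
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve(solver=cp.SCS)
print(prob.status, P.value)
```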
Examining gambling-related crime reports in the National Finnish Police Register.
Kuoppamäki, Sanna-Mari; Kääriäinen, Juha; Lind, Kalle
2014-12-01
The aim of this study is to examine the connection between gambling and criminal activity in the National Finnish Police Register. First, a method was created that enabled the search for gambling-related police reports in the National Finnish Police Register. The method is based on finding gambling-related police reports by using gambling-related headwords. Second, all police reports from 2011 that included any mention of gambling were read through (n = 2,233). Suspected gambling-related crimes (n = 737) were selected from these reports. These suspected gambling-related crimes were then described and sorted into six categories: suspected online-related crimes; suspected crimes related to lifestyle gaming; suspected crimes involving a gambler as the victim of a crime; criminal activity related to problem gambling; casino-connected crimes; and intimate partner violence resulting from gambling problems. This study, the first in Finland, generated information on the connection between gambling and criminal activity from the perspective of police reports. Moreover, the study highlights methodological issues involved in studying police reports.
[Methodological problems in the use of information technologies in physical education].
Martirosov, E G; Zaĭtseva, G A
2000-01-01
The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.
Expanding Simulations as a Means of Tactical Training with Multinational Partners
2017-06-09
The study examines this gap through DOTMLPF in combination with an assessment of two case studies involving higher echelon use of simulations.
Integrating Design and Manufacturing for a High Speed Civil Transport Wing
NASA Technical Reports Server (NTRS)
Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.
1994-01-01
The aerospace industry is currently addressing the problem of integrating design and manufacturing. Because of the difficulties associated with using conventional, procedural techniques and algorithms, it is the authors' belief that the only feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors propose a methodology for an aircraft producibility assessment, including a KBS, that addresses both procedural and heuristic aspects of integrating design and manufacturing of a High Speed Civil Transport (HSCT) wing. The HSCT was chosen as the focus of this investigation since it is a current NASA/aerospace industry initiative full of technological challenges involving many disciplines. The paper gives a brief background of selected previous supersonic transport studies followed by descriptions of key relevant design and manufacturing methodologies. Georgia Tech's Concurrent Engineering/Integrated Product and Process Development methodology is discussed with reference to this proposed conceptual producibility assessment. Evaluation criteria are presented that relate pertinent product and process parameters to overall product producibility. In addition, the authors' integration methodology and reasons for selecting a KBS to integrate design and manufacturing are presented in this paper. Finally, a proposed KBS is given, as well as statements of future work and overall investigation objectives.
What lies behind crop decisions? Coming to terms with revealing farmers' preferences
NASA Astrophysics Data System (ADS)
Gomez, C.; Gutierrez, C.; Pulido-Velazquez, M.; López Nicolás, A.
2016-12-01
The paper offers a fully fledged applied revealed-preference methodology to screen and represent farmers' choices as the solution of an optimal program involving trade-offs among the alternative welfare outcomes of crop decisions, such as profits, income security and ease of management. The recursive two-stage method is proposed as an alternative to cope with the methodological problems inherent in common-practice positive mathematical programming (PMP) methodologies. Differently from PMP, in the model proposed in this paper the non-linear costs that are required for both calibration and smooth adjustment are not at odds with the assumptions of linear Leontief technologies and fixed crop prices and input costs. The method frees the model from ad hoc assumptions about costs and thus recovers the potential of economic analysis as a means to understand the rationale behind observed and forecasted farmers' decisions, enhancing the potential of the model to support policy making in relevant domains such as agricultural policy, water management, risk management and climate change adaptation. After the introduction, where the methodological drawbacks and challenges are set out, section two presents the theoretical model, section three develops its empirical application and presents its implementation in a Spanish irrigation district, and section four concludes and makes suggestions for further research.
Larsen, Randy J; Kasimatis, Margaret; Frey, Kurt
1992-09-01
We examined the hypothesis that muscle contractions in the face influence subjective emotional experience. Previously, researchers have been critical of experiments designed to test this facial feedback hypothesis, particularly in terms of methodological problems that may lead to demand characteristics. In an effort to surmount these methodological problems Strack, Martin, and Stepper (1988) developed an experimental procedure whereby subjects were induced to contract facial muscles involved in the production of an emotional pattern, without being asked to actually simulate an emotion. Specifically, subjects were required to hold a pen in their teeth, which unobtrusively creates a contraction of the zygomaticus major muscles, the muscles involved in the production of a human smile. This manipulation minimises the likelihood that subjects are able to interpret their zygomaticus contractions as representing a particular emotion, thereby preventing subjects from determining the purpose of the experiment. Strack et al. (1988) found support for the facial feedback hypothesis applied to pleasant affect, in that subjects in the pen-in-teeth condition rated humorous cartoons as being funnier than subjects in the control condition (in which zygomaticus contractions were inhibited). The present study represents an extension of this nonobtrusive methodology to an investigation of the facial feedback of unpleasant affect. Consistent with the Strack et al. procedure, we wanted to have subjects furrow their brow without actually instructing them to do so and without asking them to produce any emotional facial pattern at all. This was achieved by attaching two golf tees to the subject's brow region (just above the inside corner of each eye) and then instructing them to touch the tips of the golf tees together as part of a "divided-attention" experiment. Touching the tips of the golf tees together could only be achieved by a contraction of the corrugator supercilii muscles, the muscles involved in the production of a sad emotional facial pattern. Subjects reported significantly more sadness in response to aversive photographs while touching the tips of the golf tees together than under conditions which inhibited corrugator contractions. These results provide evidence, using a new and unobtrusive manipulation, that facial feedback operates for unpleasant affect to a degree similar to that previously found for pleasant affect.
Spinal Cord Injury-Induced Dysautonomia via Plasticity in Paravertebral Sympathetic Postganglionic
2017-10-01
their near anatomical inaccessibility. We have solved the accessibility problem with a strategic methodological advance. We will determine the extent to which plasticity in paravertebral sympathetic postganglionic neurons contributes to spinal cord injury-induced dysautonomia.
Human Prenatal Effects: Methodological Problems and Some Suggested Solutions
ERIC Educational Resources Information Center
Copans, Stuart A.
1974-01-01
Briefly reviews the relevant literature on human prenatal effects, describes some of the possible designs for such studies; and discusses some of the methodological problem areas: sample choice, measurement of prenatal variables, monitoring of labor and delivery, and neonatal assessment. (CS)
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
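For reference, the plain global-best PSO loop that such hybrid methods start from is sketched below; the consensus mechanism and the Trust-Tech stability-region stages of the proposed methodology are not reproduced.

```python
import numpy as np

def pso(obj, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization (minimisation)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([obj(p) for p in x])
    g = pbest[np.argmin(pval)]                    # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([obj(p) for p in x])
        better = val < pval                       # update personal bests
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[np.argmin(pval)]
    return g, pval.min()
```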
Overcoming an obstacle in expanding a UMLS semantic type extent.
Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James
2012-02-01
This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. Copyright © 2011 Elsevier Inc. All rights reserved.
Transport mechanisms in Schottky diodes realized on GaN
NASA Astrophysics Data System (ADS)
Amor, Sarrah; Ahaitouf, Ali; Ahaitouf, Abdelaziz; Salvestrini, Jean Paul; Ougazzaden, Abdellah
2017-03-01
This work focuses on the conduction and transport mechanisms involved in devices based on gallium nitride (GaN) and its alloys. By considering all current conduction mechanisms, it is possible to understand these transport phenomena. Thanks to this methodology, the current-voltage characteristics of structures with unusual behaviour can be further understood and explained. The Schottky barrier height (SBH) is a complex problem since it depends on several parameters, such as the quality of the metal-semiconductor interface. This study is particularly interesting because solar cells are made on this material and their qualification is closely linked to their transport properties.
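As an illustration of how such transport analysis typically proceeds, the sketch below extracts an ideality factor and Schottky barrier height from a forward I-V curve under a pure thermionic-emission assumption; the diode area and Richardson constant are assumed example values, and real devices require series-resistance and multi-mechanism corrections.

```python
import numpy as np

def extract_diode_params(V, I, T=300.0, area=1e-3, A_star=26.4):
    """Extract the ideality factor n and barrier height phi_b (eV) from the
    linear region of a forward ln(I)-V characteristic, assuming pure
    thermionic emission I = Is*exp(qV/nkT). area in cm^2; A_star is the
    Richardson constant in A cm^-2 K^-2 (value often quoted for n-GaN).
    V, I: numpy arrays of forward-bias voltage (V) and current (A)."""
    q, k = 1.602e-19, 1.381e-23
    mask = V > 3 * k * T / q              # region where the "-1" term is negligible
    slope, lnIs = np.polyfit(V[mask], np.log(I[mask]), 1)
    n = q / (k * T * slope)               # ideality factor from the slope
    Is = np.exp(lnIs)                     # saturation current from the intercept
    phi_b = (k * T / q) * np.log(area * A_star * T**2 / Is)
    return n, phi_b
```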
NASA Technical Reports Server (NTRS)
Wild, Christian; Eckhardt, Dave
1987-01-01
The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.
NASA Technical Reports Server (NTRS)
Barr, B. G.; Martinko, E. A.
1976-01-01
Activities of the Kansas Applied Remote Sensing Program (KARS) designed to establish interactions on cooperative projects with decision makers in Kansas agencies in the development and application of remote sensing procedures are reported. Cooperative demonstration projects undertaken with several different agencies involved three principal areas of effort: Wildlife Habitat and Environmental Analysis; Urban and Regional Analysis; Agricultural and Rural Analysis. These projects were designed to concentrate remote sensing concepts and methodologies on existing agency problems to ensure the continued relevance of the program and maximize the possibility for immediate operational use. Completed projects are briefly discussed.
PACE team response shows a disregard for the principles of science.
Edwards, Jonathan
2017-08-01
The PACE trial of cognitive behavioural therapy and graded exercise therapy for chronic fatigue syndrome/myalgic encephalomyelitis has raised serious questions about research methodology. An editorial article by Geraghty gives a fair account of the problems involved, if anything understating the case. The response by White et al. fails to address the key design flaw, of an unblinded study with subjective outcome measures, apparently demonstrating a lack of understanding of basic trial design requirements. The failure of the academic community to recognise the weakness of trials of this type suggests that a major overhaul of quality control is needed.
E-therapy for mental health problems: a systematic review.
Postel, Marloes G; de Haan, Hein A; De Jong, Cor A J
2008-09-01
The widespread availability of the Internet offers opportunities for improving access to therapy for people with mental health problems. There is a seemingly infinite supply of Internet-based interventions available on the World Wide Web. The aim of the present study is to systematically assess the methodological quality of randomized controlled trials (RCTs) concerning e-therapy for mental health problems. Two reviewers independently assessed the methodological quality of the RCTs, based on a list of criteria for the methodological quality assessment as recommended by the Cochrane Back Review Group. The search yielded 14 papers that reported RCTs concerning e-therapy for mental-health problems. The methodological quality of studies included in this review was generally low. It is concluded that e-therapy may turn out to be an appropriate therapeutic entity, but the evidence needs to be more convincing. Recommendations are made concerning the method of reporting RCTs and the need to add some content items to an e-therapy study.
Insight with hands and things.
Vallée-Tourangeau, Frédéric; Steffensen, Sune Vork; Vallée-Tourangeau, Gaëlle; Sirota, Miroslav
2016-10-01
Two experiments examined whether different task ecologies influenced insight problem solving. The 17 animals problem was employed, a pure insight problem. Its initial formulation encourages the application of a direct arithmetic solution, but its solution requires the spatial arrangement of sets involving some degree of overlap. Participants were randomly allocated to either a tablet condition where they could use a stylus and an electronic tablet to sketch a solution or a model building condition where participants were given material with which to build enclosures and figurines. In both experiments, participants were much more likely to develop a working solution in the model building condition. The difference in performance elicited by different task ecologies was unrelated to individual differences in working memory, actively open-minded thinking, or need for cognition (Experiment 1), although individual differences in creativity were correlated with problem solving success in Experiment 2. The discussion focuses on the implications of these findings for the prevailing metatheoretical commitment to methodological individualism that places the individual as the ontological locus of cognition. Copyright © 2016 Elsevier B.V. All rights reserved.
Misticoni, G; Marchetti, F; D'Andrea, N
1994-01-01
Forty-one pediatricians agreed to record, on a very simple form, all cases of children affected by bronchial asthma seen in their clinics during October 1993. The data included basic information on the therapy prescribed, its duration, a judgement on the efficacy of symptom control, and the main problems encountered with the children and their families. 237 cases were reported (mean age 4.6 years, range 2 months-13 years). 80% of children were monitored by the pediatrician; 47% had allergic reactions. The main drug used for prophylaxis was ketotifen, a compound without documented efficacy; the main route of drug administration (especially during acute attacks) was oral rather than aerosol, pointing to problems in health education on practical skills. Indeed, the main problems encountered by doctors related to communication with patients and families. This survey also represents a research model for involving health care providers and for easily and quickly obtaining a useful, methodologically sound and interesting picture of everyday practice.
NASA Astrophysics Data System (ADS)
Setiawan, E. P.; Rosadi, D.
2017-01-01
Portfolio selection problems conventionally mean 'minimizing the risk, given a certain level of return' from some financial assets. This problem is frequently solved with quadratic or linear programming methods, depending on the risk measure used in the objective function. However, the solutions obtained by these methods are real numbers, which may cause problems in real applications because each asset usually has its minimum transaction lot. Classical approaches considering minimum transaction lots were developed based on linear mean absolute deviation (MAD), variance (as in Markowitz's model), and semi-variance as risk measures. In this paper we investigate portfolio selection with minimum transaction lots and conditional value at risk (CVaR) as the risk measure. The mean-CVaR methodology involves only the part of the tail of the distribution that contributes to high losses, which is preferable when working with non-symmetric return distributions. Solutions of this model can be found with genetic algorithm (GA) methods. We provide real examples using stocks from the Indonesian stock market.
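To make the approach concrete, the following is a minimal sketch of lot-constrained mean-CVaR portfolio selection of the kind the abstract describes; the scenario returns, prices, lot size, budget, and GA settings are all invented assumptions, not the authors' data or implementation.

```python
# Sketch: lot-constrained mean-CVaR portfolio selection with a simple GA.
import numpy as np

rng = np.random.default_rng(0)
n_assets, n_scen = 5, 1000
returns = rng.normal(0.001, 0.02, size=(n_scen, n_assets))  # scenario returns (assumed)
price = rng.uniform(10, 50, n_assets)        # price per share (assumed)
lot = 100                                    # minimum transaction lot, in shares
budget = 1_000_000.0
alpha = 0.95                                 # CVaR confidence level
target_mean = 0.0005                         # required mean portfolio return

def cvar(losses, alpha):
    """Empirical CVaR: mean of the worst (1 - alpha) fraction of losses."""
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def fitness(lots):
    cost = np.sum(lots * lot * price)
    if cost == 0 or cost > budget:
        return np.inf                        # infeasible: empty or over budget
    w = (lots * lot * price) / cost          # portfolio weights implied by lots
    port = returns @ w
    if port.mean() < target_mean:
        return np.inf                        # fails the return requirement
    return cvar(-port, alpha)                # minimize CVaR of losses

# Simple GA over integer lot counts: elitism, uniform crossover, mutation.
pop = rng.integers(0, 20, size=(60, n_assets))
for gen in range(200):
    fit = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(fit)[:10]]
    children = []
    while len(children) < len(pop) - len(elite):
        a, b = elite[rng.integers(0, 10, 2)]
        child = np.where(rng.random(n_assets) < 0.5, a, b)
        if rng.random() < 0.3:               # mutation: nudge one lot count
            j = rng.integers(n_assets)
            child[j] = max(0, child[j] + rng.integers(-2, 3))
        children.append(child)
    pop = np.vstack([elite, children])

best = min(pop, key=fitness)
print("lots per asset:", best, "CVaR:", fitness(best))
```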
Fitting methods to paradigms: are ergonomics methods fit for systems thinking?
Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A
2017-02-01
The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.
Nadeem, Naila; Khawaja, Ranish Deedar Ali; Beg, Madiha; Naeem, Muhammad; Majid, Zain
2013-01-01
Background: In an integrated method of education, medical students are introduced to radiology in their preclinical years. However, no study has been conducted in Pakistan to demonstrate an academic framework of medical radiology education at an undergraduate level. Therefore, we aimed to document and compare the current level of teaching duties, teaching methodologies, and teaching rewards among radiologists and residents in private and public teaching hospitals in Karachi, Pakistan. Methods: A survey was conducted among 121 radiologists and residents in two private and two public teaching hospitals in Karachi, Pakistan. Radiologists who were nationally registered with the Pakistan Medical and Dental Council either part-time or full-time were included. Radiology residents and fellows who were nationally registered with the Pakistan Medical and Dental Council were also included. Self-administered questionnaires addressing teaching duties, methods, and rewards were collected from 95 participants. Results: The overall response rate was 78.51% (95/121). All of the radiologists were involved in teaching residents and medical students, but only 36% reported formal training in teaching skills. Although most of the respondents (76%) agreed that medical students appeared enthusiastic about learning radiology, the time spent on teaching medical students was less than five hours per week annually (82%). Only 37% of the respondents preferred dedicated clerkships over distributed clerkships (41%). The most common preferred teaching methodology overall was one-on-one interaction. Tutorials, teaching rounds, and problem-based learning sessions were less favored by radiologists than by residents. Teaching via radiology films (86%) was the most frequent mode of instruction. Salary (59%) was the most commonly cited teaching reward. The majority of respondents (88%) were not satisfied with their current level of teaching rewards. Conclusion: All radiologists and residents working in an academic radiology department are involved in teaching undergraduate students at multiple levels. The most valued teaching methodology involves use of images, with one-on-one interaction between the trainer and trainee. The monetary reward for teaching is inbuilt into the salary. The methodology adopted for teaching purposes was significantly different between respondents from private hospitals and those from public teaching hospitals. Because of low satisfaction among the respondents, efforts should be made to provide satisfying teaching rewards. PMID:23745098
Energy-efficient container handling using hybrid model predictive control
NASA Astrophysics Data System (ADS)
Xin, Jianbin; Negenborn, Rudy R.; Lodewijks, Gabriel
2015-11-01
The performance of container terminals needs to be improved to accommodate the growth in container traffic while maintaining sustainability. This paper provides a methodology for determining the trajectory of three key interacting machines for carrying out the so-called bay handling task, which involves transporting containers between a vessel and the stacking area in an automated container terminal. The behaviours of the interacting machines are modelled as a collection of interconnected hybrid systems. Hybrid model predictive control (MPC) is proposed to achieve optimal performance, balancing the handling capacity and energy consumption. The underlying control problem is hereby formulated as a mixed-integer linear programming problem. Simulation studies illustrate that a higher penalty on energy consumption indeed leads to improved sustainability using less energy. Moreover, simulations illustrate how the proposed energy-efficient hybrid MPC controller performs under different types of uncertainties.
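As a rough illustration of the kind of mixed-integer trade-off such a controller solves, the sketch below weighs handling time against energy for a handful of container moves; the two-mode machine model, the durations, energies, capacity bound, and the weight `lam` are illustrative assumptions, not the authors' terminal model.

```python
# Sketch: a tiny MILP trading handling time against energy, solved with PuLP.
import pulp

moves = range(8)
t_fast, t_slow = 90, 140          # seconds per move in each mode (assumed)
e_fast, e_slow = 5.0, 2.5         # kWh per move in each mode (assumed)
lam = 0.5                         # penalty weight on energy consumption

prob = pulp.LpProblem("bay_handling", pulp.LpMinimize)
fast = {i: pulp.LpVariable(f"fast_{i}", cat="Binary") for i in moves}

total_time = pulp.lpSum(t_fast * fast[i] + t_slow * (1 - fast[i]) for i in moves)
total_energy = pulp.lpSum(e_fast * fast[i] + e_slow * (1 - fast[i]) for i in moves)

prob += total_time + lam * 60 * total_energy   # weighted objective
prob += total_time <= 8 * 130                  # handling-capacity requirement (assumed)

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("fast moves:", sum(int(fast[i].value()) for i in moves),
      "time:", pulp.value(total_time), "energy:", pulp.value(total_energy))
```

Raising `lam` pushes more moves into the slow, economical mode until the capacity constraint binds, which is the trade-off the simulations in the abstract report.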
Inference of emission rates from multiple sources using Bayesian probability theory.
Yee, Eugene; Flesch, Thomas K
2010-03-01
The determination of atmospheric emission rates from multiple sources using inversion (regularized least-squares or best-fit technique) is known to be very susceptible to measurement and model errors in the problem, rendering the solution unusable. In this paper, a new perspective is offered for this problem: namely, it is argued that the problem should be addressed as one of inference rather than inversion. Towards this objective, Bayesian probability theory is used to estimate the emission rates from multiple sources. The posterior probability distribution for the emission rates is derived, accounting fully for the measurement errors in the concentration data and the model errors in the dispersion model used to interpret the data. The Bayesian inferential methodology for emission rate recovery is validated against real dispersion data, obtained from a field experiment involving various source-sensor geometries (scenarios) consisting of four synthetic area sources and eight concentration sensors. The recovery of discrete emission rates from three different scenarios obtained using Bayesian inference and singular value decomposition inversion are compared and contrasted.
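For the linear-Gaussian case, the inference step has a closed form. The sketch below uses an invented dispersion matrix, noise level, and prior rather than the paper's field data, and shows the posterior mean and covariance for the emission rates.

```python
# Sketch: Bayesian emission-rate inference for a linear source-receptor
# model y = A q + noise, with Gaussian prior and Gaussian noise.
import numpy as np

rng = np.random.default_rng(1)
n_sources, n_sensors = 4, 8                              # as in the field study
A = rng.uniform(0.1, 1.0, size=(n_sensors, n_sources))   # dispersion model (assumed)
q_true = np.array([2.0, 0.5, 1.5, 1.0])                  # true rates (assumed)
sigma_y = 0.05                                           # measurement noise sd
y = A @ q_true + rng.normal(0, sigma_y, n_sensors)

# Prior q ~ N(mu0, tau^2 I); the posterior is Gaussian with
#   cov  = (A^T A / sigma^2 + I / tau^2)^(-1)
#   mean = cov (A^T y / sigma^2 + mu0 / tau^2)
mu0, tau = np.ones(n_sources), 5.0
post_cov = np.linalg.inv(A.T @ A / sigma_y**2 + np.eye(n_sources) / tau**2)
post_mean = post_cov @ (A.T @ y / sigma_y**2 + mu0 / tau**2)

print("posterior mean:", post_mean.round(3))
print("posterior sd:  ", np.sqrt(np.diag(post_cov)).round(3))
```

Unlike a regularized least-squares inversion, the posterior covariance quantifies how measurement and model errors propagate into the recovered rates, which is the paper's central argument for inference over inversion.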
Problem Solving in Biology: A Methodology
ERIC Educational Resources Information Center
Wisehart, Gary; Mandell, Mark
2008-01-01
A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…
SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY
Section I refers to the possibility of applying the theory and methodology of Project Outcomes to problems of strategic information. It is felt that ... purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of
[Problem-based learning in cardiopulmonary resuscitation: basic life support].
Sardo, Pedro Miguel Garcez; Dal Sasso, Grace Terezinha Marcon
2008-12-01
Descriptive and exploratory study, aimed to develop an educational practice of Problem-Based Learning in CPR/BLS with 24 students in the third stage of the Nursing Undergraduate Course in a University in the Southern region of Brazil. The study used the PBL methodology, focused on problem situations of cardiopulmonary arrest, and was approved by the CONEP. The methodological strategies for data collection, such as participative observation and questionnaires to evaluate the learning, the educational practices and their methodology, allowed for grouping the results in: students' expectations; group activities; individual activities; practical activities; evaluation of the meetings and their methodology. The study showed that PBL allows the educator to evaluate the academic learning process in several dimensions, functioning as a motivating factor for both the educator and the student, because it allows the theoretical-practical integration in an integrated learning process.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Methodological Problems on the Way to Integrative Human Neuroscience
Kotchoubey, Boris; Tretter, Felix; Braun, Hans A.; Buchheim, Thomas; Draguhn, Andreas; Fuchs, Thomas; Hasler, Felix; Hastedt, Heiner; Hinterberger, Thilo; Northoff, Georg; Rentschler, Ingo; Schleim, Stephan; Sellmaier, Stephan; Tebartz Van Elst, Ludger; Tschacher, Wolfgang
2016-01-01
Neuroscience is a multidisciplinary effort to understand the structures and functions of the brain and brain-mind relations. This effort results in an increasing amount of data, generated by sophisticated technologies. However, these data enhance our descriptive knowledge, rather than improve our understanding of brain functions. This is caused by methodological gaps both within and between subdisciplines constituting neuroscience, and the atomistic approach that limits the study of macro- and mesoscopic issues. Whole-brain measurement technologies do not resolve these issues, but rather aggravate them by the complexity problem. The present article is devoted to methodological and epistemic problems that obstruct the development of human neuroscience. We neither discuss ontological questions (e.g., the nature of the mind) nor review data, except when it is necessary to demonstrate a methodological issue. As regards intradisciplinary methodological problems, we concentrate on those within neurobiology (e.g., the gap between electrical and chemical approaches to neurophysiological processes) and psychology (missing theoretical concepts). As regards interdisciplinary problems, we suggest that core disciplines of neuroscience can be integrated using systemic concepts that also entail human-environment relations. We emphasize the necessity of a meta-discussion that should entail a closer cooperation with philosophy as a discipline of systematic reflection. The atomistic reduction should be complemented by the explicit consideration of the embodiedness of the brain and the embeddedness of humans. The discussion is aimed at the development of an explicit methodology of integrative human neuroscience, which will not only link different fields and levels, but also help in understanding clinical phenomena. PMID:27965548
Layer Stripping Solutions of Inverse Seismic Problems.
1985-03-21
problems--more so than has generally been recognized. The subject of this thesis is the theoretical development of the layer-stripping methodology, and ... medium varies sharply at each interface, which would be expected to cause difficulties for the algorithm, since it was designed for a smoothly varying ... methodology was applied in a novel way. The inverse problem considered in this chapter was that of reconstructing a layered medium from measurement of its
Experimental Matching of Instances to Heuristics for Constraint Satisfaction Problems.
Moreno-Scott, Jorge Humberto; Ortiz-Bayliss, José Carlos; Terashima-Marín, Hugo; Conant-Pablos, Santiago Enrique
2016-01-01
Constraint satisfaction problems are of special interest for the artificial intelligence and operations research community due to their many applications. Although heuristics involved in solving these problems have largely been studied in the past, little is known about the relation between instances and the respective performance of the heuristics used to solve them. This paper focuses on both the exploration of the instance space to identify relations between instances and good performing heuristics and how to use such relations to improve the search. Firstly, the document describes a methodology to explore the instance space of constraint satisfaction problems and evaluate the corresponding performance of six variable ordering heuristics for such instances in order to find regions on the instance space where some heuristics outperform the others. Analyzing such regions favors the understanding of how these heuristics work and contribute to their improvement. Secondly, we use the information gathered from the first stage to predict the most suitable heuristic to use according to the features of the instance currently being solved. This approach proved to be competitive when compared against the heuristics applied in isolation on both randomly generated and structured instances of constraint satisfaction problems.
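The following sketch illustrates the underlying idea on a toy graph-colouring CSP: two classic variable-ordering heuristics (minimum remaining values and maximum degree) are implemented, and a crude instance feature selects between them. The instance, the feature, and the selection rule are invented stand-ins for the paper's learned mapping.

```python
# Sketch: choosing a variable-ordering heuristic from an instance feature.
def legal_values(var, assign, domains, neighbors):
    return [v for v in domains[var]
            if all(assign.get(nb) != v for nb in neighbors[var])]

def solve(assign, domains, neighbors, heuristic, stats):
    if len(assign) == len(domains):
        return dict(assign)
    var = heuristic(assign, domains, neighbors)
    for val in legal_values(var, assign, domains, neighbors):
        stats["nodes"] += 1
        assign[var] = val
        result = solve(assign, domains, neighbors, heuristic, stats)
        if result:
            return result
        del assign[var]
    return None

def min_domain(assign, domains, neighbors):   # MRV: most-constrained variable
    unassigned = [v for v in domains if v not in assign]
    return min(unassigned,
               key=lambda v: len(legal_values(v, assign, domains, neighbors)))

def max_degree(assign, domains, neighbors):   # DEG: most-connected variable
    unassigned = [v for v in domains if v not in assign]
    return max(unassigned, key=lambda v: len(neighbors[v]))

# A small 3-colouring instance.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (2, 4), (4, 5), (5, 3)]
neighbors = {v: set() for v in range(6)}
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)
domains = {v: {0, 1, 2} for v in range(6)}

density = 2 * len(edges) / (6 * 5)            # toy instance feature
heuristic = max_degree if density > 0.4 else min_domain

stats = {"nodes": 0}
print(solve({}, domains, neighbors, heuristic, stats), "nodes:", stats["nodes"])
```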
Researching Street Children: Methodological and Ethical Issues.
ERIC Educational Resources Information Center
Hutz, Claudio S.; And Others
This paper describes the ethical and methodological problems associated with studying prosocial moral reasoning of street children and children of low and high SES living with their families, and problems associated with studying sexual attitudes and behavior of street children and their knowledge of sexually transmitted diseases, especially AIDS.…
Problem-Based Learning: Lessons for Administrators, Educators and Learners
ERIC Educational Resources Information Center
Yeo, Roland
2005-01-01
Purpose: The paper aims to explore the challenges of problem-based learning (PBL) as an unconventional teaching methodology experienced by a higher learning institute in Singapore. Design/methodology/approach: The exploratory study was conducted using focus group discussions and semi-structured interviews. Four groups of people were invited to…
Incorporation of lean methodology into pharmacy residency programs.
John, Natalie; Snider, Holly; Edgerton, Lisa; Whalin, Laurie
2017-03-15
The implementation of lean methodology into pharmacy residency programs at a community teaching hospital is described. New Hanover Regional Medical Center, a community teaching hospital in southeastern North Carolina, fully adopted a lean culture in 2010. Given the success of lean strategies organizationally, this methodology was used to assist with the evaluation and development of its pharmacy residency programs in 2014. Lean tools and activities have also been incorporated into residency requirements and rotation learning activities. The majority of lean events correspond to the required competency areas evaluating leadership and management, teaching, and education. These events have included participation in and facilitation of various lean problem-solving and communication tools. The application of the 4 rules of lean has resulted in enhanced management of the programs and provides a set of tools by which continual quality improvement can be ensured. Regular communication and direct involvement of all invested parties have been critical in developing and sustaining new improvements. In addition to program enhancements, lean methodology offers novel methods by which residents may be incorporated into leadership activities. The incorporation of lean methodology into pharmacy residency programs has translated into a variety of realized and potential benefits for the programs, the preceptors and residents, and the health system. Specific areas of growth have included quality-improvement processes, the expansion of leadership opportunities for residents, and improved communication among program directors, preceptors, and residents. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
The Speaker Respoken: Material Rhetoric as Feminist Methodology.
ERIC Educational Resources Information Center
Collins, Vicki Tolar
1999-01-01
Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…
Bunch, Martin J
2003-02-01
This paper discusses the integration of soft systems methodology (SSM) within an ecosystem approach in research to support rehabilitation and management of the Cooum River and environs in Chennai, India. The Cooum is an extremely polluted urban stream. Its management is complicated by high rates of population growth, poverty, uncontrolled urban development, jurisdictional conflicts, institutional culture, flat topography, tidal action, blockage of the river mouth, and monsoon flooding. The situation is characterized by basic uncertainty about main processes and activities, and the nature of relationships among actors and elements in the system. SSM is an approach for dealing with messy or ill-structured problematic situations involving human activity. In this work SSM contributed techniques (such as the "rich picture" and "CATWOE" tools) to the description of the Cooum situation as a socioecological system and informed the approach itself at a theoretical level. Application of three general phases in SSM is discussed in the context of the Cooum River research: (1) problem definition and exploration of the problem situation, (2) development of conceptual models of relevant systems, and (3) the use of these to generate insight and stimulate debate about desirable and feasible change. Its use here gives weight to the statement by others that SSM would be a particularly appropriate methodology to operate the ecosystem approach. As well as informing efforts at management of the Cooum system, this work led the way to explore an adaptive ecosystem approach more broadly to management of the urban environment for human health in Chennai.
Sugimoto-Matsuda, Jeanelle J; Hishinuma, Earl S; Momohara, Christie-Brianna K; Rehuher, Davis; Soli, Fa'apisa M; Bautista, Randy Paul M; Chang, Janice Y
2012-10-01
Youth violence (YV) is a complex public health issue that spans geographic, ethnic, and socioeconomic lines. The Asian/Pacific Islander Youth Violence Prevention Center conducts qualitative and quantitative research on YV in Hawai'i. A critical element in YV prevention involves measuring YV and its risk-protective factors to determine the scope of the problem and to monitor changes across time. Under the Asian/Pacific Islander Youth Violence Prevention Center's (APIYVPC's) surveillance umbrella, a variety of methodologies are utilized. The major forms of active surveillance are a School-Wide Survey for youth, and a Safe Community Household Survey for adults. A variety of secondary data sources are accessed, such as the Centers for Disease Control and Prevention (Youth Risk Behavior Surveillance System), the Hawai'i State Department of the Attorney General, the Hawai'i State Department of Education, and the Hawai'i State Department of Health. State data are especially important for the Center, because most of these sources disaggregate ethnicity data for Asian Americans/Pacific Islanders. This paper details the surveillance methodologies utilized by the APIYVPC to monitor YV in one specific community and in Hawai'i, in comparison to the rest of the State and nation. Empirical results demonstrate the utility of each methodology and how they complement one another. Individually, each data source lends valuable information to the field of YV prevention; however, collectively, the APIYVPC's surveillance methods help to paint a more complete picture regarding violence rates and the relationship between YV and its risk-protective factors, particularly for minority communities.
Axelrod, Noel; Radko, Anna; Lewis, Aaron; Ben-Yosef, Nissim
2004-04-10
A methodology is described for phase restoration of an object function from differential interference contrast (DIC) images. The methodology involves collecting a set of DIC images in the same plane with different bias retardation between the two illuminating light components produced by a Wollaston prism. These images, together with one conventional bright-field image, allow for reduction of the phase deconvolution restoration problem from a highly complex nonlinear mathematical formulation to a set of linear equations that can be applied to resolve the phase for images with a relatively large number of pixels. Additionally, under certain conditions, an on-line atomic force imaging system that does not interfere with the standard DIC illumination modes resolves uncertainties in large topographical variations that generally lead to a basic problem in DIC imaging, i.e., phase unwrapping. Furthermore, the availability of confocal detection allows for a three-dimensional reconstruction with high accuracy of the refractive-index measurement of the object that is to be imaged. This has been applied to reconstruction of the refractive index of an arrayed waveguide in a region in which a defect in the sample is present. The results of this paper highlight the synergism of far-field microscopies integrated with scanned probe microscopies and restoration algorithms for phase reconstruction.
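As a hedged illustration of the "set of linear equations" idea, the sketch below solves the phase-shifting analogue of the problem: with images at several known bias retardations phi_k, a per-pixel intensity model linear in three coefficients yields the wrapped phase by least squares. The paper's actual DIC forward model (Wollaston shear plus a bright-field image) is richer than this; the image model and bias values here are assumptions.

```python
# Sketch: per-pixel linear solve for wrapped phase from a bias-retardation
# image stack, I_k = A + B*cos(phi_k) + C*sin(phi_k), phase = atan2(-C, B).
import numpy as np

rng = np.random.default_rng(2)
h, w = 64, 64
true_phase = np.angle(np.exp(1j * (np.linspace(0, 4, w)[None, :] +
                                   np.linspace(0, 2, h)[:, None])))
biases = np.deg2rad([0, 90, 180, 270])           # bias retardations (assumed)

# Simulated stack: unit background, contrast 0.8, small additive noise.
stack = np.stack([1.0 + 0.8 * np.cos(true_phase + b) +
                  rng.normal(0, 0.01, (h, w)) for b in biases])

# One least-squares solve handles every pixel at once.
M = np.column_stack([np.ones_like(biases), np.cos(biases), np.sin(biases)])
coef, *_ = np.linalg.lstsq(M, stack.reshape(len(biases), -1), rcond=None)
A, B, C = coef
phase = np.arctan2(-C, B).reshape(h, w)          # recovered (wrapped) phase

err = np.angle(np.exp(1j * (phase - true_phase)))
print("max abs phase error (rad):", np.max(np.abs(err)))
```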
NASA Astrophysics Data System (ADS)
Reem, Daniel; De Pierro, Alvaro
2017-04-01
Many problems in science and engineering involve, as part of their solution process, the consideration of a separable function which is the sum of two convex functions, one of them possibly non-smooth. Recently a few works have discussed inexact versions of several accelerated proximal methods aiming at solving this minimization problem. This paper shows that inexact versions of a method of Beck and Teboulle (the fast iterative shrinkage-thresholding algorithm, FISTA) preserve, in a Hilbert space setting, the same (non-asymptotic) rate of convergence under some assumptions on the decay rate of the error terms. The notion of inexactness discussed here seems to be rather simple, but, interestingly, when comparing to related works, closely related decay rates of the error terms yield closely related convergence rates. The derivation sheds some light on the somewhat mysterious origin of some parameters which appear in various accelerated methods. A consequence of the analysis is that the accelerated method is perturbation resilient, making it suitable, in principle, for the superiorization methodology. By taking this into account, we re-examine the superiorization methodology and significantly extend its scope. This work was supported by FAPESP 2013/19504-9. The second author was supported also by CNPq grant 306030/2014-4.
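A minimal sketch of the accelerated proximal scheme in question, applied to l1-regularized least squares, is given below; the "inexact" aspect is mimicked by perturbing the gradient with an error decaying like 1/k^2, echoing the paper's decay-rate assumptions. The problem data and the error schedule are invented for illustration.

```python
# Sketch: (inexact) FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
import numpy as np

rng = np.random.default_rng(3)
m, n = 60, 120
A = rng.normal(size=(m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, 8, replace=False)] = rng.normal(size=8)
b = A @ x_true + 0.01 * rng.normal(size=m)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of grad f

def soft_threshold(v, t):                     # prox operator of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n); y = x.copy(); t = 1.0
for k in range(1, 301):
    grad = A.T @ (A @ y - b)
    grad += rng.normal(size=n) / k**2         # inexactness: error decays ~ 1/k^2
    x_new = soft_threshold(y - grad / L, lam / L)
    t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2   # Nesterov momentum parameter
    y = x_new + (t - 1) / t_new * (x_new - x)
    x, t = x_new, t_new

obj = 0.5 * np.linalg.norm(A @ x - b)**2 + lam * np.abs(x).sum()
print("objective:", round(obj, 4), "nonzeros:", int((np.abs(x) > 1e-6).sum()))
```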
Dynamically consistent hydrography and absolute velocity in the eastern North Atlantic Ocean
NASA Technical Reports Server (NTRS)
Wunsch, Carl
1994-01-01
The problem of mapping a dynamically consistent hydrographic field and associated absolute geostrophic flow in the eastern North Atlantic between 24 deg and 36 deg N is related directly to the solution of the so-called thermocline equations. A nonlinear optimization problem involving Needler's P equation is solved to find the hydrography and resulting flow that minimizes the vertical mixing above about 1500 m in the ocean and is simultaneously consistent with the observations. A sharp minimum (at least in some dimensions) is found, apparently corresponding to a solution nearly conserving potential vorticity and with vertical eddy coefficient less than about 10(exp -5) sq m/s. Estimates of 'residual' quantities such as eddy coefficients are extremely sensitive to slight modifications to the observed fields. Boundary conditions, vertical velocities, etc., are a product of the optimization and produce estimates differing quantitatively from prior ones relying directly upon observed hydrography. The results are generally insensitive to particular elements of the solution methodology, but many questions remain concerning the extent to which different synoptic sections can be asserted to represent the same ocean. The method can be regarded as a practical generalization of the beta spiral and geostrophic balance inverses for the estimate of absolute geostrophic flows. Numerous improvements to the methodology used in this preliminary attempt are possible.
Participatory Research Challenges in Drug Abuse Studies Among Transnational Mexican Migrants
Garcia, Victor; Gonzalez, Laura
2011-01-01
Participatory research is essential in public health studies, but using this methodology to examine sensitive public health problems among vulnerable populations is a challenge. We share some of our trials and tribulations in attempting to use participatory research in our substance abuse studies among transnational Mexican migrants in southeastern Pennsylvania. Major challenges did not permit partnerships across the community in all phases of research, including the dissemination of findings. Especially difficult was including transnational migrants and nearby relatives as partners in the research, similar to partnerships created with others in the community. The sensitive nature of our research and associated human subject concerns did not permit a more participatory methodology. Another problem involved partnerships with members of the larger community, given the apathy and ambivalence towards drug use by transnational migrants. Finally, collaborating with community stakeholders to develop and implement research-based recommendations was also problematic. As we learned, there is more to generating substance abuse recommendations in partnership with stakeholders than simply working together on recommendations; an effective implementation strategy is also required. Based on these experiences, we elaborate useful suggestions for the development and application of local-level programs aimed at curtailing substance abuse among transnational migrant workers while they are at their work sites in Pennsylvania. PMID:22003376
Community resilience assessment and literature analysis.
Weiner, John M; Walsh, John J
2015-01-01
Earlier and current disaster-related research emphasised the sociological/behavioural perspective. This led to a significant amount of literature devoted to descriptive context of natural, man-made and technological disasters and sequelae. This paper considers a next step involving a more expanded approach in research methodology. The phases include: (1) the development of a comprehensive database of ideas provided by authors of scholarly and scientific papers; (2) the development of computer-supported algorithms to prepare an array of scenarios representing relationships, gaps and inconsistencies in existing knowledge; (3) a process for evaluating the scenarios to determine a feasible and interesting next research strategy or programmatic action that will provide enhanced description of the problems as well as possible insights to their correction by interventions. The intent is to develop interventions as an essential component for better prevention, mitigation, rehabilitation, reconstruction and problem-solving affected by disaster events. To illustrate this approach, community resilience, a relatively new and important idea was studied. The phrase was used to describe relationships and omissions. The ideas associated with this central idea were considered in the building of a new instrument for evaluation of community vulnerability and readiness. This methodology addresses the time constraints realised by practitioners and investigators. The methods should eliminate tedious, clerical functions and focus on the intellectual functions representing optimal use of human energy.
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
Nanosatellite and Plug-and-Play Architecture 2 (NAPA 2)
2017-02-28
potentially other militarily relevant roles. The "i-Missions" focus area studies the kinetics of rapid mission development. The methodology involves ... the US and Sweden in the Nanosatellite and Plug-and-play Architecture or "NAPA" program) is to pioneer a methodology for creating mission capable 6U ... spacecraft. The methodology involves interchangeable blackbox (self-describing) components, software (middleware and applications), advanced
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.
Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun
2018-01-01
Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.
Study on the performance of different craniofacial superimposition approaches (I).
Ibáñez, O; Vicente, R; Navega, D S; Wilkinson, C; Jayaprakash, P T; Huete, M I; Briers, T; Hardiman, R; Navarro, F; Ruiz, E; Cavalli, F; Imaizumi, K; Jankauskas, R; Veselovskaya, E; Abramov, A; Lestón, P; Molinero, F; Cardoso, J; Çağdır, A S; Humpire, D; Nakanishi, Y; Zeuner, A; Ross, A H; Gaudio, D; Damas, S
2015-12-01
As part of the scientific tasks coordinated throughout the 'New Methodologies and Protocols of Forensic Identification by Craniofacial Superimposition (MEPROCS)' project, the current study aims to analyse the performance of a diverse set of CFS methodologies and the corresponding technical approaches when dealing with a common dataset of real-world cases. Thus, a multiple-lab study on craniofacial superimposition has been carried out for the first time. In particular, 26 participants from 17 different institutions in 13 countries were asked to deal with 14 identification scenarios, some of them involving the comparison of multiple candidates and unknown skulls: in total, 60 craniofacial superimposition problems, divided into two sets of females and males. Each participant followed her/his own methodology and employed her/his particular technological means. For each single case they were asked to report the final identification decision (either positive or negative) along with the rationale supporting the decision and at least one image illustrating the overlay/superimposition outcome. This study is expected to provide important insights to better understand the most convenient characteristics of every method included in this study. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Invited Commentary: The Need for Cognitive Science in Methodology.
Greenland, Sander
2017-09-15
There is no complete solution for the problem of abuse of statistics, but methodological training needs to cover cognitive biases and other psychosocial factors affecting inferences. The present paper discusses 3 common cognitive distortions: 1) dichotomania, the compulsion to perceive quantities as dichotomous even when dichotomization is unnecessary and misleading, as in inferences based on whether a P value is "statistically significant"; 2) nullism, the tendency to privilege the hypothesis of no difference or no effect when there is no scientific basis for doing so, as when testing only the null hypothesis; and 3) statistical reification, treating hypothetical data distributions and statistical models as if they reflect known physical laws rather than speculative assumptions for thought experiments. As commonly misused, null-hypothesis significance testing combines these cognitive problems to produce highly distorted interpretation and reporting of study results. Interval estimation has so far proven to be an inadequate solution because it involves dichotomization, an avenue for nullism. Sensitivity and bias analyses have been proposed to address reproducibility problems (Am J Epidemiol. 2017;186(6):646-647); these methods can indeed address reification, but they can also introduce new distortions via misleading specifications for bias parameters. P values can be reframed to lessen distortions by presenting them without reference to a cutoff, providing them for relevant alternatives to the null, and recognizing their dependence on all assumptions used in their computation; they nonetheless require rescaling for measuring evidence. I conclude that methodological development and training should go beyond coverage of mechanistic biases (e.g., confounding, selection bias, measurement error) to cover distortions of conclusions produced by statistical methods and psychosocial forces. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
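One concrete rescaling discussed in this literature is the S-value (Shannon surprisal), s = -log2(p), which reads a P value as the number of consecutive fair-coin heads that would be as surprising as the observed data under the test model; the snippet below is a small illustration of that scale.

```python
# Sketch: rescaling P values into S-values (bits of surprisal).
import math

for p in [0.5, 0.25, 0.05, 0.005]:
    s = -math.log2(p)
    print(f"p = {p:<6} -> s = {s:4.1f} bits (about {s:.1f} fair-coin heads in a row)")
```

On this scale p = 0.05 carries only about 4.3 bits of information against the test hypothesis, which helps resist the dichotomania the commentary describes.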
2013-01-01
Background: Understanding the function of a particular gene under various stresses is important for engineering plants for broad-spectrum stress tolerance. Although virus-induced gene silencing (VIGS) has been used to characterize genes involved in abiotic stress tolerance, currently available gene silencing and stress imposition methodology at the whole plant level is not suitable for high-throughput functional analyses of genes. This demands a robust and reliable methodology for characterizing genes involved in abiotic and multi-stress tolerance. Results: Our methodology employs VIGS-based gene silencing in leaf disks combined with simple stress imposition and effect quantification methodologies for easy and faster characterization of genes involved in abiotic and multi-stress tolerance. By subjecting leaf disks from gene-silenced plants to various abiotic stresses and inoculating silenced plants with various pathogens, we show the involvement of several genes in multi-stress tolerance. In addition, we demonstrate that VIGS can be used to characterize genes involved in thermotolerance. Our results also showed the functional relevance of NtEDS1 in abiotic stress; NbRBX1 and NbCTR1 in oxidative stress; NtRAR1 and NtNPR1 in salinity stress; NbSOS1 and NbHSP101 in biotic stress; and NtEDS1, NbETR1, NbWRKY2 and NbMYC2 in thermotolerance. Conclusions: In addition to widening the application of VIGS, we developed a robust, easy and high-throughput methodology for functional characterization of genes involved in multi-stress tolerance. PMID:24289810
A methodology to enhance electromagnetic compatibility in joint military operations
NASA Astrophysics Data System (ADS)
Buckellew, William R.
The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
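The pair-elimination step can be pictured as a coarse frequency screen applied before any detailed coupling analysis. The sketch below culls transmitter-receiver pairs whose fundamentals and low-order harmonics never enter the receiver passband; the equipment data, harmonic count, and margins are illustrative assumptions, not the validated models the methodology uses.

```python
# Sketch: coarse culling of noninterfering transmitter-receiver pairs.
transmitters = {"TX-A": 225.0, "TX-B": 121.5, "TX-C": 42.0}   # MHz (assumed)
receivers = {"RX-1": (118.0, 137.0), "RX-2": (225.0, 400.0)}  # passbands, MHz (assumed)

def potentially_interfering(f_tx, band, harmonics=3, margin=0.5):
    """Keep a pair if any low-order harmonic lands in the padded passband."""
    lo, hi = band
    return any(lo - margin <= k * f_tx <= hi + margin
               for k in range(1, harmonics + 1))

pairs = [(tx, rx) for tx, f in transmitters.items()
                  for rx, band in receivers.items()
                  if potentially_interfering(f, band)]
print("pairs kept for detailed analysis:", pairs)
```

Pairs that survive this screen would then move on to the engineering analysis and testing stages the abstract describes; the rest are eliminated cheaply.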
NASA Astrophysics Data System (ADS)
Babaveisi, Vahid; Paydar, Mohammad Mahdi; Safaei, Abdul Sattar
2018-07-01
This study aims to discuss the solution methodology for a closed-loop supply chain (CLSC) network that includes the collection of used products as well as distribution of the new products. This supply chain is presented as representative of the problems that can be solved by the proposed meta-heuristic algorithms. A mathematical model is designed for a CLSC that involves three objective functions: maximizing the profit and minimizing the total risk and the shortages of products. Since three objective functions are considered, a multi-objective solution methodology can be advantageous. Therefore, several approaches have been studied; an NSGA-II algorithm is first utilized, and the results are then validated using MOSA and MOPSO algorithms. Priority-based encoding, which is used in all the algorithms, is the core of the solution computations. To compare the performance of the meta-heuristics, random numerical instances are evaluated by four criteria: mean ideal distance, spread of non-dominated solutions, the number of Pareto solutions, and CPU time. In order to enhance the performance of the algorithms, the Taguchi method is used for parameter tuning. Finally, sensitivity analyses are performed and the computational results are presented based on the sensitivity analyses in parameter tuning.
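Two of those comparison criteria are easy to make concrete. The sketch below extracts the non-dominated (Pareto) set from a batch of objective vectors and computes a mean-ideal-distance style metric; the random objective values, the treatment of all three objectives as minimized (profit negated), and the choice of the front's ideal point as reference are assumptions for illustration, since MID definitions vary across papers.

```python
# Sketch: non-dominated filtering and a mean-ideal-distance (MID) metric.
import numpy as np

rng = np.random.default_rng(4)
F = rng.random((30, 3))                     # 30 solutions x 3 minimized objectives

def dominates(a, b):
    """a dominates b if it is no worse everywhere and strictly better somewhere."""
    return np.all(a <= b) and np.any(a < b)

pareto = [i for i in range(len(F))
          if not any(dominates(F[j], F[i]) for j in range(len(F)) if j != i)]

ideal = F[pareto].min(axis=0)               # ideal point of the obtained front
mid = np.mean(np.linalg.norm(F[pareto] - ideal, axis=1))
print("Pareto solutions:", len(pareto), "MID:", round(mid, 3))
```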
Methodological Issues and Practical Problems in Conducting Research on Abused Children.
ERIC Educational Resources Information Center
Kinard, E. Milling
In order to inform policy and programs, research on child abuse must be not only methodologically rigorous, but also practically feasible. However, practical problems make child abuse research difficult to conduct. Definitions of abuse must be explicit and different types of abuse must be assessed separately. Study samples should be as…
ERIC Educational Resources Information Center
Soh, Kaycheng
2013-01-01
Recent research into university ranking methodologies uncovered several methodological problems among the systems currently in vogue. One of these is the discrepancy between the nominal and attained weights. The problem is the summation of unstandardized indicators for the total scores used in ranking. It is demonstrated that weight discrepancy…
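The discrepancy is easy to reproduce numerically: when two indicators with equal nominal weights but unequal spreads are summed unstandardized, the indicator with the larger spread dominates the total score, so its attained weight exceeds its nominal weight. The numbers below are invented for illustration.

```python
# Sketch: nominal vs attained weights when unstandardized indicators are summed.
import numpy as np

rng = np.random.default_rng(5)
teaching = rng.normal(50, 2, 200)           # nominal weight 0.5, small spread
research = rng.normal(50, 20, 200)          # nominal weight 0.5, large spread
total = 0.5 * teaching + 0.5 * research

# Attained weight taken here as each term's share of the total-score variance.
var_t, var_r = np.var(0.5 * teaching), np.var(0.5 * research)
print("attained teaching weight:", round(var_t / (var_t + var_r), 3))
print("attained research weight:", round(var_r / (var_t + var_r), 3))
# Standardizing the indicators (z-scores) before weighting restores 50:50.
```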
A Methodological Critique of "Interventions for Boys with Conduct Problems"
ERIC Educational Resources Information Center
Kent, Ronald; And Others
1976-01-01
Kent criticizes Patterson's study on treating the behavior problems of boys, on several methodological bases concluding that more rigorous research is required in this field. Patterson answers Kent's criticisms arguing that they are not based on sound grounds. Patterson offers further evidence to support the efficacy of his treatment procedures.…
Research Methodology in Second Language Studies: Trends, Concerns, and New Directions
ERIC Educational Resources Information Center
King, Kendall A.; Mackey, Alison
2016-01-01
The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…
Costs of Addressing Heroin Addiction in Malaysia and 32 Comparable Countries Worldwide
Ruger, Jennifer Prah; Chawarski, Marek; Mazlan, Mahmud; Luekens, Craig; Ng, Nora; Schottenfeld, Richard
2012-01-01
Objective: Develop and apply new costing methodologies to estimate costs of opioid dependence treatment in countries worldwide. Data Sources/Study Setting: Micro-costing methodology developed and data collected during a randomized controlled trial (RCT) involving 126 patients (July 2003–May 2005) in Malaysia. Gross-costing methodology developed to estimate costs of treatment replication in 32 countries with data collected from publicly available sources. Study Design: Fixed, variable, and societal cost components of the Malaysian RCT micro-costed, and an analytical framework created and employed for gross-costing in 32 countries selected by three criteria relative to Malaysia: major heroin problem, geographic proximity, and comparable gross domestic product (GDP) per capita. Principal Findings: Medication and urine and blood testing accounted for the greatest percentage of total costs for both naltrexone (29–53 percent) and buprenorphine (33–72 percent) interventions. In 13 countries, buprenorphine treatment could be provided for under $2,000 per patient. For all countries except the United Kingdom and Singapore, incremental costs per person were below $1,000 when comparing buprenorphine to naltrexone. An estimated 100 percent of opiate users in Cambodia and Lao People's Democratic Republic could be treated for $8 and $30 million, respectively. Conclusions: Buprenorphine treatment can be provided at low cost in countries across the world. This study's new costing methodologies provide tools for health systems worldwide to determine the feasibility and cost of similar interventions. PMID:22091732
Ojaveer, Henn; Eero, Margit
2011-04-29
Assessments of the environmental status of marine ecosystems are increasingly needed to inform management decisions and regulate human pressures to meet the objectives of environmental policies. This paper addresses some generic methodological challenges and related uncertainties involved in marine ecosystem assessment, using the central Baltic Sea as a case study. The objectives of good environmental status of the Baltic Sea largely focus on biodiversity, eutrophication and hazardous substances. In this paper, we conduct comparative evaluations of the status of these three segments by applying different methodological approaches. Our analyses indicate that the assessment results are sensitive to the selection of indicators for ecological quality objectives that are affected by a broad spectrum of human activities and natural processes (biodiversity), and less so for objectives that are influenced by a relatively narrow array of drivers (eutrophication, hazardous substances). The choice of indicator aggregation rule proved to be of essential importance for the assessment results of all three segments, whereas the hierarchical structure of indicators had only a minor influence. Trend-based assessment was shown to be a useful supplement to reference-based evaluation, being independent of the problems related to defining reference values and indicator aggregation methodologies. The results of this study will help in setting priorities for future efforts to improve environmental assessments in the Baltic Sea and elsewhere, and in ensuring the transparency of the assessment procedure.
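The sensitivity to the aggregation rule is easy to demonstrate with a toy computation; the indicator scores, weights and "good status" threshold below are hypothetical, not Baltic Sea values.

```python
import numpy as np

# four indicator status scores scaled to [0, 1]; >= 0.6 counts as "good"
scores = np.array([0.8, 0.7, 0.4, 0.9])
weights = np.array([0.3, 0.3, 0.2, 0.2])

one_out_all_out = bool(np.all(scores >= 0.6))        # fails if any indicator fails
weighted_average = float(weights @ scores) >= 0.6    # one poor indicator can be masked

print(one_out_all_out, weighted_average)  # False True -> the rule choice flips the verdict
```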
Berne, Rosalyn W; Raviv, Daniel
2004-04-01
This paper introduces the Eight Dimensional Methodology for Innovative Thinking (the Eight Dimensional Methodology), for innovative problem solving, as a unified approach to case analysis that builds on comprehensive problem solving knowledge from industry, business, marketing, math, science, engineering, technology, arts, and daily life. It is designed to stimulate innovation by quickly generating unique "out of the box" unexpected and high quality solutions. It gives new insights and thinking strategies to solve everyday problems faced in the workplace, by helping decision makers to see otherwise obscure alternatives and solutions. Daniel Raviv, the engineer who developed the Eight Dimensional Methodology, and paper co-author, technology ethicist Rosalyn Berne, suggest that this tool can be especially useful in identifying solutions and alternatives for particular problems of engineering, and for the ethical challenges which arise with them. First, the Eight Dimensional Methodology helps to elucidate how what may appear to be a basic engineering problem also has ethical dimensions. In addition, it offers to the engineer a methodology for penetrating and seeing new dimensions of those problems. To demonstrate the effectiveness of the Eight Dimensional Methodology as an analytical tool for thinking about ethical challenges to engineering, the paper presents the case of the construction of the Large Binocular Telescope (LBT) on Mount Graham in Arizona. Analysis of the case offers to decision makers the use of the Eight Dimensional Methodology in considering alternative solutions for how they can proceed in their goals of exploring space. It then follows that same process through the second stage of exploring the ethics of each of those different solutions. The LBT project pools resources from an international partnership of universities and research institutes for the construction and maintenance of a highly sophisticated, powerful new telescope. It will soon mark the erection of the world's largest and most powerful optical telescope, designed to see fine detail otherwise visible only from space. It also represents a controversial engineering project that is being undertaken on land considered to be sacred by the local, native Apache people. As presented, the case features the University of Virginia, and its challenges in consideration of whether and how to join the LBT project consortium.
IMSF: Infinite Methodology Set Framework
NASA Astrophysics Data System (ADS)
Ota, Martin; Jelínek, Ivan
Software development is usually an integration task in an enterprise environment: few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular result being outsourcing, ‘body shopping’, and, indirectly, team and team-member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, problems arise that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.
ERIC Educational Resources Information Center
Lacey, John H.; Kelley-Baker, Tara; Voas, Robert B.; Romano, Eduardo; Furr-Holden, C. Debra; Torres, Pedro; Berning, Amy
2011-01-01
This article describes the methodology used in the 2007 U.S. National Roadside Survey to estimate the prevalence of alcohol- and drug-impaired driving and alcohol- and drug-involved driving. This study involved randomly stopping drivers at 300 locations across the 48 continental U.S. states at sites selected through a stratified random sampling…
A predictive machine learning approach for microstructure optimization and materials design
NASA Astrophysics Data System (ADS)
Liu, Ruoqian; Kumar, Abhishek; Chen, Zhengzhang; Agrawal, Ankit; Sundararaghavan, Veera; Choudhary, Alok
2015-06-01
This paper addresses an important materials engineering question: How can one identify the complete space (or as much of it as possible) of microstructures that are theoretically predicted to yield the desired combination of properties demanded by a selected application? We present a problem involving design of magnetoelastic Fe-Ga alloy microstructure for enhanced elastic, plastic and magnetostrictive properties. While theoretical models for computing properties given the microstructure are known for this alloy, inversion of these relationships to obtain microstructures that lead to desired properties is challenging, primarily due to the high dimensionality of microstructure space, the multi-objective design requirement and the non-uniqueness of solutions. These challenges render traditional search-based optimization methods ineffective in terms of both search efficiency and result optimality. In this paper, a route to address these challenges using a machine learning methodology is proposed. A systematic framework consisting of random data generation, feature selection and classification algorithms is developed. Experiments with five design problems that involve identification of microstructures satisfying both linear and nonlinear property constraints show that our framework outperforms traditional optimization methods, with the average running time reduced by as much as 80% and with optimality that would not be achieved otherwise.
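A minimal sketch of that generate/select/classify pipeline is given below. The property-constraint labels come from a made-up analytic rule standing in for the alloy's physics models, so the specific features and thresholds are assumptions for illustration only.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# 1) random data generation: each row is a hypothetical microstructure descriptor
X = rng.random((2000, 20))

# 2) synthetic stand-in for the physics models: label = 1 if the (made-up)
#    property constraints are satisfied, else 0
y = ((X[:, 0] + 0.5 * X[:, 3] > 0.9) & (X[:, 7] ** 2 < 0.4)).astype(int)

# 3) feature selection followed by classification, as in the paper's framework
clf = make_pipeline(SelectKBest(f_classif, k=8),
                    RandomForestClassifier(n_estimators=200, random_state=0))
print(cross_val_score(clf, X, y, cv=5).mean())

# candidate microstructures predicted feasible can then be screened further
clf.fit(X, y)
candidates = rng.random((10000, 20))
feasible = candidates[clf.predict(candidates) == 1]
print(len(feasible))
```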
Autonomous interplanetary constellation design
NASA Astrophysics Data System (ADS)
Chow, Cornelius Channing, II
According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design methodology to the restricted Earth-Moon system reveals optimal pairwise configurations for various L1, L2, and L5 (halo, axial, and vertical) periodic orbit families. Navigation accuracies on the order of O(10^±1) meters in position space are obtained for the optimal Earth-Moon constellations, given measurement noise on the order of 1 meter.
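Keller's pseudo-arclength continuation, mentioned above, can be sketched on a toy one-parameter family whose solution curve folds back on itself (where naive parameter continuation fails); the function below is illustrative only and has nothing to do with the three-body equations.

```python
import numpy as np
from scipy.optimize import fsolve

def f(x, lam):
    # toy one-parameter family with folds: the solution set is the unit circle
    return x**2 + lam**2 - 1.0

def tangent(x, lam):
    # unit tangent to the solution curve from the Jacobian [df/dx, df/dlam]
    J = np.array([2.0 * x, 2.0 * lam])
    t = np.array([-J[1], J[0]])
    return t / np.linalg.norm(t)

def pseudo_arclength_step(x0, lam0, t, ds):
    def system(u):
        x, lam = u
        return [f(x, lam),
                t[0] * (x - x0) + t[1] * (lam - lam0) - ds]  # arclength constraint
    return fsolve(system, [x0 + ds * t[0], lam0 + ds * t[1]])

# trace the curve through both folds, which plain parameter continuation cannot do
x, lam, ds = 0.0, -1.0, 0.1
t = tangent(x, lam)
path = [(x, lam)]
for _ in range(70):
    x_new, lam_new = pseudo_arclength_step(x, lam, t, ds)
    t_new = tangent(x_new, lam_new)
    if t_new @ t < 0:          # keep orientation consistent along the branch
        t_new = -t_new
    x, lam, t = x_new, lam_new, t_new
    path.append((x, lam))
print(path[-1])
```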
NASA Technical Reports Server (NTRS)
Fogarty, Jennifer A.; Rando, Cynthia; Baumann, David; Richard, Elizabeth; Davis, Jeffrey
2010-01-01
In an effort to expand routes for open communication and create additional opportunities for public involvement with NASA, Open Innovation Service Provider (OISP) methodologies have been incorporated as a tool in NASA's problem solving strategy. NASA engaged the services of two OISP providers, InnoCentive and Yet2.com, to test this novel approach and its feasibility in solving NASA's space flight challenges. The OISPs were chosen based on multiple factors including: network size and knowledge area span, established process, methodology, experience base, and cost. InnoCentive and Yet2.com each met the desired criteria; however, each company's approach to Open Innovation is distinctly different. InnoCentive focuses on posting individual challenges to an established web-based network of approximately 200,000 solvers; viable solutions are sought and granted a financial award if found. Based on a specific technological need, Yet2.com acts as a talent scout providing a broad external network of experts as potential collaborators to NASA. A relationship can be established with these contacts to develop technologies and/or maintained as an established network of future collaborators. The results from the first phase of the pilot study have shown great promise for the long-term efficacy of utilizing the OISP methodologies. Solution proposals have been received for the challenges posted on InnoCentive and are currently under review for final disposition. In addition, Yet2.com has identified new external partners for NASA, and we are in the process of understanding and acting upon these new opportunities. Compared to NASA's traditional routes for external problem solving, the OISP methodologies offered NASA a substantial savings in terms of time and resources invested. In addition, these strategies will help NASA extend beyond its current borders to build an ever-expanding network of experts and global solvers.
Perspective: Quantum mechanical methods in biochemistry and biophysics.
Cui, Qiang
2016-10-14
In this perspective article, I discuss several research topics relevant to quantum mechanical (QM) methods in biophysical and biochemical applications. Due to the immense complexity of biological problems, the key is to develop methods that are able to strike the proper balance of computational efficiency and accuracy for the problem of interest. Therefore, in addition to the development of novel ab initio and density functional theory based QM methods for the study of reactive events that involve complex motifs such as transition metal clusters in metalloenzymes, it is equally important to develop inexpensive QM methods and advanced classical or quantal force fields to describe different physicochemical properties of biomolecules and their behaviors in complex environments. Maintaining a solid connection of these more approximate methods with rigorous QM methods is essential to their transferability and robustness. Comparison to diverse experimental observables helps validate computational models and mechanistic hypotheses as well as driving further development of computational methodologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahl, D.E.; Jakowatz, C.V. Jr.; Ghiglia, D.C.
1991-01-01
Autofocus methods in SAR and self-survey techniques in SONAR have a common mathematical basis in that they both involve estimation and correction of phase errors introduced by sensor position uncertainties. Time delay estimation and correlation methods have been shown to be effective in solving the self-survey problem for towed SONAR arrays. Since it can be shown that platform motion errors introduce similar time-delay estimation problems in SAR imaging, the question arises as to whether such techniques could be effectively employed for autofocus of SAR imagery. With a simple mathematical model for motion errors in SAR, we will show why such correlation/time-delay techniques are not nearly as effective as established SAR autofocus algorithms such as phase gradient autofocus or sub-aperture based methods. This analysis forms an important bridge between signal processing methodologies for SAR and SONAR. 5 refs., 4 figs.
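For reference, the core correlation/time-delay estimation step discussed above amounts to locating the peak of a cross-correlation; a minimal sketch on synthetic data:

```python
import numpy as np

def estimate_delay(a, b):
    """Integer-sample delay of b relative to a via full cross-correlation."""
    corr = np.correlate(b, a, mode="full")
    return np.argmax(corr) - (len(a) - 1)

rng = np.random.default_rng(1)
s = rng.standard_normal(512)
delayed = np.roll(s, 7) + 0.1 * rng.standard_normal(512)  # noisy, shifted copy
print(estimate_delay(s, delayed))  # -> 7
```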
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Das, Arpita; Bhattacharya, Mahua
2011-01-01
In the present work, the authors have developed a treatment planning system implementing genetic-based neuro-fuzzy approaches for accurate analysis of the shape and margin of tumor masses appearing in the breast on digital mammograms. It is well known that an overly complicated structure invites the problems of overlearning and misclassification. In the proposed methodology, a genetic algorithm (GA) has been used to search for effective input feature vectors, combined with an adaptive neuro-fuzzy model for the final classification of the different boundaries of tumor masses. The study involves 200 digitized mammograms from the MIAS and other databases and has shown an 86% correct classification rate.
A Selective Review of Group Selection in High-Dimensional Models
Huang, Jian; Breheny, Patrick; Ma, Shuangge
2013-01-01
Grouping structures arise naturally in many statistical modeling problems. Several methods have been proposed for variable selection that respect grouping structure in variables. Examples include the group LASSO and several concave group selection methods. In this article, we give a selective review of group selection concerning methodological developments, theoretical properties and computational algorithms. We pay particular attention to group selection methods involving concave penalties. We address both group selection and bi-level selection methods. We describe several applications of these methods in nonparametric additive models, semiparametric regression, seemingly unrelated regressions, genomic data analysis and genome wide association studies. We also highlight some issues that require further study. PMID:24174707
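To make the group-selection mechanics concrete: the group LASSO penalty λ Σ_g ||β_g||₂ has a closed-form proximal operator, group-wise soft-thresholding, which shrinks each group's coefficients jointly and sets entire weak groups exactly to zero. A minimal sketch with made-up numbers (not tied to any particular method reviewed in the article):

```python
import numpy as np

def group_soft_threshold(beta, groups, lam):
    """Prox of lam * sum_g ||beta_g||_2: shrinks whole groups, zeroing some."""
    out = np.zeros_like(beta)
    for g in groups:
        norm = np.linalg.norm(beta[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * beta[g]  # shrink the whole block
        # else: the group is deselected entirely
    return out

beta = np.array([0.9, -0.4, 0.05, 0.02, 1.5])
groups = [[0, 1], [2, 3], [4]]
print(group_soft_threshold(beta, groups, lam=0.3))
# the weak middle group is set exactly to zero; the others shrink jointly
```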
Performance-costs evaluation for urban storm drainage.
Baptista, M; Barraud, S; Alfakih, E; Nascimento, N; Fernandes, W; Moura, P; Castro, L
2005-01-01
The design process for urban stormwater systems incorporating BMPs involves more complexity than the design of classic drainage systems, for which the technique of pipes alone is likely to be used. This paper presents a simple decision-aid methodology and an associated software tool (AvDren) for urban stormwater systems, devoted to the evaluation and comparison of drainage scenarios using BMPs according to different technical, sanitary, social, environmental and economic aspects. This kind of tool is particularly useful in helping decision makers select the appropriate alternative and plan investments, especially in developing countries with serious sanitary problems and severe budget restrictions.
Mokel, Melissa Jennifer; Shellman, Juliette M
2013-01-01
Many instruments that measure religious involvement (a) contain unclear, poorly developed constructs; (b) lack methodological rigor in scale development; and (c) contain language and content culturally incongruent with the religious experiences of diverse ethnic groups. The primary aims of this review were to (a) synthesize the research on instruments designed to measure religious involvement, (b) evaluate the methodological quality of instruments that measure religious involvement, and (c) examine these instruments for conceptual congruency with African American religious involvement. An updated integrative research review method guided the process (Whittemore & Knafl, 2005). 152 articles were reviewed and 23 articles retrieved. Only 3 retained instruments were developed under methodologically rigorous conditions. All 3 instruments were congruent with a conceptual model of African American religious involvement. The Fetzer Multidimensional Measure of Religious Involvement and Spirituality (FMMRS; Idler et al., 2003) was found to have favorable characteristics. Further examination and psychometric testing are warranted to determine its acceptability, readability, and cultural sensitivity in an African American population.
Reft, Chester; Alecu, Rodica; Das, Indra J; Gerbi, Bruce J; Keall, Paul; Lief, Eugene; Mijnheer, Ben J; Papanikolaou, Nikos; Sibata, Claudio; Van Dyk, Jake
2003-06-01
This document is the report of a task group of the Radiation Therapy Committee of the AAPM and has been prepared primarily to advise hospital physicists involved in external beam treatment of patients with pelvic malignancies who have high atomic number (Z) hip prostheses. The purpose of the report is to make the radiation oncology community aware of the problems arising from the presence of these devices in the radiation beam, to quantify the dose perturbations they cause, and, finally, to provide recommendations for treatment planning and delivery. Some of the data and recommendations are also applicable to patients having implanted high-Z prosthetic devices such as pins and humeral head replacements. The scientific understanding and methodology of clinical dosimetry for these situations are still incomplete. This report is intended to reflect the current state of scientific understanding and technical methodology in clinical dosimetry for radiation oncology patients with high-Z hip prostheses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1988-06-01
This document describes several years of research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1990-09-01
This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418, entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P.I. and M. Cooper, Co-P.I., during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.
Towards a general object-oriented software development methodology
NASA Technical Reports Server (NTRS)
Seidewitz, ED; Stark, Mike
1986-01-01
An object is an abstract software model of a problem domain entity. Objects are packages of both data and operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85). It was found that these methodologies are limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview of this approach is provided, and how object-oriented design fits into the overall software life cycle is considered.
Bioinspired Methodology for Artificial Olfaction
Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve
2008-01-01
Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
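The coarse-to-fine idea generalizes beyond chemiresistive arrays; the sketch below trains a family-level classifier and per-family species classifiers on synthetic data. The features and labels are invented stand-ins, not the authors' MEMS sensor responses.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
X = rng.random((600, 16))                        # hypothetical sensor-array responses
family = (X[:, 0] > 0.5).astype(int)             # coarse label, e.g. chemical family
species = family * 2 + (X[:, 1] > 0.5)           # fine label within each family

coarse = RandomForestClassifier(random_state=0).fit(X, family)
fine = {f: RandomForestClassifier(random_state=0).fit(X[family == f], species[family == f])
        for f in np.unique(family)}

def classify(x):
    f = coarse.predict(x.reshape(1, -1))[0]      # general first ...
    return fine[f].predict(x.reshape(1, -1))[0]  # ... then precise within that branch

print(classify(X[0]), species[0])
```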
Lean methodology in health care.
Kimsey, Diane B
2010-07-01
Lean production is a process management philosophy that examines organizational processes from a customer perspective with the goal of limiting the use of resources to those processes that create value for the end customer. Lean manufacturing emphasizes increasing efficiency, decreasing waste, and using methods to decide what matters rather than accepting preexisting practices. A rapid improvement team at Lehigh Valley Health Network, Allentown, Pennsylvania, implemented a plan, do, check, act cycle to determine problems in the central sterile processing department, test solutions, and document improved processes. By using A3 thinking, a consensus building process that graphically depicts the current state, the target state, and the gaps between the two, the team worked to improve efficiency and safety, and to decrease costs. Use of this methodology has increased teamwork, created user-friendly work areas and processes, changed management styles and expectations, increased staff empowerment and involvement, and streamlined the supply chain within the perioperative area. Copyright (c) 2010 AORN, Inc. Published by Elsevier Inc. All rights reserved.
[Cooperative learning for improving healthy housing conditions in Bogota: a case study].
Torres-Parra, Camilo A; García-Ubaque, Juan C; García-Ubaque, César A
2014-01-01
This was a community-based effort at constructing an educational proposal oriented towards self-empowerment, aimed at improving the target population's sanitary, housing and living conditions through cooperative learning. A constructivist approach was adopted, based on a programme called "Habitat community manager". The project involved working with fifteen families living in the Mochuelo Bajo barrio in Ciudad Bolívar in Bogotá, Colombia, to identify the sanitary aspects most relevant to improving their homes and to propose a methodology and organisation for an educational proposal. Twenty-one poor-housing-related epidemiological indicators were identified, which formed the basis for defining specific problems and establishing a methodology for designing an educational proposal. The course which emerged from the cooperative learning experience was designed to promote the community's skills and education regarding health, aimed at improving households' living conditions and ensuring a healthy environment which would allow them to develop an immediate habitat ensuring their own welfare and dignity.
The Beliefs of Teachers and Daycare Staff regarding Children of Divorce: A Q Methodological Study
ERIC Educational Resources Information Center
Overland, Klara; Thorsen, Arlene Arstad; Storksen, Ingunn
2012-01-01
This Q methodological study explores beliefs of daycare staff and teachers regarding young children's reactions related to divorce. The Q factor analysis resulted in two viewpoints. Participants on the viewpoint "Child problems" believe that children show various emotional and behavioral problems related to divorce, while those on the "Structure…
Integration of PBL Methodologies into Online Learning Courses and Programs
ERIC Educational Resources Information Center
van Oostveen, Roland; Childs, Elizabeth; Flynn, Kathleen; Clarkson, Jessica
2014-01-01
Problem-based learning (PBL) challenges traditional views of teaching and learning as the learner determines, to a large extent with support from a skilled facilitator, what topics will be explored, to what depth and which processes will be used. This paper presents the implementation of problem-based learning methodologies in an online Bachelor's…
A New Paradigm for Satellite Retrieval of Hydrologic Variables: The CDRD Methodology
NASA Astrophysics Data System (ADS)
Smith, E. A.; Mugnai, A.; Tripoli, G. J.
2009-09-01
Historically, retrieval of thermodynamically active geophysical variables in the atmosphere (e.g., temperature, moisture, precipitation) involved some type of inversion scheme - embedded within the retrieval algorithm - to transform radiometric observations (a vector) into the desired geophysical parameter(s) (either a scalar or a vector). Inversion is fundamentally a mathematical operation involving some type of integral-differential radiative transfer equation - often resisting a straightforward algebraic solution - in which the integral side of the equation (typically the right-hand side) contains the desired geophysical vector, while the left-hand side contains the radiative measurement vector, often free of operators. Inversion was considered more desirable than forward modeling because the forward model solution had to be selected from a generally unmanageable set of parameter-observation relationships. However, in the classical inversion problem for retrieval of temperature using multiple radiative frequencies along the wing of an absorption band (or line) of a well-mixed radiatively active gas, in either the infrared or microwave spectrum, the inversion equation to be solved is a Fredholm integral equation of the first kind - an ill-posed transform problem that admits an infinite number of solutions. This meant that special treatment of the transform process was required in order to obtain a single solution. Inversion became the method of choice for retrieval in the 1950s because it appealed to mathematical elegance, and because the numerical approaches used to solve the problems (typically some type of relaxation or perturbation scheme) were computationally fast in an age when computer speeds were slow. Like many solution schemes, inversion has lingered on even though computer speeds have increased by many orders of magnitude and forward modeling itself has become far more elegant in combination with Bayesian averaging procedures, given that the a priori probabilities of occurrence in the true environment of the parameter(s) in question can be approximated (or are actually known). In this presentation, the theory of the more modern retrieval approach using a combination of cloud, radiation and other specialized forward models in conjunction with Bayesian weighted averaging will be reviewed in light of a brief history of inversion. The application of the theory will be cast in the framework of what we call the Cloud-Dynamics-Radiation-Database (CDRD) methodology - which we now use for the retrieval of precipitation from spaceborne passive microwave radiometers. In a companion presentation, we will specifically describe the CDRD methodology and present results from its application within the Mediterranean basin.
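The Bayesian weighted-averaging step can be sketched as a likelihood-weighted mean over a precomputed database of (parameter, simulated-observation) pairs. Everything below is a hypothetical toy, with a made-up linear "forward model", channel values and noise, intended only to show the mechanics:

```python
import numpy as np

def bayesian_retrieval(y_obs, y_db, x_db, noise_cov):
    """Likelihood-weighted (Bayesian) average over a simulated database.

    y_db : simulated observations for each database entry (n_entries x n_channels)
    x_db : geophysical parameter attached to each entry (n_entries,)
    The database's population supplies the a priori distribution.
    """
    Sinv = np.linalg.inv(noise_cov)
    d = y_db - y_obs
    logw = -0.5 * np.einsum("ij,jk,ik->i", d, Sinv, d)   # Gaussian log-likelihoods
    w = np.exp(logw - logw.max())                        # stabilized exponentials
    return (w @ x_db) / w.sum()

rng = np.random.default_rng(0)
x_db = np.linspace(0.0, 10.0, 500)                       # e.g. candidate rain rates (mm/h)
forward = lambda x: np.column_stack([250 - 3 * x, 230 + 2 * x, 210 - x])
y_db = forward(x_db) + rng.normal(0.0, 1.0, (500, 3))    # toy brightness temperatures
y_obs = forward(np.array([4.0]))[0]                      # observation from "truth" x = 4
print(bayesian_retrieval(y_obs, y_db, x_db, np.eye(3)))  # -> approximately 4.0
```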
Application of kaizen methodology to foster departmental engagement in quality improvement.
Knechtges, Paul; Decker, Michael Christopher
2014-12-01
The Toyota Production System, also known as Lean, is a structured approach to continuous quality improvement that has been developed over the past 50 years to transform the automotive manufacturing process. In recent years, these techniques have been successfully applied to quality and safety improvement in the medical field. One of these techniques is kaizen, the Japanese word for "good change." The central tenet of kaizen is the quick analysis of the small, manageable components of a problem and the rapid implementation of a solution with ongoing, real-time reassessment. Kaizen adds a "human element": all stakeholders, not just management, must be involved in such change. Because of the small size of the changes involved in a kaizen event and the inherent focus on human factors and change management, a kaizen event can serve as a good introduction to continuous quality improvement for a radiology department. Copyright © 2014. Published by Elsevier Inc.
What does it mean to be an exemplary science teacher?
NASA Astrophysics Data System (ADS)
Tobin, Kenneth; Fraser, Barry J.
In order to provide a refreshing alternative to the majority of research reports, which malign science education and highlight its major problems and shortcomings, a series of case studies of exemplary practice was initiated to provide a focus on the successful and positive facets of schooling. The major data-collection approach was qualitative and involved 13 researchers in hundreds of hours of intensive classroom observation involving 20 exemplary teachers and a comparison group of nonexemplary teachers. A distinctive feature of the methodology was that the qualitative information was complemented by quantitative information obtained from the administration of questionnaires assessing student perceptions of classroom psychosocial environment. The major trends were that exemplary science teachers (1) used management strategies that facilitated sustained student engagement, (2) used strategies designed to increase student understanding of science, (3) utilized strategies that encouraged students to participate in learning activities, and (4) maintained a favorable classroom learning environment.
NASA Astrophysics Data System (ADS)
Rana, Sachin; Ertekin, Turgay; King, Gregory R.
2018-05-01
Reservoir history matching is frequently viewed as an optimization problem that involves minimizing the misfit between simulated and observed data. Many gradient- and evolutionary-strategy-based optimization algorithms have been proposed to solve this problem, and they typically require a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study, which uses forward and inverse Gaussian process (GP) based proxy models combined with a novel application of variogram analysis of response surface (VARS) based sensitivity analysis to efficiently solve high-dimensional history matching problems. An empirical Bayes approach is proposed to optimally train GP proxy models for any given data. The history matching solutions are found via Bayesian optimization (BO) on the forward GP models and via predictions of the inverse GP model in an iterative manner. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology to the PUNQ-S3 reservoir is presented, in which it is shown that GP-VARS finds history-match solutions in approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true values for the PUNQ-S3 reservoir.
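The GP-proxy-plus-Bayesian-optimization loop can be sketched in a few lines; here a cheap quadratic misfit stands in for the reservoir simulator, and expected improvement is used as the acquisition function. This is a generic illustration under those assumptions, not the authors' GP-VARS implementation:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def misfit(theta):
    # stand-in for "run the reservoir simulator and compute the data misfit"
    return np.sum((theta - 0.3) ** 2, axis=-1)

rng = np.random.default_rng(4)
X = rng.random((10, 2))                      # initial designs: 2 reservoir parameters
y = misfit(X)

for _ in range(20):                          # Bayesian optimization loop
    gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, y)
    cand = rng.random((2000, 2))
    mu, sd = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_new = cand[np.argmax(ei)]
    X = np.vstack([X, x_new])
    y = np.append(y, misfit(x_new))

print(X[np.argmin(y)], y.min())              # best history-match candidate found
```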
Behavioral problems after early life stress: contributions of the hippocampus and amygdala.
Hanson, Jamie L; Nacewicz, Brendon M; Sutterer, Matthew J; Cayo, Amelia A; Schaefer, Stacey M; Rudolph, Karen D; Shirtcliff, Elizabeth A; Pollak, Seth D; Davidson, Richard J
2015-02-15
Early life stress (ELS) can compromise development, with higher amounts of adversity linked to behavioral problems. To understand this linkage, a growing body of research has examined two brain regions involved with socioemotional functioning: the amygdala and the hippocampus. Yet empirical studies have reported increases, decreases, and no differences within human and nonhuman animal samples exposed to different forms of ELS. This divergence in findings may stem from methodological factors, nonlinear effects of ELS, or both. We completed rigorous hand-tracing of the amygdala and hippocampus in three samples of children who experienced different forms of ELS (i.e., physical abuse, early neglect, or low socioeconomic status). Interviews were also conducted with children and their parents or guardians to collect data about cumulative life stress. The same data were also collected in a fourth sample of comparison children who had not experienced any of these forms of ELS. Smaller amygdala volumes were found for children exposed to these different forms of ELS. Smaller hippocampal volumes were also noted for children who were physically abused or from low socioeconomic status households. Smaller amygdala and hippocampal volumes were also associated with greater cumulative stress exposure and behavioral problems. Hippocampal volumes partially mediated the relationship between ELS and greater behavioral problems. This study suggests ELS may shape the development of brain areas involved with emotion processing and regulation in similar ways. Differences in the amygdala and hippocampus may be a shared diathesis for later negative outcomes related to ELS. Copyright © 2015 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Inverse problems in complex material design: Applications to non-crystalline solids
NASA Astrophysics Data System (ADS)
Biswas, Parthapratim; Drabold, David; Elliott, Stephen
The design of complex amorphous materials is one of the fundamental problems in disordered condensed-matter science. While the impressive development of ab-initio simulation methods during the past several decades has brought tremendous success in understanding materials properties from micro- to mesoscopic length scales, a major drawback is that they fail to incorporate existing knowledge of the materials in simulation methodologies. Since an essential feature of materials design is the synergy between experiment and theory, a properly developed approach to design materials should be able to exploit all available knowledge of the materials from measured experimental data. In this talk, we will address the design of complex disordered materials as an inverse problem involving experimental data and available empirical information. We show that the problem can be posed as a multi-objective non-convex optimization program, which can be addressed using a number of recently developed bio-inspired global optimization techniques. In particular, we will discuss how a population-based stochastic search procedure can be used to determine the structure of non-crystalline solids (e.g., a-Si:H, a-SiO2, amorphous graphene, and Fe and Ni clusters). The work is partially supported by NSF under Grant Nos. DMR 1507166 and 1507670.
An improved exploratory search technique for pure integer linear programming problems
NASA Technical Reports Server (NTRS)
Fogle, F. R.
1990-01-01
The development is documented of a heuristic method for the solution of pure integer linear programming problems. The procedure draws its methodology from the ideas of Hooke and Jeeves type 1 and 2 exploratory searches, greedy procedures, and neighborhood searches. It uses an efficient rounding method to obtain its first feasible integer point from the optimal continuous solution obtained via the simplex method. Since this method is based entirely on simple addition or subtraction of one to each variable of a point in n-space and the subsequent comparison of candidate solutions to a given set of constraints, it facilitates significant complexity improvements over existing techniques. It also obtains the same optimal solution found by the branch-and-bound technique in 44 of 45 small to moderate size test problems. Two example problems are worked in detail to show the inner workings of the method. Furthermore, using an established weighted scheme for comparing computational effort involved in an algorithm, a comparison of this algorithm is made to the more established and rigorous branch-and-bound method. A computer implementation of the procedure, in PC compatible Pascal, is also presented and discussed.
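A minimal sketch of the round-then-search idea on a toy two-variable problem follows; floor rounding is used here as a crude stand-in for the paper's more efficient rounding method, and the ±1 moves mirror the Hooke-and-Jeeves-style exploratory neighborhood.

```python
import numpy as np
from scipy.optimize import linprog
from itertools import product

# maximize 5x1 + 4x2  s.t.  6x1 + 4x2 <= 24,  x1 + 2x2 <= 6,  x >= 0, integer
c = np.array([-5.0, -4.0])                   # linprog minimizes, so negate
A = np.array([[6.0, 4.0], [1.0, 2.0]])
b = np.array([24.0, 6.0])

relaxed = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2, method="highs")
x = np.floor(relaxed.x)                      # feasibility-biased starting integer point

def feasible(x):
    return np.all(x >= 0) and np.all(A @ x <= b + 1e-9)

improved = True
while improved:                              # +/-1 exploratory neighborhood search
    improved = False
    for step in product((-1, 0, 1), repeat=len(x)):
        cand = x + np.array(step)
        if any(step) and feasible(cand) and c @ cand < c @ x:
            x, improved = cand, True
print(x, -(c @ x))                           # optimum here: [4, 0] with value 20
```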
Roehner, Nicholas; Myers, Chris J
2014-02-21
Recently, we have begun to witness the potential of synthetic biology, noted here in the form of bacteria and yeast that have been genetically engineered to produce biofuels, manufacture drug precursors, and even invade tumor cells. The success of these projects, however, has often failed in translation and application to new projects, a problem exacerbated by a lack of engineering standards that combine descriptions of the structure and function of DNA. To address this need, this paper describes a methodology to connect the systems biology markup language (SBML) to the synthetic biology open language (SBOL), existing standards that describe biochemical models and DNA components, respectively. Our methodology involves first annotating SBML model elements such as species and reactions with SBOL DNA components. A graph is then constructed from the model, with vertices corresponding to elements within the model and edges corresponding to the cause-and-effect relationships between these elements. Lastly, the graph is traversed to assemble the annotating DNA components into a composite DNA component, which is used to annotate the model itself and can be referenced by other composite models and DNA components. In this way, our methodology can be used to build up a hierarchical library of models annotated with DNA components. Such a library is a useful input to any future genetic technology mapping algorithm that would automate the process of composing DNA components to satisfy a behavioral specification. Our methodology for SBML-to-SBOL annotation is implemented in the latest version of our genetic design automation (GDA) software tool, iBioSim.
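The annotate-build-traverse idea can be miniaturized with a plain dictionary model and Kahn's topological sort; the names below are illustrative only and do not reflect the actual SBML, SBOL or iBioSim APIs.

```python
# Hypothetical miniature of the annotate-then-assemble idea.
model = {
    "species": {"TetR": {"dna": "tetR_cds"}, "GFP": {"dna": "gfp_cds"}},
    "reactions": {"repression": {"dna": "pTet_promoter",
                                 "inputs": ["TetR"], "outputs": ["GFP"]}},
}

# vertices = model elements, edges = cause-and-effect relationships
edges = []
for rname, r in model["reactions"].items():
    edges += [(s, rname) for s in r["inputs"]]
    edges += [(rname, s) for s in r["outputs"]]

def topo_order(nodes, edges):
    """Kahn's algorithm: visit causes before their effects."""
    indeg = {n: 0 for n in nodes}
    for _, v in edges:
        indeg[v] += 1
    order, queue = [], [n for n in nodes if indeg[n] == 0]
    while queue:
        n = queue.pop()
        order.append(n)
        for u, v in edges:
            if u == n:
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
    return order

nodes = list(model["species"]) + list(model["reactions"])
composite = [model["species"].get(n, model["reactions"].get(n))["dna"]
             for n in topo_order(nodes, edges)]
print(composite)   # annotating DNA components assembled into a composite, in causal order
```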
A Framework for Developing the Structure of Public Health Economic Models.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-01-01
A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of public health economic models, supporting efficient allocation of scarce resources. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
William H. Cooke; Dennis M. Jacobs
2002-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
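For reference, NDVI is computed per pixel from near-infrared and red reflectances as (NIR - RED)/(NIR + RED); the reflectance values below are hypothetical.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, in [-1, 1]."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)

# a forested plot reflects strongly in NIR; bare or built-up land does not
print(ndvi(0.45, 0.05))   # ~0.8 -> dense vegetation
print(ndvi(0.20, 0.18))   # ~0.05 -> likely non-forest; flag plot for review
```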
Capturing security requirements for software systems.
El-Hadary, Hassan; El-Kassas, Sherif
2014-07-01
Security is often an afterthought during software development. Realizing security early, especially in the requirements phase, is important so that security problems can be tackled early enough before going further in the process, avoiding rework. A more effective approach to security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames, which aims at the early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that such a methodology can elicit more complete security requirements, in addition to assisting developers in eliciting security requirements in a more systematic way.
Roca, Judith; Reguant, Mercedes; Canet, Olga
2016-11-01
Teaching strategies are essential to facilitate meaningful learning and the development of high-level thinking skills in students. This study compares three teaching methodologies (problem-based learning, case-based teaching and traditional methods) in terms of the learning outcomes achieved by nursing students. This quasi-experimental research was carried out in the Nursing Degree programme with a group of 74 students who explored the subject of The Oncology Patient through the aforementioned strategies. A performance test was applied based on Bloom's Revised Taxonomy. A significant correlation was found between the intragroup theoretical and theoretical-practical dimensions. Likewise, intergroup differences were related to each teaching methodology. Significant differences were found between the traditional methodology (x̄ = 9.13), case-based teaching (x̄ = 12.96) and problem-based learning (x̄ = 14.84). Problem-based learning was shown to be the most successful learning method, followed by case-based teaching and the traditional methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.
Artificial intelligence and design: Opportunities, research problems and directions
NASA Technical Reports Server (NTRS)
Amarel, Saul
1990-01-01
The issues of industrial productivity and economic competitiveness are of major significance in the U.S. at present. By advancing the science of design, and by creating a broad computer-based methodology for automating the design of artifacts and of industrial processes, we can attain dramatic improvements in productivity. It is our thesis that developments in computer science, especially in Artificial Intelligence (AI) and in related areas of advanced computing, provide us with a unique opportunity to push beyond the present level of computer aided automation technology and to attain substantial advances in the understanding and mechanization of design processes. To attain these goals, we need to build on top of the present state of AI, and to accelerate research and development in areas that are especially relevant to design problems of realistic complexity. We propose an approach to the special challenges in this area, which combines 'core work' in AI with the development of systems for handling significant design tasks. We discuss the general nature of design problems, the scientific issues involved in studying them with the help of AI approaches, and the methodological/technical issues that one must face in developing AI systems for handling advanced design tasks. Looking at basic work in AI from the perspective of design automation, we identify a number of research problems that need special attention. These include finding solution methods for handling multiple interacting goals, formation problems, problem decompositions, and redesign problems; choosing representations for design problems with emphasis on the concept of a design record; and developing approaches for the acquisition and structuring of domain knowledge with emphasis on finding useful approximations to domain theories. Progress in handling these research problems will have major impact both on our understanding of design processes and their automation, and also on several fundamental questions that are of intrinsic concern to AI. We present examples of current AI work on specific design tasks, and discuss new directions of research, both as extensions of current work and in the context of new design tasks where domain knowledge is either intractable or incomplete. The domains discussed include Digital Circuit Design, Mechanical Design of Rotational Transmissions, Design of Computer Architectures, Marine Design, Aircraft Design, and Design of Chemical Processes and Materials. Work in these domains is significant on technical grounds, and it is also important for economic and policy reasons.
Ravishankar, Saiprasad; Nadakuditi, Raj Rao; Fessler, Jeffrey A
2017-12-01
The sparsity of signals in a transform domain or dictionary has been exploited in applications such as compression, denoising and inverse problems. More recently, data-driven adaptation of synthesis dictionaries has shown promise compared to analytical dictionary models. However, dictionary learning problems are typically non-convex and NP-hard, and the usual alternating minimization approaches for these problems are often computationally expensive, with the computations dominated by the NP-hard synthesis sparse coding step. This paper exploits the ideas that drive algorithms such as K-SVD, and investigates in detail efficient methods for aggregate sparsity penalized dictionary learning by first approximating the data with a sum of sparse rank-one matrices (outer products) and then using a block coordinate descent approach to estimate the unknowns. The resulting block coordinate descent algorithms involve efficient closed-form solutions. Furthermore, we consider the problem of dictionary-blind image reconstruction, and propose novel and efficient algorithms for adaptive image reconstruction using block coordinate descent and sum of outer products methodologies. We provide a convergence study of the algorithms for dictionary learning and dictionary-blind image reconstruction. Our numerical experiments show the promising performance and speedups provided by the proposed methods over previous schemes in sparse data representation and compressed sensing-based image reconstruction.
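A compact sketch of the sum-of-outer-products block coordinate descent idea follows: each rank-one block (atom, coefficient row) is updated in closed form, with hard-thresholding supplying the sparsity penalty. This is a simplified illustration of the general scheme, not the authors' exact algorithms or tuning:

```python
import numpy as np

def soup_dil(Y, J=16, lam=0.1, iters=10, seed=0):
    """Approximate Y ~ sum_j d_j c_j^T by cycling over rank-one blocks."""
    rng = np.random.default_rng(seed)
    n, N = Y.shape
    D = rng.standard_normal((n, J))
    D /= np.linalg.norm(D, axis=0)                          # unit-norm atoms
    C = np.zeros((N, J))
    for _ in range(iters):
        for j in range(J):
            E = Y - D @ C.T + np.outer(D[:, j], C[:, j])    # residual without block j
            c = E.T @ D[:, j]
            c[np.abs(c) < lam] = 0.0                        # hard-threshold (l0-type step)
            if np.linalg.norm(c) > 0:
                d = E @ c
                D[:, j] = d / np.linalg.norm(d)             # closed-form atom update
            C[:, j] = c
    return D, C

Y = np.random.default_rng(1).standard_normal((25, 400))
D, C = soup_dil(Y)
print(np.linalg.norm(Y - D @ C.T) / np.linalg.norm(Y))      # relative fit error
```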
NASA Astrophysics Data System (ADS)
Aminu, M.; Matori, A. N.; Yusof, K. W.
2014-02-01
The study describes a methodological approach based on the integrated use of a Geographic Information System (GIS) and the Analytic Network Process (ANP), a Multi-Criteria Evaluation (MCE) technique, to determine nature conservation and tourism development priorities among highland areas. A set of criteria and indicators was defined to evaluate highland biodiversity conservation and tourism development. A pairwise comparison technique was used to support the solution of the decision problem by evaluating possible alternatives from different perspectives. After the weights had been derived from the pairwise comparisons, the next step was to compute the unweighted supermatrix, the weighted supermatrix, and the limit matrix. The limit matrix was normalized to obtain the priorities, and the results were transferred into the GIS environment. The elements evaluated and ranked were represented by criterion maps. Map layers reflecting the opinions of the different experts involved were summed using the weighted overlay approach of GIS. Subsequently, sustainable tourism development scenarios were generated. Scenario generation highlighted the critical issues of the decision problem because it allows one to gradually narrow the problem down.
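The supermatrix step mentioned above can be sketched briefly: raise a column-stochastic weighted supermatrix to successive powers until it converges to the limit matrix, any column of which then holds the priorities. The 3x3 matrix values below are invented for illustration; convergence is assumed (a cyclic supermatrix would need extra handling).

```python
import numpy as np

def limit_priorities(W, tol=1e-9, max_iter=1000):
    """ANP limit-matrix priorities from a weighted supermatrix.

    W must be column-stochastic (each column sums to 1). Repeated
    squaring drives W^k toward its limit; any column of the limit
    matrix holds the priority vector.
    """
    W = np.asarray(W, dtype=float)
    for _ in range(max_iter):
        W2 = W @ W
        if np.max(np.abs(W2 - W)) < tol:
            break
        W = W2
    p = W[:, 0]
    return p / p.sum()                      # normalized priorities

# Illustrative 3-element weighted supermatrix (columns sum to 1)
W = np.array([[0.2, 0.5, 0.3],
              [0.5, 0.2, 0.4],
              [0.3, 0.3, 0.3]])
print(limit_priorities(W))
```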
NASA Technical Reports Server (NTRS)
Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun
1994-01-01
A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.
Problems related to the integration of fault tolerant aircraft electronic systems
NASA Technical Reports Server (NTRS)
Bannister, J. A.; Adlakha, V.; Trivedi, K.; Alspaugh, T. A., Jr.
1982-01-01
Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described. Specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of integrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.
Meaning and Problems of Planning
ERIC Educational Resources Information Center
Brieve, Fred J.; Johnston, A. P.
1973-01-01
Examines the educational planning process. Discusses what planning is, how methodological planning can work in education, misunderstandings about planning, and difficulties in applying the planning methodology. (DN)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... DEPARTMENT OF COMMERCE International Trade Administration Antidumping Methodologies in Proceedings Involving Non-Market Economies: Valuing the Factor of Production: Labor; Correction to Request for Comment AGENCY: Import Administration, International Trade Administration, Department of Commerce DATES: Effective Date: March 1, 2011. FOR FURTHER...
Evidence on public policy: methodological issues, political issues and examples.
Attanasio, Orazio P
2014-03-01
In this paper I discuss how evidence on public policy is generated, and in particular the issue of evaluation of public policies. In economics, the issue of attribution and the identification of causal links has recently received considerable attention. Important methodological issues have been tackled and new techniques have been proposed and used. Randomized Controlled Trials have become something of a gold standard. However, they are not exempt from problems and have important limitations: in some cases they cannot be constructed and, more generally, problems of external validity and transferability of results can be important. The paper then moves on to discuss the political economy of policy evaluations: for policy evaluations to have an impact on the conduct of actual policy, it is important that the demand for evaluation comes directly from the policy making process and is generated endogenously within it. In this sense it is important that the institutional design of policy making is such that policy making institutions are incentivized to use rigorous evaluation in the process of designing policies and allocating resources to alternative options. Economists are currently involved in the design and evaluation of many policies, including policies about health, nutrition and education. The role they can play in these fields is not completely obvious. The paper argues that their main contribution is in the modelling of how individuals react to incentives (including those provided by public policies).
Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.
Fok, Carlotta Ching Ting; Henry, David; Allen, James
2015-10-01
Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek, whenever possible, to explore their implications in more general problems that appear in research with small samples but concern all areas of prevention research. This special section comprises two parts. The first aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.
Mannarini, Stefania
2009-11-01
The main aim of the present study was to devise a method for defining a dimension characteristic of self-awareness behaviors in clinical subjects. To do so, I adopted a latent trait methodological approach. I studied the way patients expressed their treatment requests through their behaviors, both during their admission to a medical center in Northern Italy and after a period of treatment that involved an integrated (psychoanalytical and pharmacological) approach. The subjects were 48 females suffering from affective disorders, often combined with personality disorders. Five self-awareness indicators were identified, based both on interviews conducted with the patients and on the literature on the subject. The data gathered were analyzed by means of the many-facet Rasch model (Linacre, 1989). The results confirmed the existence of a self-awareness dimension characterized by the five indicators. Moreover, there was evidence that an improvement in self-awareness occurred from pretreatment to posttreatment, both for patients with affective disorders and personality problems and for patients with affective disorders alone. The estimation of bias/interactions showed the existence of specific behavioral differences between the two groups of patients. This study demonstrates the appropriateness of the methodological tool adopted, opening new expectations with regard to the integration of two approaches, psychoanalytical and pharmacological, in the treatment of psychiatric subjects.
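For readers unfamiliar with the Rasch family, the toy sketch below shows the dichotomous Rasch probability and a crude maximum-likelihood ability estimate. The many-facet model used in the study adds further facets (e.g. a rater-severity term) inside the same logit; the difficulties and responses here are invented for illustration.

```python
import numpy as np

def rasch_p(theta, b, severity=0.0):
    """Rasch probability of a positive response: the logit is ability
    minus item difficulty minus (in many-facet models) rater severity."""
    return 1.0 / (1.0 + np.exp(-(theta - b - severity)))

def estimate_theta(responses, b, iters=50, lr=0.5):
    """Crude maximum-likelihood ability estimate for one person,
    holding item difficulties b fixed (damped Newton updates)."""
    theta = 0.0
    for _ in range(iters):
        p = rasch_p(theta, b)
        grad = np.sum(responses - p)        # score function
        info = np.sum(p * (1 - p))          # Fisher information
        theta += lr * grad / info
    return theta

# Illustrative: five indicators with assumed difficulties
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
print(estimate_theta(np.array([1, 1, 1, 0, 0]), b))
```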
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with cascade control based on a Smith predictor. A case study of a particleboard glue supplying and dosing system defines the problem and the solution using the proposed methodology. Status differences in the glue dosing process of particleboard production often make the dosed glue volume inaccurate. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the resulting "prior action" principle proposed a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve users' ability to address difficult or recurring problems, and also testifies to the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
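The control idea can be conveyed with a minimal discrete-time sketch, assuming a first-order-plus-dead-time glue-flow process with invented parameters: the Smith predictor feeds back the undelayed model output plus the plant/model mismatch, so the PI controller effectively sees a delay-free plant. This is a generic textbook sketch, not the paper's implementation.

```python
import numpy as np

# First-order-plus-dead-time process: y[k+1] = a*y[k] + b*u[k-d]
a, b, d = 0.9, 0.1, 10          # illustrative plant parameters
Kp, Ki = 2.0, 0.1               # PI controller gains
T, setpoint = 200, 1.0

y = np.zeros(T)                 # actual (delayed) plant output
ym = np.zeros(T)                # internal model output, no delay
u = np.zeros(T)
integ = 0.0
for k in range(T - 1):
    # Smith predictor feedback: undelayed model output plus the
    # mismatch between the plant and the delayed model output
    ym_delayed = ym[k - d] if k >= d else 0.0
    feedback = ym[k] + (y[k] - ym_delayed)
    e = setpoint - feedback
    integ += e
    u[k] = Kp * e + Ki * integ
    ym[k + 1] = a * ym[k] + b * u[k]
    u_delayed = u[k - d] if k >= d else 0.0
    y[k + 1] = a * y[k] + b * u_delayed
print(f"final glue-flow output: {y[-1]:.3f}")   # settles near the setpoint
```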
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. 
Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
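The two-phase search can be caricatured in a short sketch: a heuristic prune of the sequence space followed by a genetic algorithm over candidate asteroid sequences. The cost function below is a synthetic stand-in (the real cost of a sequence comes from the low-thrust trajectory optimizer), and the catalogue, pruning filter, and GA settings are all illustrative assumptions.

```python
import random

ASTEROIDS = list(range(30))       # toy catalogue of asteroid IDs
SEQ_LEN = 4

def cost(seq):
    """Stand-in for the expensive low-thrust trajectory optimization;
    here just a synthetic function of the visiting order."""
    return sum(abs(a - b) for a, b in zip(seq, seq[1:])) + seq[0]

def prune(candidates):
    """Phase one caricature: heuristic pruning of the combinatorial
    space (a toy filter discarding sequences with large hops)."""
    return [s for s in candidates if all(abs(a - b) < 15 for a, b in zip(s, s[1:]))]

def ga(pop_size=60, gens=40, pm=0.2):
    pop = [random.sample(ASTEROIDS, SEQ_LEN) for _ in range(pop_size)]
    pop = prune(pop) or pop
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[: max(2, len(pop) // 2)]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, SEQ_LEN)
            # One-point crossover that keeps asteroid IDs unique
            child = p1[:cut] + [a for a in p2 if a not in p1[:cut]][: SEQ_LEN - cut]
            if random.random() < pm:          # mutation: swap in a new asteroid
                idx = random.randrange(SEQ_LEN)
                child[idx] = random.choice([a for a in ASTEROIDS if a not in child])
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

print(ga())
```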
Use of Invariant Manifolds for Transfers Between Three-Body Systems
NASA Technical Reports Server (NTRS)
Beckman, Mark; Howell, Kathleen
2003-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits does not exist. This paper presents the initial approaches to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing 7-dimensional invariant manifold data are presented. Some particular solutions are presented for the transfer problem, though the emphasis is on developing methodology for solving the general problem.
Representations of Invariant Manifolds for Applications in Three-Body Systems
NASA Technical Reports Server (NTRS)
Howell, K.; Beckman, M.; Patterson, C.; Folta, D.
2004-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits is currently being studied. This paper presents an initial approach to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing the invariant manifold data are presented. Some particular solutions are presented for two types of transfer problems, though the emphasis is on developing the methodology for solving the general problem.
Modeling Collective Animal Behavior with a Cognitive Perspective: A Methodological Framework
Weitz, Sebastian; Blanco, Stéphane; Fournier, Richard; Gautrais, Jacques; Jost, Christian; Theraulaz, Guy
2012-01-01
The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior research and cognitive or physiological research, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: They are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data, whatever the accuracy level. A set of methodological steps are then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and the formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the benefits expected during the often long and difficult process of refining a behavioral model, designing adapted experimental protocols and inverting model parameters. PMID:22761685
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... DEPARTMENT OF COMMERCE International Trade Administration Methodological Change for Implementation..., the Department of Commerce (``the Department'') will implement a methodological change to reduce... administrative reviews involving merchandise from the PRC and Vietnam. Methodological Change In antidumping duty...
[Methodological problems in the scientific research on HIV /AIDS in Bolivia].
Hita, Susana Ramírez
2013-05-01
This paper discusses the methodological problems in the scientific research on HIV/AIDS in Bolivia, both in the areas of epidemiology and social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, The Pan-American Health Organization, International Cooperation, Non-Governmental Organizations and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research that focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and that looked at how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the epidemiological and social science studies which serve as the basis for the implementation of these health programs.
Lees-Haley, Paul R; Greiffenstein, M Frank; Larrabee, Glenn J; Manning, Edward L
2004-08-01
Recently, Kaiser (2003) raised concerns over the increase in brain damage claims reportedly due to exposure to welding fumes. In the present article, we discuss methodological problems in conducting neuropsychological research on the effects of welding exposure, using a recent paper by Bowler et al. (2003) as an example to illustrate problems common in the neurotoxicity literature. Our analysis highlights difficulties in conducting such quasi-experimental investigations, including subject selection bias, litigation effects on symptom report and neuropsychological test performance, response bias, and scientifically inadequate causal reasoning.
William H. Cooke; Dennis M. Jacobs
2005-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
Multidisciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
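The NESSUS fast-probability-integration machinery is not reproduced here, but the system-reliability idea can be conveyed by a crude Monte Carlo sketch for a heat-exchanger-like example: one limit state per discipline, with failure of any limit state failing the series system. All distributions and limit-state functions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Illustrative random inputs for a heat-exchanger-like system
stress   = rng.normal(300.0, 30.0, N)    # MPa, structural load effect
strength = rng.normal(420.0, 25.0, N)    # MPa, material strength
T_wall   = rng.normal(520.0, 40.0, N)    # K, wall temperature
dp       = rng.normal(80.0, 12.0, N)     # kPa, pressure drop

# One limit state per discipline: failure when g < 0
g_struct = strength - stress             # structural
g_heat   = 600.0 - T_wall                # heat transfer (overtemperature)
g_flow   = 110.0 - dp                    # fluid flow (excess pressure drop)

fail_any = (g_struct < 0) | (g_heat < 0) | (g_flow < 0)   # series system
print(f"estimated system failure probability: {fail_any.mean():.4f}")
```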
A Socio-Technical Exploration for Reducing & Mitigating the Risk of Retained Foreign Objects
Corrigan, Siobhán; Kay, Alison; O’Byrne, Katie; Slattery, Dubhfeasa; Sheehan, Sharon; McDonald, Nick; Smyth, David; Mealy, Ken; Cromie, Sam
2018-01-01
A Retained Foreign Object (RFO) is a fairly infrequent but serious adverse event. An accurate rate of RFOs is difficult to establish due to underreporting but it has been estimated that incidences range between 1/1000 and 1/19,000 procedures. The cost of a RFO incident may be substantial and three-fold: (i) the cost to the patient of physical and/or psychological harm; (ii) the reputational cost to an institution and/or healthcare provider; and (iii) the financial cost to the taxpayer in the event of a legal claim. This Health Research Board-funded project aims to analyse and understand the problem of RFOs in surgical and maternity settings in Ireland and develop hospital-specific foreign object management processes and implementation roadmaps. This project will deploy an integrated evidence-based assessment methodology for social-technical modelling (Supply, Context, Organising, Process & Effects/ SCOPE Analysis Cube) and bow tie methodologies that focuses on managing the risks in effectively implementing and sustaining change. It comprises a multi-phase research approach that involves active and ongoing collaboration with clinical and other healthcare staff through each phase of the research. The specific objective of this paper is to present the methodological approach and outline the potential to produce generalisable results which could be applied to other health-related issues. PMID:29642646
Multi-Disciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song
1997-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
Science and Television Commercials: Adding Relevance to the Research Methodology Course.
ERIC Educational Resources Information Center
Solomon, Paul R.
1979-01-01
Contends that research methodology courses can be relevant to issues outside of psychology and describes a method which relates the course to consumer problems. Students use experimental methodology to test claims made in television commercials advertising deodorant, bathroom tissues, and soft drinks. (KC)
NASA Technical Reports Server (NTRS)
Smalley, Kurt B.; Tinker, Michael L.; Fischer, Richard T.
2001-01-01
This paper provides an introduction and a set of guidelines for the use of a methodology for NASTRAN eigenvalue modeling of thin film inflatable structures. It is hoped that this paper will spare the reader the problems and headaches the authors confronted during their investigation by presenting not only an introduction and verification of the methodology, but also a discussion of the problems that this methodology can entail. Our goal in this investigation was to verify the basic methodology through the creation and correlation of a simple model. An overview of thin film structures, their history, and their applications is given. Previous modeling work is then briefly discussed. An introduction is then given for the method of modeling. The specific mechanics of the method are then discussed in parallel with a basic discussion of NASTRAN's implementation of these mechanics. The problems encountered with the method are then given, along with suggested workarounds. The methodology is verified through the correlation between an analytical model and modal test results of a thin film strut. Recommendations are given for the needed advancement of our understanding of this method and our ability to accurately model thin film structures. Finally, conclusions are drawn regarding the usefulness of the methodology.
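At its core, the eigenvalue solution referred to above solves the generalized problem K*phi = lambda*M*phi. The sketch below does this for a toy 3-DOF spring-mass model, with an assumed geometric-stiffness term standing in for the membrane prestress that complicates thin-film models; all matrix values are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 3-DOF spring-mass model of a strut: K x = lam M x
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]]) * 1e4      # elastic stiffness, N/m
Kg = np.eye(3) * 2e3                          # assumed geometric (prestress) stiffness
M = np.diag([1.2, 1.2, 0.6])                  # lumped masses, kg

lam, phi = eigh(K + Kg, M)                    # generalized symmetric eigenproblem
freqs_hz = np.sqrt(lam) / (2 * np.pi)         # lam = omega^2
print("natural frequencies [Hz]:", np.round(freqs_hz, 2))
```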
Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika
2017-01-01
Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.
Development of an Aerothermoelastic-Acoustics Simulation Capability of Flight Vehicles
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Choi, S. B.; Ibrahim, A.
2010-01-01
A novel numerical, finite element based analysis methodology is presented in this paper, suitable for accurate and efficient simulation of practical, complex flight vehicles. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high speed flow, obtained from a heat conduction analysis, are incorporated in the modal analysis, which in turn affects the unsteady flow arising from the interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in detail, testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element, CFD-based multidisciplinary simulation analysis capability involving large scale computations.
Financial Support for the Humanities: A Special Methodological Report.
ERIC Educational Resources Information Center
Gomberg, Irene L.; Atelsek, Frank J.
Findings and methodological problems of a survey on financial support for humanities in higher education are discussed. Usable data were gathered from 351 of 671 Higher Education Panel member institutions. Two weighting methodologies were employed. The conventional method assumed that nonrespondents were similar to respondents, whereas a…
An automated methodology development. [software design for combat simulation]
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease program development efforts. An object-oriented approach was taken, featuring definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correspondence between object states and real world states. The simulation design process was automated by the Problem Statement Language (PSL)/Problem Statement Analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
Technology transfer methodology
NASA Technical Reports Server (NTRS)
Labotz, Rich
1991-01-01
Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
... parties to comment on these methodological issues described above. Request for Comment on Interim Industry... comments. \\15\\ Indicator: GNI per capita, Atlas Method (current US$) is obtained from http://data.worldbank... methodology, the Department has encountered a number of methodological and practical challenges that must be...
Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno
2018-05-28
Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load given that the imaged tissue is in a pre-stressed and -strained state. Neglect of this prestressed state into solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress) which in their turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, both having their inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single shot methodology to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, refrain a broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and porcine biventricular myocardium model. The computational power and efficiency of the methodology was shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
Goulart Coelho, Lineker M; Lange, Liséte C; Coelho, Hosmanny Mg
2017-01-01
Solid waste management is a complex domain involving the interaction of several dimensions; thus, its analysis and control impose continuous challenges for decision makers. In this context, multi-criteria decision-making models have become important and convenient supporting tools for solid waste management because they can handle problems involving multiple dimensions and conflicting criteria. However, the selection of the multi-criteria decision-making method is a hard task, since there are several multi-criteria decision-making approaches, each with a large number of variants whose applicability depends on information availability and the aim of the study. Therefore, to support researchers and decision makers, the objectives of this article are to present a literature review of multi-criteria decision-making applications used in solid waste management, offer a critical assessment of current practices, and provide suggestions for future work. A brief review of fundamental concepts on this topic is first provided, followed by the analysis of 260 articles related to the application of multi-criteria decision making in solid waste management. These studies were investigated in terms of methodology, including specific steps such as normalisation, weighting, and sensitivity analysis. In addition, information related to waste type, the study objective, and the aspects considered was recorded. From the articles analysed, it is noted that studies using multi-criteria decision making in solid waste management predominantly address problems related to municipal solid waste involving facility location or management strategy.
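As one concrete instance of the normalisation and weighting steps surveyed above, the sketch below implements TOPSIS, a common multi-criteria decision-making variant, on an invented three-alternative siting matrix. The criteria, weights, and benefit/cost directions are illustrative assumptions, not drawn from the review.

```python
import numpy as np

def topsis(X, w, benefit):
    """TOPSIS: rank alternatives (rows of X) on criteria (columns).
    w: criteria weights summing to 1; benefit[j] True if higher is better."""
    R = X / np.linalg.norm(X, axis=0)        # vector normalisation
    V = R * w                                # weighting
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - worst, axis=1)
    return d_neg / (d_pos + d_neg)           # closeness: higher is better

# Illustrative siting problem: cost, distance to housing, capacity
X = np.array([[120.0, 2.5, 800.0],
              [ 90.0, 1.0, 600.0],
              [150.0, 4.0, 900.0]])
w = np.array([0.4, 0.35, 0.25])
scores = topsis(X, w, benefit=np.array([False, True, True]))
print("alternative ranking (best first):", np.argsort(-scores))
```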
Varieties of second modernity: the cosmopolitan turn in social and political theory and research.
Beck, Ulrich; Grande, Edgar
2010-09-01
The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally, we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization.
A Bayesian approach to truncated data sets: An application to Malmquist bias in Supernova Cosmology
NASA Astrophysics Data System (ADS)
March, Marisa Cristina
2018-01-01
A problem commonly encountered in statistical analysis of data is that of truncated data sets. A truncated data set is one in which a number of data points are completely missing from a sample; this is in contrast to a censored sample, in which partial information is missing from some data points. In astrophysics this problem is commonly seen in a magnitude-limited survey, such that the survey is incomplete at fainter magnitudes; that is, certain faint objects are simply not observed. The effect of this 'missing data' is manifested as Malmquist bias and can result in biases in parameter inference if it is not accounted for. In Frequentist methodologies the Malmquist bias is often corrected for by analysing many simulations and computing the appropriate correction factors. One problem with this methodology is that the corrections are model dependent. In this poster we derive a Bayesian methodology for accounting for truncated data sets in problems of parameter inference and model selection. We first show the methodology for a simple Gaussian linear model and then go on to show the method for accounting for a truncated data set in the case of cosmological parameter inference with a magnitude-limited supernova Ia survey.
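The key step of such a Bayesian treatment can be sketched simply: for a magnitude-limited sample, each datum's Gaussian likelihood is renormalized by the selection probability, here the normal CDF at the limiting magnitude. The numbers below are invented, a flat prior on the mean is assumed, and this sketch is not the poster's full derivation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu_true, sigma, m_lim = 24.0, 0.5, 24.2      # illustrative SN-like magnitudes

# Magnitude-limited (truncated) sample: only m < m_lim is observed
m = rng.normal(mu_true, sigma, 5000)
m = m[m < m_lim]

mu_grid = np.linspace(23.5, 24.5, 400)

def log_like(mu, truncated):
    ll = norm.logpdf(m[:, None], mu, sigma).sum(axis=0)
    if truncated:                 # renormalize by the selection probability
        ll -= len(m) * norm.logcdf(m_lim, mu, sigma)
    return ll

for label, trunc in [("naive", False), ("truncation-corrected", True)]:
    ll = log_like(mu_grid, trunc)
    post = np.exp(ll - ll.max())              # flat-prior grid posterior
    print(label, "posterior mean:", np.sum(mu_grid * post) / post.sum())
```

The naive posterior mean lands well below the true value (Malmquist bias), while the corrected version recovers it.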
The Davey-Stewartson Equation on the Half-Plane
NASA Astrophysics Data System (ADS)
Fokas, A. S.
2009-08-01
The Davey-Stewartson (DS) equation is a nonlinear integrable evolution equation in two spatial dimensions. It provides a multidimensional generalisation of the celebrated nonlinear Schrödinger (NLS) equation and it appears in several physical situations. The implementation of the Inverse Scattering Transform (IST) to the solution of the initial-value problem of the NLS was presented in 1972, whereas the analogous problem for the DS equation was solved in 1983. These results are based on the formulation and solution of certain classical problems in complex analysis, namely of a Riemann Hilbert problem (RH) and of either a d-bar or a non-local RH problem respectively. A method for solving the mathematically more complicated but physically more relevant case of boundary-value problems for evolution equations in one spatial dimension, like the NLS, was finally presented in 1997, after interjecting several novel ideas to the panoply of the IST methodology. Here, this method is further extended so that it can be applied to evolution equations in two spatial dimensions, like the DS equation. This novel extension involves several new steps, including the formulation of a d-bar problem for a sectionally non-analytic function, i.e. for a function which has different non-analytic representations in different domains of the complex plane. This, in addition to the computation of a d-bar derivative, also requires the computation of the relevant jumps across the different domains. This latter step has certain similarities (but is more complicated) with the corresponding step for those initial-value problems in two dimensions which can be solved via a non-local RH problem, like KPI.
A predictive machine learning approach for microstructure optimization and materials design
Liu, Ruoqian; Kumar, Abhishek; Chen, Zhengzhang; ...
2015-06-23
This paper addresses an important materials engineering question: How can one identify the complete space (or as much of it as possible) of microstructures that are theoretically predicted to yield the desired combination of properties demanded by a selected application? We present a problem involving design of magnetoelastic Fe-Ga alloy microstructure for enhanced elastic, plastic and magnetostrictive properties. While theoretical models for computing properties given the microstructure are known for this alloy, inversion of these relationships to obtain microstructures that lead to desired properties is challenging, primarily due to the high dimensionality of microstructure space, multi-objective design requirement and non-uniqueness of solutions. These challenges render traditional search-based optimization methods incompetent in terms of both searching efficiency and result optimality. In this paper, a route to address these challenges using a machine learning methodology is proposed. A systematic framework consisting of random data generation, feature selection and classification algorithms is developed. In conclusion, experiments with five design problems that involve identification of microstructures that satisfy both linear and nonlinear property constraints show that our framework outperforms traditional optimization methods with the average running time reduced by as much as 80% and with optimality that would not be achieved otherwise.
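A minimal sketch of the framework's loop follows, with a stand-in property model in place of the paper's theoretical models: generate random microstructure descriptors, label them, train a classifier as a cheap surrogate, and screen a large candidate pool. The dimensions, thresholds, and classifier choice are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

def meets_targets(x):
    """Stand-in for the theoretical property models (elastic, plastic,
    magnetostrictive): labels a microstructure descriptor as acceptable."""
    return (x[:, 0] + 0.5 * x[:, 1] > 0.8) & (x[:, 2] < 0.6)

# 1. Random data generation in a 5-D microstructure descriptor space
X_train = rng.random((2000, 5))
y_train = meets_targets(X_train)

# 2. Train a classifier as a cheap surrogate of the property models
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# 3. Screen a large candidate pool; keep predicted-feasible designs
X_pool = rng.random((100_000, 5))
feasible = X_pool[clf.predict(X_pool)]
print(f"{len(feasible)} candidate microstructures predicted feasible")
```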
NASA Astrophysics Data System (ADS)
Fulkerson, David E.
2010-02-01
This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new and simple methodology handles the problem with ease by simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
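The paper's SPICE methodology is not reproduced here, but ion strikes are commonly emulated in circuit simulation by a double-exponential current source. The sketch below generates such a pulse with assumed charge and time constants and checks that its time integral equals the collected charge.

```python
import numpy as np

def strike_current(t, q=50e-15, tau_r=10e-12, tau_f=200e-12):
    """Double-exponential ion-strike current pulse commonly used to
    model single-event transients. q is the collected charge."""
    i0 = q / (tau_f - tau_r)                 # normalizes the integral to q
    return i0 * (np.exp(-t / tau_f) - np.exp(-t / tau_r))

t = np.linspace(0, 2e-9, 2001)
i = strike_current(t)
charge = np.sum(0.5 * (i[1:] + i[:-1]) * np.diff(t))   # trapezoid rule
print(f"peak current {i.max()*1e6:.1f} uA, collected charge {charge*1e15:.1f} fC")
```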
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
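The probabilistic flavor of such an evaluation can be caricatured with a tiny Monte Carlo sketch: each enabling technology has an assumed slip probability and delay distribution, and the DDT&E start waits for the latest-finishing technology. All numbers below are invented, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

# Illustrative: three enabling technologies, each with a probability of
# slipping and a triangular delay distribution (months) if it slips
techs = [(0.3, (2, 6, 18)), (0.2, (1, 4, 12)), (0.5, (3, 9, 24))]

total_delay = np.zeros(N)
for p_slip, (lo, mode, hi) in techs:
    slips = rng.random(N) < p_slip
    delay = rng.triangular(lo, mode, hi, N)
    # DDT&E start waits for the latest-finishing technology
    total_delay = np.maximum(total_delay, np.where(slips, delay, 0.0))

print(f"P(DDT&E start delayed): {(total_delay > 0).mean():.2f}")
print(f"mean delay given a slip: {total_delay[total_delay > 0].mean():.1f} months")
```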
Methodological Problems of Soviet Pedagogy
ERIC Educational Resources Information Center
Noah, Harold J., Ed.; Beach, Beatrice S., Ed.
1974-01-01
Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)
Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?
ERIC Educational Resources Information Center
Brondani, Mario; He, Sarah
2013-01-01
Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…
Charan, J; Saxena, D
2014-01-01
Biased negative studies not only reflect poor research effort but also have an impact on 'patient care' as they prevent further research with similar objectives, leading to potential research areas remaining unexplored. Hence, published 'negative studies' should be methodologically strong. All parameters that may help a reader to judge validity of results and conclusions should be reported in published negative studies. There is a paucity of data on reporting of statistical and methodological parameters in negative studies published in Indian Medical Journals. The present systematic review was designed with an aim to critically evaluate negative studies published in prominent Indian Medical Journals for reporting of statistical and methodological parameters. Systematic review. All negative studies published in 15 Science Citation Indexed (SCI) medical journals published from India were included in present study. Investigators involved in the study evaluated all negative studies for the reporting of various parameters. Primary endpoints were reporting of "power" and "confidence interval." Power was reported in 11.8% studies. Confidence interval was reported in 15.7% studies. Majority of parameters like sample size calculation (13.2%), type of sampling method (50.8%), name of statistical tests (49.1%), adjustment of multiple endpoints (1%), post hoc power calculation (2.1%) were reported poorly. Frequency of reporting was more in clinical trials as compared to other study designs and in journals having impact factor more than 1 as compared to journals having impact factor less than 1. Negative studies published in prominent Indian medical journals do not report statistical and methodological parameters adequately and this may create problems in the critical appraisal of findings reported in these journals by its readers.
Quality Assurance of UMLS Semantic Type Assignments Using SNOMED CT Hierarchies.
Gu, H; Chen, Y; He, Z; Halper, M; Chen, L
2016-01-01
The Unified Medical Language System (UMLS) is one of the largest biomedical terminological systems, with over 2.5 million concepts in its Metathesaurus repository. The UMLS's Semantic Network (SN), with its collection of 133 high-level semantic types, serves as an abstraction layer on top of the Metathesaurus. In particular, the SN elaborates an aspect of the Metathesaurus's concepts via the assignment of one or more types to each concept. Due to the scope and complexity of the Metathesaurus, errors are all but inevitable in this semantic-type assignment process. The objective of this work was to develop a semi-automated methodology to help assure the quality of semantic-type assignments within the UMLS. The methodology uses a cross-validation strategy involving SNOMED CT's hierarchies in combination with UMLS semantic types. Semantically uniform, disjoint concept groups are generated programmatically by partitioning the collection of all concepts in the same SNOMED CT hierarchy according to their respective semantic-type assignments in the UMLS. Domain experts are then called upon to review the concepts in any group having a small number of concepts. It is our hypothesis that a semantic-type assignment combination applicable only to a very small number of concepts in a SNOMED CT hierarchy is an indicator of potential problems. The methodology was applied to the UMLS 2013AA release along with the SNOMED CT release from January 2013. An overall error rate of 33% was found for concepts flagged by the quality-assurance methodology. Supporting our hypothesis, that rate was four times higher than the error rate found in control samples. The results show that the quality-assurance methodology can aid in the effective and efficient identification of UMLS semantic-type assignment errors.
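The partitioning step lends itself to a direct sketch: group the concepts of a SNOMED CT hierarchy by their UMLS semantic-type combination and flag combinations with few members for expert review. The records and the review threshold below are invented for illustration (the deliberately anomalous "Aspirin" entry is the kind of outlier the method surfaces).

```python
from collections import defaultdict

# Illustrative records: (concept, SNOMED CT hierarchy, UMLS semantic types)
concepts = [
    ("Appendicitis", "Clinical finding", frozenset({"Disease or Syndrome"})),
    ("Fever",        "Clinical finding", frozenset({"Sign or Symptom"})),
    ("Hypertension", "Clinical finding", frozenset({"Disease or Syndrome"})),
    ("Aspirin",      "Clinical finding", frozenset({"Pharmacologic Substance"})),
]

REVIEW_THRESHOLD = 2   # groups at or below this size go to domain experts

groups = defaultdict(list)
for name, hierarchy, stys in concepts:
    groups[(hierarchy, stys)].append(name)   # partition by (hierarchy, type set)

for (hierarchy, stys), members in groups.items():
    if len(members) <= REVIEW_THRESHOLD:
        print(f"review {sorted(stys)} in '{hierarchy}': {members}")
```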
Inverse problems in quantum chemistry
NASA Astrophysics Data System (ADS)
Karwowski, Jacek
Inverse problems constitute a branch of applied mathematics with well-developed methodology and formalism. A broad family of tasks met in theoretical physics, in civil and mechanical engineering, as well as in various branches of medical and biological sciences has been formulated as specific implementations of the general theory of inverse problems. In this article, it is pointed out that a number of approaches met in quantum chemistry can (and should) be classified as inverse problems. Consequently, the methodology used in these approaches may be enriched by applying ideas and theorems developed within the general field of inverse problems. Several examples, including the RKR method for the construction of potential energy curves, determining parameter values in semiempirical methods, and finding external potentials for which the pertinent Schrödinger equation is exactly solvable, are discussed in detail.
Methodological Behaviorism from the Standpoint of a Radical Behaviorist.
Moore, J
2013-01-01
Methodological behaviorism is the name for a prescriptive orientation to psychological science. Its first and original feature is that the terms and concepts deployed in psychological theories and explanations should be based on observable stimuli and behavior. I argue that the interpretation of the phrase "based on" has changed over the years because of the influence of operationism. Its second feature, which developed after the first and is prominent in contemporary psychology, is that research should emphasize formal testing of a theory that involves mediating theoretical entities from a nonbehavioral dimension according to the hypothetico-deductive method. I argue that for contemporary methodological behaviorism, explanations of the behavior of both participants and scientists appeal to the mediating entities as mental causes, if only indirectly. In contrast to methodological behaviorism is the radical behaviorism of B. F. Skinner. Unlike methodological behaviorism, radical behaviorism conceives of verbal behavior in terms of an operant process that involves antecedent circumstances and reinforcing consequences, rather than in terms of a nonbehavioral process that involves reference and symbolism. In addition, radical behaviorism recognizes private behavioral events and subscribes to research and explanatory practices that do not include testing hypotheses about supposed mediating entities from another dimension. I conclude that methodological behaviorism is actually closer to mentalism than to Skinner's radical behaviorism.
Montecinos, P; Rodewald, A M
1994-06-01
The aim of this work was to assess and compare the achievements of medical students subjected to a problem-based learning methodology. The information and comprehension categories of Bloom were tested in 17 medical students on four different occasions during the physiopathology course, using a multiple choice knowledge test. There was a significant improvement in the number of correct answers towards the end of the course. It is concluded that these medical students obtained adequate learning achievements in the information subcategory of Bloom using the problem-based learning methodology during the physiopathology course.
Borodulin, V I; Gliantsev, S P
2017-07-01
The article considers certain key methodological aspects of the problem of the scientific clinical school in national medicine. These aspects concern the notion of a school, its profile, questions of teachers, teachings, and followers, subsidiary schools, and the ethical component of a scientific school. The article is polemical, so the reader will find no definitive answers to the questions raised; instead, the reader is invited to ponder the answers independently, weighing examples pro and contra. The conclusion is drawn that scientific schools in other areas of medicine need to be studied and that the problem requires further elaboration.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.
ERIC Educational Resources Information Center
Riazi, A. Mehdi
2016-01-01
Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…
Active Methodologies in a Queueing Systems Course for Telecommunication Engineering Studies
ERIC Educational Resources Information Center
Garcia, J.; Hernandez, A.
2010-01-01
This paper presents the results of a one-year experiment in incorporating active methodologies in a Queueing Systems course as part of the Telecommunication Engineering degree at the University of Zaragoza, Spain, during the period of adaptation to the European Higher Education Area. A problem-based learning methodology has been introduced, and…
NASA Technical Reports Server (NTRS)
Newman, James C., III
1995-01-01
The limiting factor in simulating flows past realistic configurations of interest has been the discretization of the physical domain on which the governing equations of fluid flow may be solved. In an attempt to circumvent this problem, many Computational Fluid Dynamic (CFD) methodologies that are based on different grid generation and domain decomposition techniques have been developed. However, due to the costs involved and expertise required, very few comparative studies between these methods have been performed. In the present work, the two CFD methodologies which show the most promise for treating complex three-dimensional configurations as well as unsteady moving boundary problems are evaluated. These are the structured-overlapped and the unstructured grid schemes. Both methods use a cell centered, finite volume, upwind approach. The structured-overlapped algorithm uses an approximately factored, alternating direction implicit scheme to perform the time integration, whereas the unstructured algorithm uses an explicit Runge-Kutta method. To examine the accuracy, efficiency, and limitations of each scheme, they are applied to the same steady complex multicomponent configurations and unsteady moving boundary problems. The steady complex cases consist of computing the subsonic flow about a two-dimensional high-lift multielement airfoil and the transonic flow about a three-dimensional wing/pylon/finned store assembly. The unsteady moving boundary problems are a forced pitching oscillation of an airfoil in a transonic freestream and a two-dimensional, subsonic airfoil/store separation sequence. Accuracy was assessed through the comparison of computed and experimentally measured pressure coefficient data on several of the wing/pylon/finned store assembly's components and at numerous angles of attack for the pitching airfoil. From this study, it was found that both the structured-overlapped and the unstructured grid schemes yielded flow solutions of comparable accuracy for these simulations. This study also indicated that, overall, the structured-overlapped scheme was slightly more CPU efficient than the unstructured approach.
Towards Methodologies for Building Knowledge-Based Instructional Systems.
ERIC Educational Resources Information Center
Duchastel, Philippe
1992-01-01
Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…
Hierarchical Strategy for Rapid Analysis Environment
NASA Technical Reports Server (NTRS)
Whitcomb, John
2003-01-01
A new philosophy is developed wherein the hierarchical definition of data is used to create a better environment for conducting analyses of practical problems. This system can be adapted to conduct virtually any type of analysis, since the philosophy is not bound to any specific kind of analysis. It provides a framework to manage different models and their results and, more importantly, the interaction between the different models. Thus, it is ideal for many types of finite element analyses, like global/local analysis and those that involve multiple scales and fields. The system developed during the course of this work is just a demonstrator of the basic concepts. A complete implementation of this strategy could potentially make a major impact on the way analyses are conducted. It could considerably reduce the time frame required to conduct the analysis of real-life problems by efficient management of the data involved and by reducing the human effort required. It also supports better decision making by offering more ways to interpret the results. The strategy has currently been implemented for structural analysis, but with more work it could be extended to other fields of science where the finite element method is used to solve the differential equations numerically. This report details the work that has been done during the course of this project and its achievements and results. The following section discusses the meaning of the word hierarchical and the different references to the term in the literature. It talks about the development of the finite element method, its different versions, and how hierarchy has been used to improve the methodology. The next section describes the hierarchical philosophy in detail and explains the different concepts and terms associated with it. It goes on to describe the implementation and the features of the demonstrator. A couple of problems are analyzed using the demonstrator program to show the working of the system. The two problems considered are two-dimensional plane-stress analysis problems. The results are compared with those obtained using conventional analysis. The different challenges faced during the development of this system are discussed. Finally, we conclude with suggestions for future work to add more features and extend it to a wider range of problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Xiangqi; Zhang, Yingchen
This paper presents an optimal voltage control methodology with coordination among different voltage-regulating resources, including controllable loads, distributed energy resources such as energy storage and photovoltaics (PV), and utility voltage-regulating devices such as voltage regulators and capacitors. The proposed methodology could effectively tackle the overvoltage and voltage-regulation-device distortion problems brought about by high penetrations of PV, improving grid operation reliability. A voltage-load sensitivity matrix and a voltage-regulator sensitivity matrix are used to deploy the resources along the feeder to achieve the control objectives. Mixed-integer nonlinear programming is used to solve the formulated optimization control problem. The methodology has been tested on the IEEE 123-feeder test system, and the results demonstrate that the proposed approach could actively tackle the voltage problems brought about by high penetrations of PV and improve the reliability of distribution system operation.
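For orientation, the two sensitivity matrices mentioned above typically enter a linearized control formulation of roughly this shape (generic notation, not the paper's):

\[
\Delta V \approx S_{VP}\,\Delta P + S_{VR}\,\Delta u,
\qquad
\min_{\Delta P,\;\Delta u}\ \lVert V + \Delta V - V^{\mathrm{ref}} \rVert^{2}
\quad \text{s.t.} \quad V^{\min} \le V + \Delta V \le V^{\max},
\]

where \(\Delta P\) collects controllable load and DER adjustments, \(\Delta u\) collects discrete regulator tap and capacitor actions (the source of the integer variables), \(S_{VP}\) is the voltage-load sensitivity matrix, and \(S_{VR}\) is the voltage-regulator sensitivity matrix.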
Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness
NASA Astrophysics Data System (ADS)
Kaushik, Anshul; Ramani, Anand
2014-04-01
Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.
Intelligence for education: as described by Piaget and measured by psychometrics.
Shayer, Michael
2008-03-01
Two separate paths to the concept of intelligence are discussed: the psychometric path, concerned with the measurement of intelligence and involving the methodology of norm-referenced testing; and the path followed by Piaget and others, which addresses from the start the related question of how intelligence can be described and employs a criterion-referenced methodology. The achievements of psychometrics are briefly described, with the argument that they remain important tools of what Kuhn called 'normal science'. The criterion-referenced approach of Piaget and others is described, with evidence from intervention studies that the Genevan descriptions of children-in-action have allowed the choice of contexts within which children can profitably be challenged to go further in their thinking. Hence, Genevan psychology is also now part of normal science with important uses, shown both in neo-Piagetian studies and in further research stemming from Geneva. Discussion of the 'Flynn effect' sheds light on both paths, with problems still unresolved. The argument is then developed that the relevance of neuroscience needs to be discussed to decide in what ways it may provide useful insights into intelligence.
Nuclear medicine and imaging research (quantitative studies in radiopharmaceutical science)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1990-09-01
This is a report of progress in Year Two (January 1, 1990--December 31, 1990) of Grant FG02-86ER60438, "Quantitative Studies in Radiopharmaceutical Science," awarded for the three-year period January 1, 1989--December 31, 1991 as a competitive renewal following a site visit in the fall of 1988. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 25 refs., 13 figs., 1 tab.
Shreeve, Michael W
2008-01-01
In a chiropractic college that utilizes a hybrid curriculum model composed of adult-based learning strategies along with traditional lecture-based course delivery, a literature search for educational delivery methods that would integrate the affective domain and the cognitive domain of learning provided some insights into the use of problem-based learning (PBL), experiential learning theory (ELT), and the emerging use of appreciative inquiry (AI) to enhance the learning experience. The purpose of this literature review is to provide a brief overview of key components of PBL, ELT, and AI in educational methodology and to discuss how these might be used within the chiropractic curriculum to supplement traditional didactic lecture courses. A growing body of literature describes the use of PBL and ELT in educational settings across many disciplines, both at the undergraduate and graduate levels. The use of appreciative inquiry as an instructional methodology presents a new area for exploration and study in the academic environment. Educational research in the chiropractic classroom incorporating ELT and appreciative inquiry might provide some valuable insights for future curriculum development.
NASA Astrophysics Data System (ADS)
Felipe-Sesé, Luis; López-Alba, Elías; Siegmann, Philip; Díaz, Francisco A.
2016-12-01
A low-cost approach for three-dimensional (3-D) full-field displacement measurement is applied to the analysis of large displacements involved in two different mechanical events. The method is based on a combination of fringe projection and two-dimensional digital image correlation (DIC) techniques. The two techniques have been employed simultaneously using an RGB camera and a color encoding method; therefore, it is possible to measure in-plane and out-of-plane displacements at the same time with only one camera, even at high speed rates. The potential of the proposed methodology is demonstrated through the analysis of large displacements during contact experiments on a soft material block. Displacement results have been successfully compared with those obtained using a 3D-DIC commercial system. Moreover, the analysis of displacements during an impact test on a metal plate was performed to emphasize the applicability of the methodology to dynamic events. Results show a good level of agreement, highlighting the potential of FP + 2D DIC as a low-cost alternative for the analysis of large-deformation problems.
Building an adaptive agent to monitor and repair the electrical power system of an orbital satellite
NASA Technical Reports Server (NTRS)
Tecuci, Gheorghe; Hieb, Michael R.; Dybala, Tomasz
1995-01-01
Over several years we have developed a multistrategy apprenticeship learning methodology for building knowledge-based systems. Recently we have developed and applied our methodology to building intelligent agents. This methodology allows a subject matter expert to build an agent in the same way in which the expert would teach a human apprentice. The expert will give the agent specific examples of problems and solutions, explanations of these solutions, or supervise the agent as it solves new problems. During such interactions, the agent learns general rules and concepts, continuously extending and improving its knowledge base. In this paper we present initial results on applying this methodology to build an intelligent adaptive agent for monitoring and repair of the electrical power system of an orbital satellite, stressing the interaction with the expert during apprenticeship learning.
Explorers of the Universe: Metacognitive Tools for Learning Science Concepts
NASA Technical Reports Server (NTRS)
Alvarez, Marino C.
1998-01-01
Much of school learning consists of rote memorization of facts with little emphasis on meaningful interpretations. Knowledge construction is reduced to factual knowledge production with little regard for critical thinking, problem solving, or clarifying misconceptions. An important role of a middle and secondary teacher when teaching science is to aid students' ability to reflect upon what they know about a given topic and to make available strategies that will enhance their understanding of text and science experiments. Developing metacognition, the ability to monitor one's own knowledge about a topic of study and to activate appropriate strategies, enhances students' learning when faced with reading, writing, and problem-solving situations. Two instructional strategies that can involve students in developing metacognitive awareness are hierarchical concept mapping and Vee diagrams. Concept maps enable students to organize their ideas and reveal these ideas visually to others. A Vee diagram is a structured visual means of relating the methodological aspects of an activity to its underlying conceptual aspect in ways that aid learners in meaningful understanding of scientific investigations.
Method for Assessing Risk of Road Accidents in Transportation of School Children
NASA Astrophysics Data System (ADS)
Pogotovkina, N. S.; Volodkin, P. P.; Demakhina, E. S.
2017-11-01
The relevance of the problem under investigation lies in the persistently high accident rate involving vehicles carrying groups of children, including school buses, in the Russian Federation over a period of several years. The article is aimed at identifying new approaches to improving the safety of schoolchildren transportation in accordance with the Concept of children transportation by buses and the plan for its implementation. The leading approach to the problem under consideration is the prediction of accidents in schoolchildren transportation. The article presents the results of a five-year analysis of accident rates involving school buses in the Russian Federation. In addition, a system to monitor the transportation of schoolchildren is proposed; the system will allow analyzing and forecasting traffic accidents involving buses carrying groups of children, including school buses. The article also presents a methodology for assessing the risk of road accidents during the transportation of schoolchildren.
Saravana Kumar, Gurunathan; George, Subin Philip
2017-02-01
This work proposes a methodology involving stiffness optimization for subject-specific cementless hip implant design, based on finite element analysis, for reducing the stress-shielding effect. To assess the change in the stress-strain state of the femur and the resulting stress-shielding effect due to insertion of the implant, a finite element analysis of the resected femur with the implant assembly is carried out for a clinically relevant loading condition. Selecting the von Mises stress as the criterion for discriminating regions for elastic modulus difference, a stiffness minimization method was employed by varying the elastic modulus distribution in the custom implant stem. The stiffness minimization problem is formulated as a material distribution problem without explicitly penalizing partial-volume elements. This formulation enables designs that could be fabricated using additive manufacturing to make porous implants with varying levels of porosity. The stress-shielding effect, measured as the difference between the von Mises stress in the intact and the implanted femur, decreased as the elastic modulus distribution was optimized.
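For reference, the von Mises criterion used above to discriminate regions is the standard equivalent stress (a textbook definition, not specific to this paper):

\[
\sigma_{\mathrm{vM}} = \sqrt{\tfrac{1}{2}\left[(\sigma_{1}-\sigma_{2})^{2} + (\sigma_{2}-\sigma_{3})^{2} + (\sigma_{3}-\sigma_{1})^{2}\right]},
\]

where \(\sigma_{1}, \sigma_{2}, \sigma_{3}\) are the principal stresses; stress shielding is then quantified as the pointwise difference in \(\sigma_{\mathrm{vM}}\) between the intact and the implanted femur.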
Inverse problems in heterogeneous and fractured media using peridynamics
Turner, Daniel Z.; van Bloemen Waanders, Bart G.; Parks, Michael L.
2015-12-10
The following work presents an adjoint-based methodology for solving inverse problems in heterogeneous and fractured media using state-based peridynamics. We show that the inner product involving the peridynamic operators is self-adjoint. The proposed method is illustrated for several numerical examples with constant and spatially varying material parameters as well as in the context of fractures. We also present a framework for obtaining material parameters by integrating digital image correlation (DIC) with inverse analysis. This framework is demonstrated by evaluating the bulk and shear moduli for a sample of nuclear graphite using digital photographs taken during the experiment. The resulting measured values correspond well with other results reported in the literature. Lastly, we show that this framework can be used to determine the load state given observed measurements of a crack opening. Furthermore, this type of analysis has many applications in characterizing subsurface stress-state conditions given fracture patterns in cores of geologic material.
Adjoint-Based Methodology for Time-Dependent Optimization
NASA Technical Reports Server (NTRS)
Yamaleev, N. K.; Diskin, B.; Nielsen, E. J.
2008-01-01
This paper presents a discrete adjoint method for a broad class of time-dependent optimization problems. The time-dependent adjoint equations are derived in terms of the discrete residual of an arbitrary finite volume scheme which approximates unsteady conservation law equations. Although only the 2-D unsteady Euler equations are considered in the present analysis, this time-dependent adjoint method is applicable to the 3-D unsteady Reynolds-averaged Navier-Stokes equations with minor modifications. The discrete adjoint operators involving the derivatives of the discrete residual and the cost functional with respect to the flow variables are computed using a complex-variable approach, which provides discrete consistency and drastically reduces the implementation and debugging cycle. The implementation of the time-dependent adjoint method is validated by comparing the sensitivity derivative with that obtained by forward mode differentiation. Our numerical results show that O(10) optimization iterations of the steepest descent method are needed to reduce the objective functional by 3-6 orders of magnitude for test problems considered.
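The complex-variable approach mentioned above is commonly realized as complex-step differentiation; a minimal self-contained sketch follows, with a toy function standing in for the discrete residual (the function and step size are illustrative assumptions).

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-30):
    """Complex-step differentiation: df/dx ~ Im(f(x + i*h)) / h.
    Unlike finite differences there is no subtractive cancellation,
    so h can be made tiny and the result is accurate to machine precision."""
    return np.imag(f(x + 1j * h)) / h

# Toy scalar function standing in for the discrete residual; the paper
# differentiates the residual and cost functional w.r.t. flow variables.
f = lambda x: np.exp(x) * np.sin(x)
print(complex_step_derivative(f, 1.5))            # complex-step value
print(np.exp(1.5) * (np.sin(1.5) + np.cos(1.5)))  # analytic check
```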
A framework for multi-stakeholder decision-making and ...
We propose a decision-making framework to compute compromise solutions that balance conflicting priorities of multiple stakeholders on multiple objectives. In our setting, we shape the stakeholder dissatisfaction distribution by solving a conditional-value-at-risk (CVaR) minimization problem. The CVaR problem is parameterized by a probability level that shapes the tail of the dissatisfaction distribution. The proposed approach allows us to compute a family of compromise solutions and generalizes multi-stakeholder settings previously proposed in the literature that minimize average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework that involve complex decision-making processes. We demonstrate the developments using a biowaste facility location case study in which we seek to balance stakeholder priorities on transportation, safety, water quality, and capital costs. This manuscript describes the methodology of a new decision-making framework that computes compromise solutions balancing conflicting priorities of multiple stakeholders on multiple objectives, as needed for the SHC Decision Science and Support Tools project. A biowaste facility location is employed as the case study.
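For orientation, the CVaR minimization referenced above is conventionally written in the Rockafellar-Uryasev form (generic notation, not necessarily the authors' exact formulation):

\[
\mathrm{CVaR}_{\beta}(X) = \min_{t \in \mathbb{R}} \left\{ t + \frac{1}{1-\beta}\,\mathbb{E}\left[(X - t)_{+}\right] \right\},
\qquad (z)_{+} := \max(z, 0),
\]

where \(X\) is the stakeholder dissatisfaction and \(\beta\) is the probability level that shapes the tail: \(\beta \to 0\) recovers the average-dissatisfaction problem and \(\beta \to 1\) the worst-case problem, consistent with the settings the framework is said to generalize.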
Applications of fuzzy theories to multi-objective system optimization
NASA Technical Reports Server (NTRS)
Rao, S. S.; Dhingra, A. K.
1991-01-01
Most of the computer aided design techniques developed so far deal with the optimization of a single objective function over the feasible design space. However, there often exist several engineering design problems which require a simultaneous consideration of several objective functions. This work presents several techniques of multiobjective optimization. In addition, a new formulation, based on fuzzy theories, is also introduced for the solution of multiobjective system optimization problems. The fuzzy formulation is useful in dealing with systems which are described imprecisely using fuzzy terms such as, 'sufficiently large', 'very strong', or 'satisfactory'. The proposed theory translates the imprecise linguistic statements and multiple objectives into equivalent crisp mathematical statements using fuzzy logic. The effectiveness of all the methodologies and theories presented is illustrated by formulating and solving two different engineering design problems. The first one involves the flight trajectory optimization and the main rotor design of helicopters. The second one is concerned with the integrated kinematic-dynamic synthesis of planar mechanisms. The use and effectiveness of nonlinear membership functions in fuzzy formulation is also demonstrated. The numerical results indicate that the fuzzy formulation could yield results which are qualitatively different from those provided by the crisp formulation. It is felt that the fuzzy formulation will handle real life design problems on a more rational basis.
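A minimal sketch of how such a fuzzy formulation can translate linguistic objectives into a crisp problem, using piecewise-linear membership functions and max-min (Bellman-Zadeh) aggregation; the bounds and the aggregation rule are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def mu_sufficiently_large(x, lo=80.0, hi=100.0):
    """Piecewise-linear membership for 'sufficiently large':
    0 below lo, 1 above hi, linear in between (illustrative bounds)."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

# Max-min aggregation over two fuzzy objectives: pick the design value
# that maximizes the smallest degree of satisfaction.
candidates = np.linspace(60.0, 120.0, 121)
mu1 = mu_sufficiently_large(candidates)                     # 'strength is sufficiently large'
mu2 = 1.0 - mu_sufficiently_large(candidates, 90.0, 130.0)  # 'weight stays satisfactory'
best = candidates[np.argmax(np.minimum(mu1, mu2))]
print(f"compromise design value: {best:.1f}")
```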
Calculating excess lifetime risk in relative risk models.
Vaeth, M; Pierce, D A
1990-07-01
When assessing the impact of radiation exposure it is common practice to present the final conclusions in terms of excess lifetime cancer risk in a population exposed to a given dose. The present investigation is mainly a methodological study focusing on some of the major issues and uncertainties involved in calculating such excess lifetime risks and related risk projection methods. The age-constant relative risk model used in the recent analyses of the cancer mortality that was observed in the follow-up of the cohort of A-bomb survivors in Hiroshima and Nagasaki is used to describe the effect of the exposure on the cancer mortality. In this type of model the excess relative risk is constant in age-at-risk, but depends on the age-at-exposure. Calculation of excess lifetime risks usually requires rather complicated life-table computations. In this paper we propose a simple approximation to the excess lifetime risk; the validity of the approximation for low levels of exposure is justified empirically as well as theoretically. This approximation provides important guidance in understanding the influence of the various factors involved in risk projections. Among the further topics considered are the influence of a latent period, the additional problems involved in calculations of site-specific excess lifetime cancer risks, the consequences of a leveling off or a plateau in the excess relative risk, and the uncertainties involved in transferring results from one population to another. The main part of this study relates to the situation with a single, instantaneous exposure, but a brief discussion is also given of the problem with a continuous exposure at a low-dose rate.
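For orientation, the age-constant relative risk structure described above can be written generically as

\[
\lambda(a \mid e, d) = \lambda_{0}(a)\,\bigl[1 + \beta(e)\,d\bigr],
\]

where \(\lambda_{0}(a)\) is the baseline cancer mortality rate at attained age \(a\), \(d\) is the dose, and the excess relative risk \(\beta(e)\,d\) is constant in age-at-risk but depends on the age-at-exposure \(e\); the excess lifetime risk then follows from life-table computations over \(\lambda\) (the notation is generic, not the authors' exact formula).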
Creativity and psychopathology: a systematic review.
Thys, Erik; Sabbe, Bernard; De Hert, Marc
2014-01-01
The possible link between creativity and psychopathology has been a long-time focus of research up to the present day. However, the research results in this field are heterogeneous and contradictory. Links between creativity and specific psychiatric disorders have been confirmed and refuted in different studies. This disparity is partly explained by the methodological challenges peculiar to this field. In this systematic review of the literature from 1950, research articles in the field of creativity and psychopathology are presented, focusing on the methodology and results of the collected studies. This review confirms the methodological problems and the heterogeneity of the study designs and results. The assessment of psychopathology, but more so of creativity, remains a fundamental challenge. On the whole, study results cautiously confirm an association between creativity and both bipolar disorder and schizotypy. The research on creativity and psychopathology is hampered by serious methodological problems. Study results are to be interpreted with caution and future research needs more methodological rigor. © 2014 S. Karger AG, Basel.
Informal schooling and problem-solving skills in second-grade science: A naturalistic investigation
NASA Astrophysics Data System (ADS)
Griffin, Georgia Inez Hunt
The influence of informal schooling on the problem-solving skills of urban elementary school children is unclear. The relationship between culture and problem solving can be studied using subjective methodologies, particularly when investigating problem-solving strategies that are culturally situated. Yet little research has investigated how the informal learning of African American children is integrated into the problem solving they use in school. This study was designed to expand the existing literature in this area. Its purpose is to explore how 15 African American children attending school in Southwest Philadelphia solve problems presented to them in second-grade science. This was accomplished by assessing their ability to observe, classify, recall, and perceive space/time relationships. Think-aloud protocols were used for this examination. A naturalistic approach to the investigation was implemented. Individual children were selected because they exhibited unique and subjective characteristics associated with individual approaches to problem solving. Children responded to three tasks: interviews of their parents, an essay on community gardens, and a collaboratively designed group diorama. Content analysis was used to infer themes that were evident in the children's work and that revealed the extent to which informal schooling influenced solutions to a community garden problem. The investigation increased the researcher's ability to understand and build upon the understanding of African American children in their indigenous community. The study also demonstrated how these same strategies can be used to involve parents in the science curriculum.
Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map
ERIC Educational Resources Information Center
Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng
2004-01-01
This paper proposes a methodology to calculate both the difficulty of basic problems and the difficulty of solving a problem. The method for calculating problem difficulty follows the process of constructing a problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…
HESS Opinions "Integration of groundwater and surface water research: an interdisciplinary problem?"
NASA Astrophysics Data System (ADS)
Barthel, R.
2014-07-01
Today there is a great consensus that water resource research needs to become more holistic, integrating perspectives of a large variety of disciplines. Groundwater and surface water (hereafter: GW and SW) are typically identified as different compartments of the hydrological cycle and were traditionally often studied and managed separately. However, despite this separation, these respective fields of study are usually not considered to be different disciplines. They are often seen as different specializations of hydrology with a different focus yet similar theory, concepts, and methodology. The present article discusses how this notion may form a substantial obstacle to the further integration of GW and SW research and management. The article focuses on the regional scale (areas of approximately 10³ to 10⁶ km²), which is identified as the scale where integration is most greatly needed, but ironically where the least amount of fully integrated research seems to be undertaken. The state of research on integrating GW and SW research is briefly reviewed, and the most essential differences between GW hydrology (or hydrogeology, geohydrology) and SW hydrology are presented. Groundwater recharge and baseflow are used as examples to illustrate different perspectives on similar phenomena that can cause severe misunderstandings and errors in the conceptualization of integration schemes. The fact that integration of GW and SW research on the regional scale must necessarily move beyond the hydrological aspects, by collaborating with the social sciences and increasing the interaction between science and society in general, is also discussed. The typical elements of an ideal interdisciplinary workflow are presented and their relevance with respect to the integration of GW and SW is discussed. The overall conclusions are that GW hydrology and SW hydrology study rather different objects of interest, use different types of observation, and work on different problem settings. They have thus developed different theory, methodology, and terminology. However, there seems to be a widespread lack of awareness of these differences, which hinders the detection of the existing interdisciplinary aspects of GW and SW integration and consequently the development of a truly unifying interdisciplinary theory and methodology. Thus, despite having the ultimate goal of creating a more holistic approach, we may have to start integration by analyzing potential disciplinary differences. Improved understanding among hydrologists of what 'interdisciplinary' means and how it works is needed. Hydrologists, despite frequently being involved in multidisciplinary projects, are not sufficiently involved in developing interdisciplinary strategies and do not usually regard the process of integration itself as a research topic of its own. There seems to be a general reluctance to apply a (truly) interdisciplinary methodology because this is tedious and few immediate incentives are experienced. The objective of the present opinion paper is to stimulate a discussion rather than to provide recipes on how to integrate GW and SW research or to explain how specific problems of GW-SW interaction should be solved on a technical level. For that purpose it presents complicated topics in a rather simplified, bold way, ignoring to some degree subtleties and potentially controversial issues.
Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D
2015-09-01
Pesticides are frequently responsible for human poisoning, and often information on the substance involved is lacking. The great variety of pesticides that could be responsible for an intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow the identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme would help minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology presented here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
Innovation design of medical equipment based on TRIZ.
Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo
2015-01-01
Medical equipment is closely related to personal health and safety and is therefore of concern to the equipment user. Furthermore, there is much competition among medical equipment manufacturers, and innovative design is the key to success for those enterprises. The design of medical equipment usually covers vastly different domains of knowledge, so the application of modern design methodology to medical equipment and technology invention is an urgent requirement. TRIZ (a Russian acronym commonly translated as 'theory of inventive problem solving') originated in Russia and contains problem-solving methods developed through worldwide patent analysis, including the Conflict Matrix, Substance-Field Analysis, Standard Solutions, and Effects. TRIZ is an inventive methodology for problem solving. As an engineering example, an infusion system is analyzed and redesigned using TRIZ; the innovative idea generated liberates the caretaker from having to watch the infusion bag. The research in this paper shows the process of applying TRIZ to medical device invention. It demonstrates that TRIZ is an inventive methodology for problem solving that can be used widely in medical device development.
NASA Astrophysics Data System (ADS)
Pedretti, Daniele; Beckie, Roger Daniel
2014-05-01
Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications directly involving the ratio between precipitation and some other quantity, the lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g., rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological-pairing approach between rainfall stations and a probability-based approach, which accounts for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, the selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on discussing the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which exert an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value, and Gamma distributions. The methods were first tested on synthetic examples, to have complete control over the impact of several variables, such as the minimum amount of data required to obtain reliable statistical distributions from the selected parametric functions. We then applied the methodology to precipitation datasets collected in the Vancouver area and at a mining site in Peru.
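A minimal sketch of the probability-based infilling idea, assuming wet-day rain depths at a complete neighbour station and an overlapping record at the gappy target station; the Generalized Pareto fit and quantile mapping here are an illustrative rendering, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative wet-day rain depths (mm): a complete neighbour record and
# an overlapping record at the gappy target station.
neighbour = stats.genpareto.rvs(c=0.20, scale=8.0, size=2000, random_state=rng)
target_overlap = stats.genpareto.rvs(c=0.25, scale=6.0, size=1500, random_state=rng)

# Fit a Generalized Pareto to each station; the shape parameter c
# controls the tail heaviness that drives extreme-event behaviour.
c_n, loc_n, sc_n = stats.genpareto.fit(neighbour, floc=0.0)
c_t, loc_t, sc_t = stats.genpareto.fit(target_overlap, floc=0.0)

def infill(neighbour_depth):
    """Quantile mapping: carry the neighbour's non-exceedance probability
    over to the target station's fitted distribution."""
    p = stats.genpareto.cdf(neighbour_depth, c_n, loc_n, sc_n)
    return stats.genpareto.ppf(p, c_t, loc_t, sc_t)

# Estimated target-station depth for a day the neighbour recorded 25 mm.
print(f"{infill(25.0):.1f} mm")
```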
A risk assessment methodology for critical transportation infrastructure.
DOT National Transportation Integrated Search
2002-01-01
Infrastructure protection typifies a problem of risk assessment and management in a large-scale system. This study offers a methodological framework to identify, prioritize, assess, and manage risks. It includes the following major considerations: (1...
Methodology for nonwork travel analysis in suburban communities.
DOT National Transportation Integrated Search
1994-01-01
The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...
2017-06-15
...the methodology of reducing the online algorithm-selection problem to a contextual bandit problem, which is yet another interactive learning...
[KH2016a] Kuan-Hao Huang and Hsuan-Tien Lin. Linear upper confidence bound algorithm for contextual bandit problem with piled rewards. In Proceedings...
Decomposition of timed automata for solving scheduling problems
NASA Astrophysics Data System (ADS)
Nishi, Tatsushi; Wakatake, Masato
2014-03-01
A decomposition algorithm for scheduling problems based on a timed automata (TA) model is proposed. The problem is represented as an optimal state transition problem for TA. The model comprises the parallel composition of submodels such as jobs and resources. The procedure of the proposed methodology can be divided into two steps. The first step is to decompose the TA model into several submodels by using a decomposability condition. The second step is to combine the individual solutions of the subproblems for the decomposed submodels by the penalty function method. A feasible solution for the entire model is derived through the iterated computation of solving the subproblem for each submodel. The proposed methodology is applied to solve flowshop and jobshop scheduling problems. Computational experiments demonstrate the effectiveness of the proposed algorithm compared with a conventional TA scheduling algorithm without decomposition.
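A generic sketch of the penalty-based coordination step (illustrative notation, not the authors' exact formulation):

\[
\min_{x_{1},\dots,x_{m}}\ \sum_{i=1}^{m} f_{i}(x_{i}) \;+\; \rho \sum_{(i,j)} \bigl\lVert g_{ij}(x_{i}, x_{j}) \bigr\rVert^{2},
\]

where \(x_{i}\) is the state-transition sequence of submodel \(i\) (a job or a resource), \(f_{i}\) scores its local schedule, \(g_{ij}\) measures violation of the coupling (synchronization) constraints between submodels, and the penalty weight \(\rho\) is increased over iterations until the combined subproblem solutions form a feasible schedule for the entire TA model.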
Modular hardware synthesis using an HDL. [Hardware Description Language
NASA Technical Reports Server (NTRS)
Covington, J. A.; Shiva, S. G.
1981-01-01
Although hardware description languages (HDL) are becoming more and more necessary to automated design systems, their application is complicated by the difficulty of translating the HDL description into an implementable format, the nonfamiliarity of hardware designers with high-level language programming, nonuniform design methodologies, and the time and costs involved in transferring HDL design software. Digital design language (DDL) suffers from all of the above problems and, in addition, can only be synthesized as a complete system and not from its subparts, making it unsuitable for synthesis using standard modules or prefabricated chips such as those required in LSI or VLSI circuits. The present paper presents a method by which the DDL translator can be made to generate modular equations that allow the system to be synthesized as an interconnection of lower-level modules. The method involves the introduction of a new language construct called a Module, which provides for the separate translation of all equations bounded by it.
The search for a hippocampal engram.
Mayford, Mark
2014-01-05
Understanding the molecular and cellular changes that underlie memory, the engram, requires the identification, isolation and manipulation of the neurons involved. This presents a major difficulty for complex forms of memory, for example hippocampus-dependent declarative memory, where the participating neurons are likely to be sparse, anatomically distributed and unique to each individual brain and learning event. In this paper, I discuss several new approaches to this problem. In vivo calcium imaging techniques provide a means of assessing the activity patterns of large numbers of neurons over long periods of time with precise anatomical identification. This provides important insight into how the brain represents complex information and how this is altered with learning. The development of techniques for the genetic modification of neural ensembles based on their natural, sensory-evoked, activity along with optogenetics allows direct tests of the coding function of these ensembles. These approaches provide a new methodological framework in which to examine the mechanisms of complex forms of learning at the level of the neurons involved in a specific memory.
Cost-benefit analysis of space technology
NASA Technical Reports Server (NTRS)
Hein, G. F.; Stevenson, S. M.; Sivo, J. N.
1976-01-01
A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in the structure of a decision making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology. The use of the methodology is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumers surplus for measuring benefits is also presented.
ERIC Educational Resources Information Center
Lee, Chwee Beng; Ling, Keck Voon; Reimann, Peter; Diponegoro, Yudho Ahmad; Koh, Chia Heng; Chew, Derwin
2014-01-01
Purpose: The purpose of this paper is to argue for the need to develop pre-service teachers' problem solving ability, in particular, in the context of real-world complex problems. Design/methodology/approach: To argue for the need to develop pre-service teachers' problem solving skills, the authors describe a web-based problem representation…
Using the CPGI to Determine Problem Gambling Prevalence in Australia: Measurement Issues
ERIC Educational Resources Information Center
Jackson, Alun C.; Wynne, Harold; Dowling, Nicki A.; Tomnay, Jane E.; Thomas, Shane A.
2010-01-01
Most states and territories in Australia have adopted the Problem Gambling Severity Index (PGSI) of the Canadian Problem Gambling Index as the standard measure of problem gambling in their prevalence studies and research programs. However, notwithstanding this attempted standardisation, differences in sampling and recruitment methodologies and in…
Using Problem-Based Learning to Enhance Team and Player Development in Youth Soccer
ERIC Educational Resources Information Center
Hubball, Harry; Robertson, Scott
2004-01-01
Problem-based learning (PBL) is a coaching and teaching methodology that develops knowledge, abilities, and skills. It also encourages participation, collaborative investigation, and the resolution of authentic, "ill-structured" problems through the use of problem definition, teamwork, communication, data collection, decision-making,…
A methodology to find the elementary landscape decomposition of combinatorial optimization problems.
Chicano, Francisco; Whitley, L Darrell; Alba, Enrique
2011-01-01
A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
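For orientation, the standard definition underlying this abstract is Grover's wave equation (generic notation): a landscape \((X, N, f)\) is elementary when

\[
L\,(f - \bar{f}\,\mathbf{1}) = \lambda\,(f - \bar{f}\,\mathbf{1}),
\]

i.e., \(f\) shifted by its mean \(\bar{f}\) is an eigenfunction of the Laplacian \(L\) of the neighborhood graph \(N\). For a non-elementary objective with a symmetric neighborhood, the decomposition described above writes \(f = \sum_{i} f_{i}\) with each \(f_{i}\) elementary, e.g., two components for subset sum and three for the quadratic assignment problem.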
HESS Opinions "Integration of groundwater and surface water research: an interdisciplinary problem?"
NASA Astrophysics Data System (ADS)
Barthel, R.
2014-02-01
Today there is a broad consensus that water resources research needs to become more holistic, integrating the perspectives of a large variety of disciplines. Groundwater and surface water (hereafter: GW and SW) are typically identified as different compartments of the hydrological cycle and were traditionally often studied and managed separately. However, despite this separation, these respective fields of study are usually not considered to be different disciplines. They are often seen as different specialisations of hydrology with a different focus, yet similar theory, concepts and methodology. The present article discusses how this notion may form a substantial obstacle to the further integration of GW and SW research and management. The article focuses on the regional scale (areas of approximately 10³ to 10⁶ km²), which is identified as the scale where integration is most needed, but where, ironically, the least fully integrated research seems to be undertaken. The state of research on integrating GW and SW research is briefly reviewed, and the most essential differences between GW hydrology (or hydrogeology, geohydrology) and SW hydrology are presented. Groundwater recharge and baseflow are used as examples to illustrate different perspectives on similar phenomena that can cause severe misunderstandings and errors in the conceptualisation of integration schemes. It is also argued that integration of GW and SW research on the regional scale must necessarily move beyond the hydrological aspects, by collaborating with the social sciences and increasing the interaction between science and society in general. The typical elements of an ideal interdisciplinary workflow are presented and their relevance with respect to the integration of GW and SW is discussed. The overall conclusions are that GW hydrology and SW hydrology study rather different objects of interest, using different types of observation and working on different problem settings. They have thus developed different theory, methodology and terminology. Yet there seems to be a widespread lack of awareness of these differences, which hinders the detection of the existing interdisciplinary aspects of GW-SW integration and consequently the development of truly unifying, interdisciplinary theory and methodology. Thus, despite having the ultimate goal of creating a more holistic approach, we should start integration by analysing potential disciplinary differences. Improved understanding among hydrologists of what interdisciplinarity means and how it works is needed. Hydrologists, despite frequently being involved in multidisciplinary projects, are not sufficiently involved in developing interdisciplinary strategies and do not usually regard the process of integration itself as a research topic in its own right. There seems to be a general reluctance to apply (truly) interdisciplinary methodology because it is tedious and offers few immediate incentives.
Transshipment site selection using the AHP and TOPSIS approaches under fuzzy environment.
Onüt, Semih; Soner, Selin
2008-01-01
Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem. But most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on the site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.
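To make the recipe concrete, here is a compact, hedged sketch of the general AHP-plus-fuzzy-TOPSIS pipeline (all numbers invented, Chen-style vertex distances assumed; the paper's exact formulas may differ): AHP pairwise comparisons yield crisp criteria weights, linguistic ratings become triangular fuzzy numbers, and alternatives are ranked by closeness to the fuzzy ideal solutions.

```python
# Hedged sketch: AHP criteria weights feeding a fuzzy TOPSIS ranking.
import numpy as np

# --- AHP: principal eigenvector of a pairwise comparison matrix ---
P = np.array([[1, 3, 5],
              [1/3, 1, 2],
              [1/5, 1/2, 1]])            # hypothetical criteria comparisons
vals, vecs = np.linalg.eig(P)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                          # crisp criteria weights

# --- linguistic terms as triangular fuzzy numbers (l, m, u) ---
TERMS = {'poor': (0, 0, 3), 'fair': (2, 5, 8), 'good': (7, 10, 10)}
ratings = [['good', 'fair', 'poor'],     # alternative A, one term per criterion
           ['fair', 'good', 'good'],     # alternative B
           ['poor', 'fair', 'good']]     # alternative C
R = np.array([[TERMS[t] for t in row] for row in ratings], float)  # (alt, crit, 3)

# normalise (all criteria treated as benefit criteria here), then weight
u_max = R[:, :, 2].max(axis=0)
V = R / u_max[None, :, None] * w[None, :, None]

# fuzzy ideal solutions and vertex distance
fpis = V.max(axis=0)                     # per-criterion best (l, m, u)
fnis = V.min(axis=0)
dist = lambda a, b: np.sqrt(((a - b) ** 2).mean(axis=-1))
d_plus = dist(V, fpis[None]).sum(axis=1)
d_minus = dist(V, fnis[None]).sum(axis=1)
cc = d_minus / (d_plus + d_minus)        # closeness coefficient, higher is better
print({name: round(float(c), 3) for name, c in zip('ABC', cc)})
```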
Artificial Intelligence and Information Management
NASA Astrophysics Data System (ADS)
Fukumura, Teruo
After reviewing the recent popularization of information transmission and processing technologies, which are supported by progress in electronics, the authors describe how the introduction of opto-electronics into information technology has opened the possibility of applying artificial intelligence (AI) techniques to the mechanization of information management. It is pointed out that although AI deals with problems of the mental world, its basic methodology relies upon verification by evidence, so experiments on computers become indispensable for the study of AI. The authors also note that since computers operate by program, the basic intelligence with which AI is concerned is that expressed by languages. As a result, the main tool of AI is logical proof, which involves an intrinsic limitation. To answer the question "Why do you employ AI in your problem solving?", one must have ill-structured problems and intend to conduct deep studies on thinking and inference, and on memory and knowledge representation. Finally, the authors discuss the application of AI techniques to information management. The possibilities of expert systems and query processing, and the necessity of a document knowledge base, are stated.
Use of multicriteria analysis (MCA) for sustainable hydropower planning and management.
Vassoney, Erica; Mammoliti Mochet, Andrea; Comoglio, Claudio
2017-07-01
Multicriteria analysis (MCA) is a decision-making tool applied to a wide range of environmental management problems, including renewable energy planning and management. An interesting field of application of MCA is the evaluation and analysis of the conflicting aspects of hydropower (HP) exploitation, which affect the three pillars of sustainability and involve several different stakeholders. The present study was aimed at reviewing the state of the art of MCA applications to sustainable hydropower production and related decision-making problems, based on a detailed analysis of the scientific papers published on this topic over the last 15 years. The papers were analysed and compared, focusing on the specific features of the MCA methods applied in the described case studies, highlighting the general aspects of each MCA application (purpose, spatial scale, software used, stakeholders, etc.) and the specific operational/technical features of the selected MCA technique (methodology, criteria, evaluation, approach, sensitivity, etc.). Some specific limitations of the analysed case studies were identified, and a set of "quality indexes" for an exhaustive MCA application is suggested as a potential improvement, to support decision-making processes in sustainable HP planning and management more effectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nowotarski, Piotr; Paslawski, Jerzy; Wysocki, Bartosz
2017-12-01
Ground works are among the first processes in erecting structures. Based on ground conditions such as the type of soil or the level of groundwater, different types of foundations are designed. Foundations are the base of a building, and their proper design and execution are key to the long, fault-free use of the whole structure; they can also influence the cost of eventual repairs, especially where the groundwater level is high and no proper water insulation has been installed. The article presents the introduction of selected Lean Management tools for improving the quality of ground works, based on an analysis carried out on the construction site of a vehicle control station located in Poznan, Poland. The processes are assessed from several perspectives, since three main groups of workers were directly involved: blue-collar workers, the site manager and the site engineers. A comparison is made of these three points of view on the problems that may occur during this type of work, with a detailed analysis of their causes. The authors also describe the change in attitude of the workers directly involved following the introduction of the Lean Management methodology, which illustrates the scepticism towards new ideas among people used to performing work in the traditional way. Using the Lean Management philosophy in construction is a good way to streamline processes in a company, eliminate constantly recurring problems, and thereby improve the productivity and quality of the activities performed. The analysis showed that different groups of people have very different opinions on the problems connected with executing the same process, ground works, and that only with a full picture of the situation (especially in construction processes) can management take proper preventive actions, which in turn can reduce the amount of waste generated on the construction site and so benefit the external environment.
A Monte Carlo analysis of breast screening randomized trials.
Zamora, Luis I; Forastero, Cristina; Guirado, Damián; Lallena, Antonio M
2016-12-01
To analyze breast screening randomized trials with a Monte Carlo simulation tool. A simulation tool previously developed to simulate breast screening programmes was adapted for this purpose. The history of women participating in the trials was simulated, including a model for survival after local treatment of invasive cancers. Distributions of the time gained by screening detection relative to symptomatic detection, and the overall screening sensitivity, were used as inputs. Several randomized controlled trials were simulated. Except for the age range of the women involved, all simulations used the same population characteristics, which permitted analysis of their external validity. The relative risks obtained were compared to those quoted for the trials, whose internal validity was addressed by further investigating the reasons for the disagreements observed. The Monte Carlo simulations produce results that are in good agreement with most of the randomized trials analyzed, indicating their methodological quality and external validity. A reduction in breast cancer mortality of around 20% appears to be a reasonable value according to the results of the trials that are methodologically correct. Discrepancies observed with the Canada I and II trials may be attributed to low mammography quality and some methodological problems. The Kopparberg trial appears to show low methodological quality. Monte Carlo simulations are a powerful tool for investigating breast screening randomized controlled trials, helping to establish those whose results are reliable enough to be extrapolated to other populations, and for designing trial strategies and, eventually, adapting them during their development. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
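The paper's simulation models screening in far more detail than can be shown here, but a toy Monte Carlo conveys the core mechanism (the incidence rate, sensitivity, lead-time distribution and survival model below are all invented): screening advances detection by a random lead time, earlier detection lowers the probability of death, and comparing arm-level deaths yields a relative risk.

```python
# Toy Monte Carlo of a two-arm screening trial (illustrative only).
import numpy as np
rng = np.random.default_rng(0)

def simulate_arm(n_women, screened, sensitivity=0.85, mean_lead=2.0):
    cancers = rng.random(n_women) < 0.05          # assumed lifetime incidence
    n_cases = int(cancers.sum())
    lead = np.zeros(n_cases)
    if screened:                                  # screening advances detection
        detected = rng.random(n_cases) < sensitivity
        lead[detected] = rng.exponential(mean_lead, int(detected.sum()))
    # assumed survival model: 30% baseline death probability, each year of
    # lead time gained multiplies it by 0.85
    p_death = 0.30 * 0.85 ** lead
    return int((rng.random(n_cases) < p_death).sum())

deaths_control = simulate_arm(50_000, screened=False)
deaths_screen = simulate_arm(50_000, screened=True)
print('relative risk ~', round(deaths_screen / deaths_control, 2))
```

With these assumed parameters the toy reproduces a mortality reduction of roughly 20%, the order of magnitude quoted above.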
On the generalized VIP time integral methodology for transient thermal problems
NASA Technical Reports Server (NTRS)
Mei, Youping; Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The paper describes the development and applicability of a generalized VIrtual-Pulse (VIP) time integral method of computation for thermal problems. Unlike past approaches to general heat transfer computations, and motivated by the advent of high-speed computing technology and the importance of parallel computation for efficient use of computing environments, a major aim of the developments described in this paper is explicit computational procedures with improved accuracy and stability characteristics. As a consequence, a new and effective VIP methodology is described which inherits these improved characteristics. Illustrative numerical examples are provided to demonstrate the developments and validate the results obtained for thermal problems.
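The VIP formulation itself is not reproduced in the abstract. For context, a minimal explicit integration of the transient heat equation shows the problem class such methods target, and the conditional-stability restriction (dt ≤ dx²/2α for forward Euler) that improved explicit schemes aim to relax.

```python
# Minimal explicit (forward Euler) finite-difference integration of the 1D
# heat equation u_t = alpha * u_xx; not the VIP method, just the baseline
# explicit scheme for the same problem class.
import numpy as np

alpha, L, nx = 1.0, 1.0, 51
dx = L / (nx - 1)
dt = 0.4 * dx ** 2 / alpha                 # safely inside the stability limit
u = np.zeros(nx)
u[0] = 1.0                                 # fixed hot boundary, cold elsewhere

for _ in range(2000):
    u[1:-1] += alpha * dt / dx ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[-1] = 0.0                            # fixed cold boundary

print(u[::10].round(3))                    # approaches the linear steady state
```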
A robust optimization methodology for preliminary aircraft design
NASA Astrophysics Data System (ADS)
Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.
2016-05-01
This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
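A minimal sketch of the robust-LP mechanics in the Ben-Tal/El Ghaoui/Nemirovski spirit (toy data, not the aircraft model): when a constraint row is only known to lie in a box and the design variables are non-negative, the robust counterpart is obtained by tightening the row to its worst case, leaving an ordinary LP.

```python
# Toy robust LP: minimise c@x subject to a@x <= b, where the row a lies in a
# box [a_nom - delta, a_nom + delta]. For x >= 0 the robust counterpart is
# simply (a_nom + delta)@x <= b.
import numpy as np
from scipy.optimize import linprog

c = np.array([-3.0, -2.0])                 # maximise 3*x1 + 2*x2
a_nom = np.array([2.0, 1.0])
delta = np.array([0.3, 0.2])               # uncertainty half-widths
b = 10.0

nominal = linprog(c, A_ub=[a_nom], b_ub=[b], bounds=[(0, None)] * 2)
robust = linprog(c, A_ub=[a_nom + delta], b_ub=[b], bounds=[(0, None)] * 2)
print('nominal x:', nominal.x, ' robust x:', robust.x)
```

The robust design gives up some nominal performance in exchange for feasibility under every admissible realisation of the uncertain data, which is the trade the article quantifies for the aircraft problem.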
Towards Robust Designs Via Multiple-Objective Optimization Methods
NASA Technical Reports Server (NTRS)
Man Mohan, Rai
2006-01-01
Fabricating and operating complex systems involves dealing with uncertainty in the relevant variables. In the case of aircraft, flow conditions are subject to change during operation. Efficiency and engine noise may be different from the expected values because of manufacturing tolerances and normal wear and tear. Engine components may have a shorter life than expected because of manufacturing tolerances. In spite of the important effect of operating and manufacturing uncertainty on the performance and expected life of the component or system, traditional aerodynamic shape optimization has focused on obtaining the best design given a set of deterministic flow conditions. Clearly it is important both to maintain near-optimal performance levels at off-design operating conditions, and to ensure that performance does not degrade appreciably when the component shape differs from the optimal shape due to manufacturing tolerances and normal wear and tear. These requirements naturally lead to the idea of robust optimal design, wherein the concept of robustness to various perturbations is built into the design optimization procedure. The basic ideas involved in robust optimal design will be included in this lecture. The imposition of the additional requirement of robustness results in a multiple-objective optimization problem requiring appropriate solution procedures. Typically the costs associated with multiple-objective optimization are substantial. Therefore efficient multiple-objective optimization procedures are crucial to the rapid deployment of the principles of robust design in industry. Hence the companion set of lecture notes (Single- and Multiple-Objective Optimization with Differential Evolution and Neural Networks) deals with methodology for solving multiple-objective optimization problems efficiently, reliably and with little user intervention. Applications of the methodologies presented in the companion lecture to robust design will be included here. The evolutionary method (DE) is first used to solve a relatively difficult problem in extended surface heat transfer wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil; the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and the maximization of the trailing edge wedge angle with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.
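As a hedged illustration of the underlying tool, the sketch below implements a bare-bones DE/rand/1/bin optimiser and traces Pareto points of a toy two-objective problem by sweeping a weighted-sum scalarisation; the lecture's actual multi-objective machinery is considerably richer.

```python
# Bare-bones differential evolution (DE/rand/1/bin) applied to a toy
# two-objective problem (f1 = x^2, f2 = (x-2)^2) via a weight sweep.
import numpy as np
rng = np.random.default_rng(1)

def de_minimize(f, dim=1, lo=-5, hi=5, pop=20, gens=100, F=0.7, CR=0.9):
    X = rng.uniform(lo, hi, (pop, dim))
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            a, b, c = X[rng.choice([j for j in range(pop) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)      # rand/1 mutation
            cross = rng.random(dim) < CR                   # binomial crossover
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft <= fit[i]:                               # greedy selection
                X[i], fit[i] = trial, ft
    return X[fit.argmin()]

f1 = lambda x: float(x[0] ** 2)
f2 = lambda x: float((x[0] - 2) ** 2)
for w in (0.1, 0.5, 0.9):                  # weight sweep traces Pareto points
    x = de_minimize(lambda x: w * f1(x) + (1 - w) * f2(x))
    print(f'w={w}: x={x[0]:.3f}  f1={f1(x):.3f}  f2={f2(x):.3f}')
```

The Pareto set of this toy is x in [0, 2]; each weight recovers the point x = 2(1 - w), which makes the conflict between the two objectives easy to see.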
Exploratory High-Fidelity Aerostructural Optimization Using an Efficient Monolithic Solution Method
NASA Astrophysics Data System (ADS)
Zhang, Jenmy Zimi
This thesis is motivated by the desire to discover fuel efficient aircraft concepts through exploratory design. An optimization methodology based on tightly integrated high-fidelity aerostructural analysis is proposed, which has the flexibility, robustness, and efficiency to contribute to this goal. The present aerostructural optimization methodology uses an integrated geometry parameterization and mesh movement strategy, which was initially proposed for aerodynamic shape optimization. This integrated approach provides the optimizer with a large amount of geometric freedom for conducting exploratory design, while allowing for efficient and robust mesh movement in the presence of substantial shape changes. In extending this approach to aerostructural optimization, this thesis has addressed a number of important challenges. A structural mesh deformation strategy has been introduced to translate consistently the shape changes described by the geometry parameterization to the structural model. A three-field formulation of the discrete steady aerostructural residual couples the mesh movement equations with the three-dimensional Euler equations and a linear structural analysis. Gradients needed for optimization are computed with a three-field coupled adjoint approach. A number of investigations have been conducted to demonstrate the suitability and accuracy of the present methodology for use in aerostructural optimization involving substantial shape changes. Robustness and efficiency in the coupled solution algorithms is crucial to the success of an exploratory optimization. This thesis therefore also focuses on the design of an effective monolithic solution algorithm for the proposed methodology. This involves using a Newton-Krylov method for the aerostructural analysis and a preconditioned Krylov subspace method for the coupled adjoint solution. Several aspects of the monolithic solution method have been investigated. These include appropriate strategies for scaling and matrix-vector product evaluation, as well as block preconditioning techniques that preserve the modularity between subproblems. The monolithic solution method is applied to problems with varying degrees of fluid-structural coupling, as well as a wing span optimization study. The monolithic solution algorithm typically requires 20%-70% less computing time than its partitioned counterpart. This advantage increases with increasing wing flexibility. The performance of the monolithic solution method is also much less sensitive to the choice of the solution parameter.
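A toy example (not the thesis code) of what "monolithic" means here: the aerodynamic load and the structural displacement are stacked into one unknown vector and a single Newton iteration drives both residuals to zero, instead of alternating field solves. The toy is linear, so Newton converges in one step; the thesis problems are nonlinear and use preconditioned Krylov solvers in place of the dense solve.

```python
# Monolithic Newton solve of a toy coupled load/displacement system.
import numpy as np

k, q = 50.0, 10.0                          # spring stiffness; load scale

def residual(z):
    L, u = z
    return np.array([L - q * (1.0 + 2.0 * u),   # "aerodynamic" load law L = q(1 + 2u)
                     k * u - L])                # structural balance k*u = L

def jacobian(z):
    # full coupled Jacobian: each field differentiated w.r.t. both unknowns
    return np.array([[1.0, -2.0 * q],
                     [-1.0, k]])

z = np.zeros(2)                            # stacked unknowns (load, displacement)
for step in range(20):
    r = residual(z)
    if np.linalg.norm(r) < 1e-12:
        break
    z -= np.linalg.solve(jacobian(z), r)   # monolithic Newton update
print(f'Newton steps: {step}, load = {z[0]:.4f}, displacement = {z[1]:.6f}')
```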
Culture shock and synergy. Academic/managed care/corporate alliances in outcomes management.
Berman, W H; Darling, H; Hurt, S W; Hunkeler, E M
1994-01-01
The Behavioral Health Outcomes Study is a partnership in conducting outcomes measurement involving a corporate healthcare purchaser, five managed behavioral healthcare organizations and academic researchers. The goals of this study are to: evaluate the feasibility of incorporating patient self-reported data in outcomes research; identify factors that may be predictors of outcome; and evaluate the effectiveness of an employee-sponsored aftercare program. The differing perspectives and needs of the three partners have created a number of challenges in the areas of goals, confidentiality, proprietary vs. open access issues and methodology. However, after the study's first year, it is clear not only that outcomes research can be conducted under such a partnership, but that the partnership generates a kind of synergy in problem-solving.
Preprocessing film-copied MRI for studying morphological brain changes.
Pham, Tuan D; Eisenblätter, Uwe; Baune, Bernhard T; Berger, Klaus
2009-06-15
Magnetic resonance imaging (MRI) of the brain is one of the important data items for studying memory and morbidity in the elderly, as these images can provide useful information through quantitative measures of various regions of interest of the brain. As part of an effort to fully automate the biomedical analysis of the brain, which can be combined with the genetic data of the same human population, and where the records of the original MRI data are missing, this paper presents two effective methods for addressing this imaging problem. The first method handles the restoration of film-copied MRI. The second method involves the segmentation of the image data. Experimental results and comparisons with other methods suggest the usefulness of the proposed image analysis methodology.
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
Colombo, Jorge A
2018-06-01
Assertions regarding attempts to link glial and macrostructural brain events with the cognitive performance of Albert Einstein are critically reviewed. One basic problem arises from attempting to draw causal relationships regarding complex, delicately interactive functional processes, involving finely tuned molecular and connectivity phenomena expressed in cognitive performance, from the highly variable structural features of a single, aged, formalin-fixed brain. Data weaknesses and logical flaws are considered. In other instances, similar neuroanatomical observations have received different interpretations and conclusions, such as those drawn, e.g., from schizophrenic brains. Observations on white matter events also raise methodological queries. Additionally, neurocognitive considerations on other intellectual aptitudes of A. Einstein were simply ignored.
DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS
Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...
Using soft systems methodology to develop a simulation of out-patient services.
Lehaney, B; Paul, R J
1994-10-01
Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems that are difficult to address using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the out-patients' department at a local hospital. The long-term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.
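A minimal discrete-event sketch of the kind of model being discussed (arrival and consultation times are invented): a single FIFO consulting room simulated by the Lindley recursion, the sort of system whose boundaries and activities the SSM stage would first help to scope.

```python
# Toy discrete event simulation of an out-patient clinic as a single FIFO
# consulting room (an M/M/1 queue via the Lindley recursion).
import random
random.seed(42)

def clinic(n_patients=500, mean_interarrival=10.0, mean_consult=8.0):
    t, t_free, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        t += random.expovariate(1 / mean_interarrival)   # next arrival time
        start = max(t, t_free)                           # wait if room occupied
        waits.append(start - t)
        t_free = start + random.expovariate(1 / mean_consult)
    return sum(waits) / len(waits)

print(f'mean wait: {clinic():.1f} minutes')
```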
Brain Dynamics: Methodological Issues and Applications in Psychiatric and Neurologic Diseases
NASA Astrophysics Data System (ADS)
Pezard, Laurent
The human brain is a complex dynamical system generating the EEG signal. Numerical methods developed to study complex physical dynamics have been used to characterize the EEG since the mid-eighties. This endeavor raises several issues related to the specificity of the EEG. Firstly, theoretical and methodological studies should address the major differences between the dynamics of the human brain and those of physical systems. Secondly, this approach to the EEG signal should prove relevant for dealing with physiological or clinical problems. A set of studies performed in our group is presented here in the context of these two problematic aspects. After a discussion of methodological drawbacks, we review numerical simulations related to the high dimension and spatial extension of brain dynamics. Experimental studies in neurologic and psychiatric disease are then presented. We conclude that although it is now clear that brain dynamics change in relation to clinical situations, methodological problems remain largely unsolved.
2017-09-01
…sequencing, paired with new methodologies of intratumoral phylogenetic analyses, will yield pivotal information in elucidating the key genes involved in the evolution of PCa… combined with both clinical and experimental genetic data produced by this study, may empower patients and doctors to make personalized treatment decisions…
NASA Astrophysics Data System (ADS)
Bazilevs, Y.; Kamran, K.; Moutsanidis, G.; Benson, D. J.; Oñate, E.
2017-07-01
In this two-part paper we begin the development of a new class of methods for modeling fluid-structure interaction (FSI) phenomena for air blast. We aim to develop accurate, robust, and practical computational methodology capable of modeling the dynamics of air blast coupled with the structural response, where the latter involves large, inelastic deformations and disintegration into fragments. An immersed approach is adopted, which leads to an a priori monolithic FSI formulation with intrinsic contact detection between solid objects, and without formal restrictions on the solid motions. In Part I of this paper, the core air-blast FSI methodology, suitable for a variety of discretizations, is presented and tested using standard finite elements. Part II of this paper focuses on a particular instantiation of the proposed framework, which couples isogeometric analysis (IGA) based on non-uniform rational B-splines with a reproducing-kernel particle method (RKPM), a meshfree technique. The combination of IGA and RKPM is felt to be particularly attractive for the problem class of interest due to the higher-order accuracy and smoothness of both discretizations, and the relative simplicity of RKPM in handling fragmentation scenarios. A collection of mostly 2D numerical examples is presented in each of the parts to illustrate the good performance of the proposed air-blast FSI framework.
Chung, Ka-Fai; Chan, Man-Sum; Lam, Ying-Yin; Lai, Cindy Sin-Yee; Yeung, Wing-Fai
2017-06-01
Insufficient sleep among students is a major school health problem. School-based sleep education programs, tailored to reach large numbers of students, may be one of the solutions. A systematic review and meta-analysis was conducted to summarize the programs' effectiveness and current status. Electronic databases were searched up until May 2015. Randomized controlled trials of school-based sleep intervention among 10- to 19-year-old students with an outcome on total sleep duration were included. The methodological quality of the studies was assessed using Cochrane's risk of bias assessment. Seven studies were included, involving 1876 students receiving sleep education programs and 2483 attending classes as usual. Four weekly 50-minute sleep education classes were most commonly provided. Methodological quality was only moderate, with a high or an uncertain risk of bias in several domains. Compared to classes as usual, sleep education programs produced significantly longer weekday and weekend total sleep time and better mood among students at immediate post-treatment, but the improvements were not maintained at follow-up. Limited by the small number of studies and their methodological limitations, the preliminary data show that school-based sleep education programs produce short-term benefits. Future studies should explore integrating sleep education with delayed school start times or other more effective approaches. © 2017, American School Health Association.
Critical Communicative Methodology: Informing Real Social Transformation through Research
ERIC Educational Resources Information Center
Gomez, Aitor; Puigvert, Lidia; Flecha, Ramon
2011-01-01
The critical communicative methodology (CCM) is a methodological response to the dialogic turn of societies and sciences that has already had an important impact in transforming situations of inequality and exclusion. Research conducted with the CCM implies continuous and egalitarian dialogue among researchers and the people involved in the…
[Ethical considerations about research with women in situations of violence].
Rafael, Ricardo de Mattos Russo; Soares de Moura, Anna Tereza Miranda
2013-01-01
This essay aims at reflecting on the ethical and methodological principles involved in research with women in situations of violence. The text discusses the application of the principles of beneficence and non-maleficence in research involving this issue, pointing to recommendations concerning privacy, autonomy and immediate contributions for volunteers. Then, taking the principles of justice and equity as a theoretical reference, the authors propose a debate on the methodological aspects involved in the protection of respondents, with a view to improving the quality of the data obtained and the possible social contributions.
NASA Astrophysics Data System (ADS)
Nebot, Àngela; Mugica, Francisco
2012-10-01
Fuzzy inductive reasoning (FIR) is a modelling and simulation methodology derived from the General Systems Problem Solver. It compares favourably with other soft computing methodologies, such as neural networks, genetic or neuro-fuzzy systems, and with hard computing methodologies, such as AR, ARIMA, or NARMAX, when it is used to predict future behaviour of different kinds of systems. This paper contains an overview of the FIR methodology, its historical background, and its evolution.
Borba, Flávia de Souza Lins; Jawhari, Tariq; Saldanha Honorato, Ricardo; de Juan, Anna
2017-03-27
This article describes a non-destructive analytical method developed to solve forensic document examination problems involving crossed lines and obliteration. Different strategies combining confocal Raman imaging and multivariate curve resolution-alternating least squares (MCR-ALS) are presented. Multilayer images were acquired at subsequent depth layers into the samples. It is the first time that MCR-ALS is applied to multilayer images for forensic purposes. In this context, this method provides a single set of pure spectral ink signatures and related distribution maps for all layers examined from the sole information in the raw measurement. Four cases were investigated, namely, two concerning crossed lines with different degrees of ink similarity and two related to obliteration, where previous or no knowledge about the identity of the obliterated ink was available. In the crossing line scenario, MCR-ALS analysis revealed the ink nature and the chronological order in which strokes were drawn. For obliteration cases, results making active use of information about the identity of the obliterated ink in the chemometric analysis were of similar quality as those where the identity of the obliterated ink was unknown. In all obliteration scenarios, the identity of inks and the obliterated text were satisfactorily recovered. The analytical methodology proposed is of general use for analytical forensic document examination problems, and considers different degrees of complexity and prior available information. Besides, the strategies of data analysis proposed can be applicable to any other kind of problem in which multilayer Raman images from multicomponent systems have to be interpreted.
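A bare-bones sketch of the MCR-ALS step on synthetic "two-ink" data (not the authors' pipeline, which works on multilayer confocal Raman images): the data matrix is factored into non-negative concentration maps and pure spectra by alternating least squares with clipping.

```python
# Bare-bones MCR-ALS: factor D (pixels x channels) into non-negative
# concentrations C and pure spectra S with D ~= C @ S.T.
import numpy as np
rng = np.random.default_rng(3)

# synthetic data: two Gaussian pure spectra mixed pixel-by-pixel
x = np.linspace(0, 1, 200)
S_true = np.stack([np.exp(-(x - 0.3) ** 2 / 0.005),
                   np.exp(-(x - 0.7) ** 2 / 0.005)], axis=1)   # (200, 2)
C_true = rng.random((500, 2))
D = C_true @ S_true.T + 0.01 * rng.standard_normal((500, 200))

C = rng.random((500, 2))                      # random initial concentrations
for _ in range(100):
    # update spectra, then concentration maps, clipping to non-negativity
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
print('relative residual:', np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))
```

In the forensic setting the recovered rows of S play the role of the pure ink signatures and the columns of C the distribution maps, one set per depth layer.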
Gysels, Marjolein; Richardson, Alison; Higginson, Irene J
2007-03-01
To assess the effectiveness of the patient-held record (PHR) in cancer care. Patients with cancer may receive care from different services resulting in gaps. A PHR could provide continuity and patient involvement in care. Relevant literature was identified through five electronic databases (Medline, Embase, Cinahl, CCTR and CDSR) and hand searches. Patient-held records in cancer care with the purpose of improving communication and information exchange between and within different levels of care and to promote continuity of care and patients' involvement in their own care. Data extraction recorded characteristics of intervention, type of study and factors that contributed to methodological quality of individual studies. Data were then contrasted by setting, objectives, population, study design, outcome measures and changes in outcome, including knowledge, satisfaction, anxiety and depression. Methodological quality of randomized control trials and non-experimental studies were assessed with separate standard grading scales. Seven randomized control trials and six non-experimental studies were identified. Evaluations of the PHR have reached equivocal findings. Randomized trials found an absence of effect, non-experimental evaluations shed light on the conditions for its successful use. Most patients welcomed introduction of a PHR. Main problems related to its suitability for different patient groups and the lack of agreement between patients and health professionals regarding its function. Further research is required to determine the conditions under which the PHR can realize its potential as a tool to promote continuity of care and patient participation.
Environmental care in agricultural catchments: Toward the communicative catchment
NASA Astrophysics Data System (ADS)
Martin, Peter
1991-11-01
Substantial land degradation of agricultural catchments in Australia has resulted from the importation of European farming methods and the large-scale clearing of land. Rural communities are now being encouraged by government to take responsibility for environmental care. The importance of community involvement is supported by the view that environmental problems are a function of interactions between people and their environment. It is suggested that the commonly held view that community groups cannot care for their resources is due to inappropriate social institutions rather than any inherent disability in people. The communicative catchment is developed as a vision for environmental care into the future. This concept emerges from a critique of resource management through the catchment metaphors of the reduced, mechanical catchment and the complex, evolving catchment, which reflect the development of systemic and people-centered approaches to environmental care. The communicative catchment is one where both community and resource managers participate collaboratively in environmental care. A methodology based on action research and systemic thinking (systemic action research) is proposed as a way of moving towards the communicative catchment of the future. Action research is a way of taking action in organizations and communities that is participative and informed by theory, while systemic thinking takes into account the interconnections and relationships between social and natural worlds. The proposed vision, methodology, and practical operating principles stem from involvement in an action research project looking at extension strategies for the implementation of total catchment management in the Hunter Valley, New South Wales.
[Problem-solving approach in the training of healthcare professionals].
Batista, Nildo; Batista, Sylvia Helena; Goldenberg, Paulete; Seiffert, Otília; Sonzogno, Maria Cecília
2005-04-01
To discuss the problem-solving approach in the training of healthcare professionals who would be able to act both in academic life and in educational practices in services and communities. This is an analytical description of an experience of problem-based learning in specialization-level training that was developed within a university-level healthcare education institution. The analysis focuses on three perspectives: course design, student-centered learning and the teacher's role. The problem-solving approach provided impetus to the learning experience for these postgraduate students. There was increased motivation, leadership development and teamworking. This was translated through their written work, seminars and portfolio preparation. The evaluation process for these experiences presupposes well-founded practices that express the views of the subjects involved: self-assessment and observer assessment. The impact of this methodology on teaching practices is that there is a need for greater knowledge of the educational theories behind the principles of significant learning, teachers as intermediaries and research as an educational axiom. The problem-solving approach is an innovative response to the challenges of training healthcare professionals. Its potential is recognized, while it is noted that educational innovations are characterized by causing ruptures in consolidated methods and by establishing different ways of responding to demands presented at specific moments. The critical problems were identified, while highlighting the risk of considering this approach to be a technical tool that is unconnected with the design of the teaching policy. Experiences and analyses based on the problem-solving assumptions need to be shared, thus enabling the production of knowledge that strengthens the transformation of educational practices within healthcare.
OPUS: Optimal Projection for Uncertain Systems. Volume 1
1991-09-01
A unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for … effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that … characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of …
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
Fuzzy Linear Programming and its Application in Home Textile Firm
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2011-06-01
In this paper, a new fuzzy linear programming (FLP) methodology using a specific membership function, named the modified logistic membership function, is proposed. The modified logistic membership function is first formulated, and its flexibility in capturing vagueness in parameters is established by an analytical approach. The developed FLP methodology provides confidence in its application to a real-life industrial production planning problem. This approach to solving industrial production planning problems allows feedback among the decision maker, the implementer and the analyst.
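The modified logistic membership function itself is not reproduced in the abstract. As a baseline, the classical Zimmermann max-min FLP shows the general mechanics (numbers invented): linear membership functions turn fuzzy goals and constraints into an ordinary LP in the design variables plus an overall satisfaction level lam.

```python
# Zimmermann-style max-min fuzzy LP solved as a crisp LP with scipy.
import numpy as np
from scipy.optimize import linprog

# fuzzy goal: profit 5*x1 + 4*x2 "roughly at least 40" (fully satisfied at 40,
#   fully violated below 30): mu_g = (5*x1 + 4*x2 - 30) / 10
# fuzzy resource: labour 2*x1 + 3*x2 "roughly at most 12" (fully violated
#   above 16): mu_r = (16 - (2*x1 + 3*x2)) / 4
# maximise lam subject to lam <= mu_g, lam <= mu_r, 0 <= lam <= 1
c = [0.0, 0.0, -1.0]                         # variables (x1, x2, lam); max lam
A_ub = [[-5.0, -4.0, 10.0],                  # 10*lam - (5*x1 + 4*x2) <= -30
        [2.0, 3.0, 4.0]]                     # (2*x1 + 3*x2) + 4*lam <= 16
b_ub = [-30.0, 16.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None), (0, 1)])
x1, x2, lam = res.x
print(f'x = ({x1:.2f}, {x2:.2f}), overall satisfaction lam = {lam:.2f}')
```

The paper replaces the linear memberships above with a modified logistic shape, which changes the algebra but not the max-min structure of the formulation.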
Robust Feedback Control of Flow Induced Structural Radiation of Sound
NASA Technical Reports Server (NTRS)
Heatwole, Craig M.; Bernhard, Robert J.; Franchek, Matthew A.
1997-01-01
A significant component of the interior noise of aircraft and automobiles results from turbulent boundary layer excitation of the vehicle structure. In this work, active robust feedback control of the noise due to this non-predictable excitation is investigated. Both an analytical model and experimental investigations are used to determine the characteristics of the flow-induced structural sound radiation problem. The problem is shown to be broadband in nature, with large system uncertainties associated with the various operating conditions. Furthermore, the delay associated with sound propagation is shown to restrict the use of microphone feedback. State-of-the-art control methodologies, IL synthesis and adaptive feedback control, are evaluated and shown to have limited success in solving this problem. A robust frequency-domain controller design methodology is developed for the problem of sound radiated from turbulent-flow-driven plates. The control design methodology uses frequency-domain sequential loop-shaping techniques. System uncertainty, sound pressure level reduction performance, and actuator constraints are included in the design process. Using this design method, phase lag was added using non-minimum-phase zeros so that the beneficial plant dynamics could be used. This general control approach has application to lightly damped vibration and sound radiation problems where high-bandwidth control objectives require a low controller DC gain and low controller order.
Applying Lakatos' Theory to the Theory of Mathematical Problem Solving.
ERIC Educational Resources Information Center
Nunokawa, Kazuhiko
1996-01-01
The relation between Lakatos' theory and issues in mathematics education, especially mathematical problem solving, is investigated by examining Lakatos' methodology of a scientific research program. (AIM)
Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies.
Bosch, Paul; Herrera, Mauricio; López, Julio; Maldonado, Sebastián
2018-01-01
We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data was collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections.
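A schematic of the pipeline on synthetic stand-in data (feature names and sizes invented): connectivity-style features feed recursive feature elimination wrapped around a linear SVM, leaving a small set of retained "links" that plays the role of the functional network described above.

```python
# Sketch: linear SVM + recursive feature elimination on synthetic
# connectivity-style features.
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_subjects, n_links = 120, 60            # e.g. pairwise channel correlations
X = rng.standard_normal((n_subjects, n_links))
# only the first five "links" actually carry the performance signal
y = (X[:, :5].sum(axis=1) + 0.5 * rng.standard_normal(n_subjects)) > 0

svm = SVC(kernel='linear')
selector = RFE(svm, n_features_to_select=8).fit(X, y)
print('retained links:', np.where(selector.support_)[0])
print('cv accuracy: %.2f' % cross_val_score(
    svm, X[:, selector.support_], y, cv=5).mean())
```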
On the Analysis of Two-Person Problem Solving Protocols.
ERIC Educational Resources Information Center
Schoenfeld, Alan H.
Methodological issues in the use of protocol analysis for research into human problem solving processes are examined through a case study in which two students were videotaped as they worked together to solve mathematical problems "out loud." The students' chosen strategic or executive behavior in examining and solving a problem was…
Problem? "No Problem!" Solving Technical Contradictions
ERIC Educational Resources Information Center
Kutz, K. Scott; Stefan, Victor
2007-01-01
TRIZ (pronounced TREES), the Russian acronym for the theory of inventive problem solving, enables a person to focus attention on finding genuine, potential solutions, in contrast to searching for ideas that "may" work in a happenstance way. It is a patent-database-backed methodology that helps to reduce time spent on the problem,…
ERIC Educational Resources Information Center
Cormas, Peter C.
2016-01-01
Preservice teachers (N = 27) in two sections of a sequenced, methodological and process integrated mathematics/science course solved a levers problem with three similar learning processes and a problem-solving approach, and identified a problem-solving approach through one different learning process. Similar learning processes used included:…
A TAPS Interactive Multimedia Package to Solve Engineering Dynamics Problem
ERIC Educational Resources Information Center
Sidhu, S. Manjit; Selvanathan, N.
2005-01-01
Purpose: To expose engineering students to using modern technologies, such as multimedia packages, to learn, visualize and solve engineering problems, such as in mechanics dynamics. Design/methodology/approach: A multimedia problem-solving prototype package is developed to help students solve an engineering problem in a step-by-step approach. A…
Hardware proofs using EHDM and the RSRE verification methodology
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Sjogren, Jon A.
1988-01-01
Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE), in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and the proofs suggest ways of improving both the RSRE methodology and the EHDM system.
Mjøsund, Nina Helen; Eriksson, Monica; Espnes, Geir Arild; Haaland-Øverby, Mette; Jensen, Sven Liang; Norheim, Irene; Kjus, Solveig Helene Høymork; Portaasen, Inger-Lill; Vinje, Hege Forbech
2017-01-01
The aim of this study was to examine how service user involvement can contribute to the development of interpretative phenomenological analysis methodology and enhance research quality. Interpretative phenomenological analysis is a qualitative methodology used in nursing research internationally to understand human experiences that are essential to the participants. Service user involvement is requested in nursing research. We share experiences from 4 years of collaboration (2012-2015) on a mental health promotion project, which involved an advisory team. Five research advisors either with a diagnosis or related to a person with severe mental illness constituted the team. They collaborated with the research fellow throughout the entire research process and have co-authored this article. We examined the joint process of analysing the empirical data from interviews. Our analytical discussions were audiotaped, transcribed and subsequently interpreted following the guidelines for good qualitative analysis in interpretative phenomenological analysis studies. The advisory team became 'the researcher's helping hand'. Multiple perspectives influenced the qualitative analysis, which gave more insightful interpretations of nuances, complexity, richness or ambiguity in the interviewed participants' accounts. The outcome of the service user involvement was increased breadth and depth in findings. Service user involvement improved the research quality in a nursing research project on mental health promotion. The interpretative element of interpretative phenomenological analysis was enhanced by the emergence of multiple perspectives in the qualitative analysis of the empirical data. We argue that service user involvement and interpretative phenomenological analysis methodology can mutually reinforce each other and strengthen qualitative methodology. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
How Root Cause Analysis Can Improve the Value Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wixson, James Robert
2002-05-01
Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether that is high cost/poor value, poor quality, or poor reliability. Once the most probable causes of these problems have been identified, better solutions for improvement can be developed in the creativity phase, because the team better understands the problems associated with these functions.
Helicopter-V/STOL dynamic wind and turbulence design methodology
NASA Technical Reports Server (NTRS)
Bailey, J. Earl
1987-01-01
Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940's and 1950's. The areas of helicopter dynamic wind and turbulence modeling, and of vehicle response to severe dynamic wind inputs (microburst-type phenomena) during takeoff and landing, remain major unsolved design problems, owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.
Ambient Vibration Testing for Story Stiffness Estimation of a Heritage Timber Building
Min, Kyung-Won; Kim, Junhee; Park, Sung-Ah; Park, Chan-Soo
2013-01-01
This paper investigates the dynamic characteristics of a historic wooden structure by ambient vibration testing, presenting a novel methodology for estimating story stiffness for the purpose of vibration-based structural health monitoring. In the ambient vibration testing, measured structural responses are analyzed by two output-only system identification methods (i.e., frequency domain decomposition and stochastic subspace identification) to estimate modal parameters. The proposed story stiffness estimation is based on an eigenvalue problem derived from a vibratory rigid body model. Using the identified natural frequencies, the eigenvalue problem is efficiently solved and uniquely yields the story stiffness. It is noteworthy that application of the proposed methodology is not necessarily confined to the wooden structure exemplified in the paper.
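The rigid-body model and the building's data are not reproduced in the abstract; the following toy shows the inverse-eigenvalue idea on a two-storey shear model with invented masses and identified frequencies: find the storey stiffnesses whose model frequencies match the measured ones.

```python
# Inverse eigenvalue sketch: storey stiffnesses from identified frequencies
# of a 2-storey lumped-mass shear model.
import numpy as np
from scipy.optimize import fsolve

m = np.array([2000.0, 1500.0])                 # storey masses, kg (assumed)
f_identified = np.array([2.1, 5.6])            # Hz, from ambient vibration data

def model_freqs(k):
    k1, k2 = k
    K = np.array([[k1 + k2, -k2],
                  [-k2, k2]])                  # shear-building stiffness matrix
    M = np.diag(m)
    w2 = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
    return np.sqrt(w2) / (2 * np.pi)           # natural frequencies in Hz

k = fsolve(lambda k: model_freqs(k) - f_identified, x0=[1e6, 1e6])
print('storey stiffnesses [N/m]:', k.round(0))
print('check freqs [Hz]:', model_freqs(k).round(3))
```

With as many identified frequencies as unknown stiffnesses, the match is unique, which is the property the paper exploits.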
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
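A hedged toy of the underlying formulation (not the unilevel or decoupled algorithms themselves): minimise a mass-like cost subject to a chance constraint P[failure] <= 1%. Because the toy's random demand enters linearly, the chance constraint collapses to a deterministic quantile constraint; the methodologies above target the general case, where FORM-style reliability analyses replace this shortcut.

```python
# Toy RBDO: minimise d1 + d2 subject to P[capacity d1*d2 < random demand] <= 1%.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)
demand = 5.0 + rng.standard_normal(20000) + 0.5 * rng.standard_normal(20000)
q99 = np.quantile(demand, 0.99)        # demand exceeded only 1% of the time

res = minimize(lambda d: d[0] + d[1], x0=[3.0, 3.0],
               constraints=[{'type': 'ineq',
                             'fun': lambda d: d[0] * d[1] - q99}],
               bounds=[(0.5, 10.0)] * 2, method='SLSQP')
d_opt = res.x
p_fail = np.mean(d_opt[0] * d_opt[1] <= demand)   # Monte Carlo check
print('design:', d_opt.round(3), ' estimated failure probability:', p_fail)
```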
Craig, Hugh; Berretta, Regina; Moscato, Pablo
2016-01-01
In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in graphs by using the Jensen-Shannon distance, a dissimilarity measure originating in information theory. Moreover, we use graph-theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains the frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high-quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
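The pipeline described above can be sketched in a few lines. Note the substitutions: the authors' memetic algorithm (iMA-Net) is replaced here by networkx's greedy modularity heuristic as a stand-in, and the tiny frequency matrix is a made-up placeholder rather than the Shakespearean corpus:

```python
# Jensen-Shannon distances between word-frequency profiles
# -> k-nearest-neighbour proximity graph -> modularity-based communities.
import numpy as np
import networkx as nx
from scipy.spatial.distance import jensenshannon
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
freq = rng.random((8, 50))                 # 8 "plays" x 50 word frequencies
freq /= freq.sum(axis=1, keepdims=True)    # normalize rows to distributions

n, k = freq.shape[0], 3
G = nx.Graph()
for i in range(n):
    d = [jensenshannon(freq[i], freq[j]) for j in range(n)]
    for j in np.argsort(d)[1:k + 1]:       # k nearest neighbours (skip self)
        G.add_edge(i, int(j), weight=1.0 - d[int(j)])  # similarity as weight

clusters = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in clusters])
```

The conversion of clustering into community detection is exactly this graph construction step; the quality of the final clusters then rests on the modularity optimizer used.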
ERIC Educational Resources Information Center
Spillane, James P.; Camburn, Eric M.; Pustejovsky, James; Pareja, Amber Stitziel; Lewis, Geoff
2008-01-01
Purpose: This paper is concerned with the epistemological and methodological challenges involved in studying the distribution of leadership across people within the school--the leader-plus aspect of a distributed perspective, which it aims to investigate. Design/methodology/approach: The paper examines the entailments of the distributed…
[Problem-based learning, a strategy to employ it].
Guillamet Lloveras, Ana; Celma Vicente, Matilde; González Carrión, Pilar; Cano-Caballero Gálvez, Ma Dolores; Pérez Ramírez, Francisca
2009-02-01
The Virgen de las Nieves University School of Nursing has adopted the methodology of Problem-Based Learning (ABP in its Spanish acronym) as a supplementary method for gaining specific transversal competencies. In so doing, all basic required subjects necessary for a degree have been partially affected. With the objective of identifying and managing the structural and cultural barriers which could impede the success or effectiveness of its adoption, a strategic analysis of the School was carried out. This analysis was based on: a) identifying the School's strengths and weaknesses with respect to adopting the Problem-Based Learning methodology; b) describing the structural problems and needs involved in carrying out this teaching innovation; c) discovering professors' needs regarding knowledge and skills related to Problem-Based Learning; d) preparing students by informing them about the characteristics of Problem-Based Learning; e) evaluating the results obtained by means of professor and student opinions; and f) adopting the improvements identified. The stages followed were: strategic analysis, preparation, pilot program, adoption, and evaluation.
A Nursing Process Methodology.
ERIC Educational Resources Information Center
Ryan-Wenger, Nancy M.
1990-01-01
A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)
Software production methodology tested project
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1976-01-01
The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.
STUDYING FOREST ROOT SYSTEMS - AN OVERVIEW OF METHODOLOGICAL PROBLEMS
The study of tree root systems is central to understanding forest ecosystem carbon and nutrient cycles, nutrient and water uptake, C allocation patterns by trees, soil microbial populations, adaptation of trees to stress, soil organic matter production, etc. Methodological problems…
The mandate for a proper preservation in histopathological tissues.
Comănescu, Maria; Arsene, D; Ardeleanu, Carmen; Bussolati, G
2012-01-01
A sequence of technically reproducible procedures is mandatory to guarantee proper preservation of tissues and to build the basis for sound diagnoses. However, while the goal of these procedures was, until recently, to assure only structural (histological and cytological) preservation, an appropriate preservation of antigenic properties and of nucleic acid integrity is now additionally required, in order to permit pathologists to provide the biological information necessary for the adoption of personalized therapies. The present review analyses the sequence of technical steps open to critical variations. Steps such as dehydration, paraffin embedding, sectioning and staining are relatively well standardized and allow the adoption of dedicated (automatic) apparatuses, while other pre-analytical steps, i.e. the time and modalities of transfer of surgical specimens from the surgical theatre to the pathology laboratory (the so-called "ischemia time") and the type and length of fixation, are not standardized and are a potential cause of discrepancies in diagnostic results. Our group is involved in European-funded projects tackling these problems, with the concrete objective of implementing a model of effective tumor investigation by high-performance genetic and molecular methodologies. The problem of the discrepant quality of histopathological and cytological preparations was addressed in a project involving five European countries and exploiting the potential of "virtual slide technology". Concrete issues, techniques and pitfalls, as well as proposed guidelines for processing the tissues, are presented in this review.
Transport dissipative particle dynamics model for mesoscopic advection-diffusion-reaction problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhen, Li; Yazdani, Alireza; Tartakovsky, Alexandre M.
2015-07-07
We present a transport dissipative particle dynamics (tDPD) model for simulating mesoscopic problems involving advection-diffusion-reaction (ADR) processes, along with a methodology for implementation of the correct Dirichlet and Neumann boundary conditions in tDPD simulations. tDPD is an extension of the classic DPD framework with extra variables for describing the evolution of concentration fields. The transport of concentration is modeled by a Fickian flux and a random flux between particles, and an analytical formula is proposed to relate the mesoscopic concentration friction to the effective diffusion coefficient. To validate the present tDPD model and the boundary conditions, we perform three tDPD simulations of one-dimensional diffusion with different boundary conditions, and the results show excellent agreement with the theoretical solutions. We also performed two-dimensional simulations of ADR systems and the tDPD simulations agree well with the results obtained by the spectral element method. Finally, we present an application of the tDPD model to the dynamic process of blood coagulation involving 25 reacting species in order to demonstrate the potential of tDPD in simulating biological dynamics at the mesoscale. We find that the tDPD solution of this comprehensive 25-species coagulation model is only twice as computationally expensive as the DPD simulation of the hydrodynamics only, which is a significant advantage over available continuum solvers.
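As a rough guide to the structure of the model (notation simplified; the paper defines the exact weight functions and coefficients), the concentration carried by particle i evolves through the pairwise Fickian and random fluxes mentioned above plus a source term:

\[
\frac{dC_i}{dt} \;=\; \sum_{j \neq i} \Big[ \underbrace{-\kappa\, w_D(r_{ij})\,(C_i - C_j)}_{\text{Fickian flux}} \;+\; \underbrace{\epsilon\, w_R(r_{ij})\, \xi_{ij}\, \Delta t^{-1/2}}_{\text{random flux}} \Big] \;+\; Q_i^S,
\]

where \(w_D\) and \(w_R\) are distance-dependent weight functions, \(\xi_{ij}\) is a symmetric Gaussian noise term, \(Q_i^S\) is a source/reaction term, and the concentration friction \(\kappa\) maps to the effective diffusion coefficient through the analytical relation referred to in the abstract.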
Sinha, Sunny
2016-01-01
While much has been said about the risks and safety issues experienced by female sex workers in India, there is a considerable dearth of information about the difficulties and problems that sex work researchers, especially female researchers, experience when navigating the highly political, ideological, and stigmatized environment of the Indian sex industry. As noted by scholars, there are several methodological and ethical issues involved with sex work research, such as privacy and confidentiality of the participants, representativeness of the sample, and informed consent. Yet, there has been reluctance among scholars to comment on their research process, especially with regard to how they deal with the protocols for research ethics when conducting social and behavioral epidemiological studies among female sex workers in India and elsewhere. Drawing on my 7 months of field-based ethnographic research with “flying” or non-brothel-based female sex workers in Kolkata, India, I provide in this article a reflexive account of the problems encountered in implementing the research process, particularly the ethical and safety issues involved in gaining access and acceptance into the sex industry and establishing contact and rapport with the participants. In doing so, it is my hope that future researchers can develop the knowledge necessary for the design of ethical and non-exploitative research projects with sex workers. PMID:27651071
Project management practices in engineering university
NASA Astrophysics Data System (ADS)
Sirazitdinova, Y.; Dulzon, A.; Mueller, B.
2015-10-01
The article presents an analysis of the use of project management methodology at Tomsk Polytechnic University, in particular the experience with the course "Project Management", which started 15 years ago. The article discusses the advantages of project management methodology for engineering education and for administration of the university in general, as well as the problems impeding extensive implementation of this methodology in teaching, research, and management at the university.
NASA Technical Reports Server (NTRS)
David, J. W.; Mitchell, L. D.
1982-01-01
Difficulties arise in the solution methodology to be used to deal with the highly nonlinear rotor equations that result when dynamic coupling is included. A solution methodology is selected to solve the nonlinear differential equations. The selected method was verified to give good results even at large nonlinearity levels. The transfer matrix methodology is extended to the solution of nonlinear problems.
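For context, the linear transfer matrix method that the paper extends propagates a station state vector (e.g., deflection, slope, bending moment, shear) along the rotor; this schematic is generic rather than the paper's exact formulation:

\[
z_{i+1} = T_i(\omega)\, z_i, \qquad z_N = \Big( \prod_{i=N-1}^{0} T_i(\omega) \Big)\, z_0,
\]

where imposing the boundary conditions at both ends of the shaft yields a frequency determinant whose roots are the natural frequencies. In one common nonlinear extension, the elements of \(T_i\) depend on the response amplitude, so this linear pass is iterated until the assumed and computed amplitudes agree.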
A study of the performance of patients with frontal lobe lesions in a financial planning task.
Goel, V; Grafman, J; Tajik, J; Gana, S; Danto, D
1997-10-01
It has long been argued that patients with lesions in the prefrontal cortex have difficulties in decision making and problem solving in real-world, ill-structured situations, particularly problem types involving planning and look-ahead components. Recently, several researchers have questioned our ability to capture and characterize these deficits adequately using just the standard neuropsychological test batteries, and have called for tests that reflect real-world task requirements more accurately. We present data from 10 patients with focal lesions to the prefrontal cortex and 10 normal control subjects engaged in a real-world financial planning task. We also introduce a theoretical framework and methodology developed in the cognitive science literature for quantifying and analysing the complex data generated by problem-solving tasks. Our findings indicate that patient performance is impoverished at a global level but not at the local level. Patients have difficulty in organizing and structuring their problem space. Once they begin problem solving, they have difficulty in allocating adequate effort to each problem-solving phase. Patients also have difficulty dealing with the fact that there are no right or wrong answers nor official termination points in real-world planning problems. They also find it problematic to generate their own feedback. They invariably terminate the session before the details are fleshed out and all the goals satisfied. Finally, patients do not take full advantage of the fact that constraints on real-world problems are negotiable. However, it is not necessary to postulate a 'planning' deficit. It is possible to understand the patients' difficulties in real world planning tasks in terms of the following four accepted deficits: inadequate access to 'structured event complexes', difficulty in generalizing from particulars, failure to shift between 'mental sets', and poor judgment regarding adequacy and completeness of a plan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krummel, J.R.; Markin, J.B.; O'Neill, R.V.
Regional analyses of the interaction between human populations and natural resources must integrate landscape scale environmental problems. An approach that considers human culture, environmental processes, and resource needs offers an appropriate methodology. With this methodology, we analyze problems of food availability in African cattle-keeping societies. The analysis interrelates cattle biomass, forage availability, milk and blood production, crop yields, gathering, food subsidies, population, and variable precipitation. While an excess of cattle leads to overgrazing, cattle also serve as valuable food storage mechanisms during low rainfall periods. Food subsidies support higher population levels but do not alter drought-induced population fluctuations. Variable precipitation patterns require solutions that stabilize year-to-year food production and also address problems of overpopulation.
Automating the packing heuristic design process with genetic programming.
Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John
2012-01-01
The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
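The core idea of automating heuristic design can be illustrated with a toy genetic program that evolves a bin-scoring expression for one-dimensional packing. This is a from-scratch sketch, not the authors' system; the primitive set, GA parameters, and random training instances are all assumptions:

```python
# Evolve an expression tree that scores candidate bins; items go to the
# feasible bin with the highest evolved score (illustrative GP, 1D packing).
import random

CAPACITY = 1.0
PRIMS = ["+", "-", "*"]          # binary operators
TERMS = ["free", "size"]         # bin free space, item size

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return [random.choice(PRIMS), random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(tree, free, size):
    if tree == "free":
        return free
    if tree == "size":
        return size
    op, a, b = tree
    x, y = evaluate(a, free, size), evaluate(b, free, size)
    return x + y if op == "+" else x - y if op == "-" else x * y

def pack(tree, items):
    """Pack items sequentially; return the number of bins opened."""
    bins = []                                  # current load of each bin
    for s in items:
        feasible = [b for b in range(len(bins)) if bins[b] + s <= CAPACITY]
        if feasible:
            best = max(feasible, key=lambda b: evaluate(tree, CAPACITY - bins[b], s))
            bins[best] += s
        else:
            bins.append(s)
    return len(bins)

def mutate(tree):
    if random.random() < 0.3:
        return random_tree(2)                  # replace subtree
    if isinstance(tree, str):
        return tree
    return [tree[0], mutate(tree[1]), mutate(tree[2])]

random.seed(1)
train = [[random.uniform(0.1, 0.7) for _ in range(50)] for _ in range(5)]
pop = [random_tree() for _ in range(40)]
for gen in range(30):
    pop.sort(key=lambda t: sum(pack(t, inst) for inst in train))  # fewer bins = fitter
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(20)]
print("best heuristic uses", sum(pack(pop[0], i) for i in train), "bins on training set")
```

Fitness here is simply the total number of bins used on training instances; the paper's system is far richer, but the division of labour is the same: the GP searches the space of scoring rules, while packing itself stays a fixed greedy procedure.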
INTEGRATION OF POLLUTION PREVENTION TOOLS
A prototype computer-based decision support system was designed to provide small businesses with an integrated pollution prevention methodology. Preliminary research involved compilation of an inventory of existing pollution prevention tools (i.e., methodologies, software, etc.),...
The Dogma of "The" Scientific Method.
ERIC Educational Resources Information Center
Wivagg, Dan; Allchin, Douglas
2002-01-01
Points out major problems with the scientific method as a model for learning about methodology in science and suggests teaching about the scientists' toolbox to remedy problems with the conventional scientific method. (KHR)
Fuzzy multi objective transportation problem – evolutionary algorithm approach
NASA Astrophysics Data System (ADS)
Karthy, T.; Ganesan, K.
2018-04-01
This paper deals with the fuzzy multi-objective transportation problem. A fuzzy optimal compromise solution is obtained by using a Fuzzy Genetic Algorithm. A numerical example is provided to illustrate the methodology.
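As background, a generic multi-objective transportation problem with fuzzy costs can be written as follows; the paper's exact fuzzy-number representation and GA operators are not specified in this abstract, so the formulation below is a standard textbook form rather than the authors':

\[
\min \; Z_k = \sum_{i=1}^{m} \sum_{j=1}^{n} \tilde{c}^{(k)}_{ij}\, x_{ij}, \quad k = 1,\dots,K,
\qquad \text{s.t.} \quad \sum_{j=1}^{n} x_{ij} = a_i, \quad \sum_{i=1}^{m} x_{ij} = b_j, \quad x_{ij} \ge 0,
\]

where the \(\tilde{c}^{(k)}_{ij}\) are fuzzy unit costs (e.g., triangular fuzzy numbers). A common compromise strategy is the max-min (Zimmermann-type) approach, which maximizes the smallest membership degree \(\lambda\) attained across the \(K\) fuzzified objectives, with the genetic algorithm searching the feasible flows for high \(\lambda\).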
Evaluating Writing Programs: Paradigms, Problems, Possibilities.
ERIC Educational Resources Information Center
McLeod, Susan H.
1992-01-01
Describes two methodological approaches (qualitative and quantitative) that grow out of two different research examples. Suggests the problems these methods present. Discusses the ways in which an awareness of these problems can help teachers to understand how to work with researchers in designing useful evaluations of writing programs. (PRA)
[Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].
2012-01-01
The article covers topical problems of workers' health preservation. The results of complex research enabled the evaluation and analysis of occupational risks in leading industries of Kazakhstan, with a view to improving scientific and methodologic approaches to medical management for workers exposed to hazardous conditions.
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Terminology and Methodology Related to the Use of Heart Rate Responsivity in Infancy Research
ERIC Educational Resources Information Center
Woodcock, James M.
1971-01-01
Methodological problems in measuring and interpreting infantile heart rate reactivity in research are discussed. Various ways of describing cardiac activity are listed. Attention is given to the relationship between resting state and heart rate responsivity. (Author/WY)
Employee Turnover: An Empirical and Methodological Assessment.
ERIC Educational Resources Information Center
Muchinsky, Paul M.; Tuttle, Mark L.
1979-01-01
Reviews research on the prediction of employee turnover. Groups predictor variables into five general categories: attitudinal (job satisfaction), biodata, work-related, personal, and test-score predictors. Consistent relationships between common predictor variables and turnover were found for four categories. Eight methodological problems/issues…
ERIC Educational Resources Information Center
Rynders, John E.; And Others
1978-01-01
For many years, the educational capabilities of Down's syndrome persons have been underestimated because a large number of studies purporting to give an accurate picture of Down's syndrome persons' developmental capabilities have had serious methodological flaws. (Author)
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, and the provision of a solution methodology that computes an input schedule yielding minimum total violation of the due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute the input schedule for the pipeline problem. This algorithm runs in O(T·E) time. The dissertation also extends the study to examine operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used for single-source pipeline problems is introduced; it also runs in O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions: only 25% of the problems tested were more than 30% above the optimal values, and approximately 40% of the tested problems were solved optimally by the algorithms.
NASA Astrophysics Data System (ADS)
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
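For orientation, a one-dimensional FPE for the probability density \(p(v, t)\) of a flow variable \(v\) has the generic drift-diffusion form below; the paper derives its own coefficients from the characteristic form of the Saint-Venant equations under the uncertain roughness coefficient, so this is schematic rather than the study's exact equation:

\[
\frac{\partial p(v,t)}{\partial t} = -\frac{\partial}{\partial v}\big[A(v,t)\, p(v,t)\big] + \frac{1}{2}\,\frac{\partial^2}{\partial v^2}\big[B(v,t)\, p(v,t)\big],
\]

where the drift \(A\) and diffusivity \(B\) encapsulate the ensemble statistics of the uncertain parameter. A single deterministic solve for \(p\) then yields ensemble statistics directly, e.g. \(\mathbb{E}[v](t) = \int v\, p(v,t)\, dv\), instead of averaging over many Monte Carlo realizations.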
NASA Astrophysics Data System (ADS)
Neumann, Karl
1987-06-01
In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved using quantitative methods. The multifaceted aspects of human behaviour and all their environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, research generally has to start out from complex reciprocal social interactions rather than from unambiguous causal connections. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages in, and the necessity for, using qualitative research tools.
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background: Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that a product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the rapid product development life-cycles associated with the competitive connected health industry. Objective: The aim of this study was to apply a structured HCD methodology to the development of a smartphone app that was to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study provides prospective designers with a detailed description of the application of an HCD methodology. Methods: A 3-phase methodology was applied. In the first phase, a descriptive "use case" was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app and how various actors would interact with it and in what context. A working app prototype and a user manual were then developed based on this feedback and were subjected to a rigorous usability inspection. Further changes were made both to the interface and to the support documentation. The now advanced prototype was exposed to user testing by end users, where further design recommendations were made. Results: Where combined expert and end-user analysis of a comprehensive use case had originally identified 21 problems with the system interface, we observed only 3 of these problems in user testing, implying that 18 problems were eliminated between phases 1 and 3. Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users shows that the system imposes low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). Conclusions: From our observation of older adults' interactions with smartphone interfaces, some recurring themes emerged. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. PMID:28559227
Interactive instruction in otolaryngology resident education.
Schweinfurth, John M
2007-12-01
Today's academic faculty was typically trained under an education system based entirely on didactic lectures. However, if the aim is to teach thinking or change attitudes beyond the simple transmission of factual knowledge, then lectures alone, without active involvement of the students, are not the most effective method of teaching. If the goals of teaching are to arouse and keep students' interest, to convey facts and details, to make students think critically about the subject, and to prepare them for independent study by demonstrating problem solving and professional reasoning, then only two of these purposes are suited to didactic lectures. The problem, then, is how to organize lecture material so that individual students' learning needs are better addressed. The education literature suggests that instruction include a variety of activities designed to stimulate individual thought. These activities include small group discussion, working problems during lecture time, questions included in the lecture, and quizzes at the end of lecture, among others. The current study was undertaken to examine the feasibility of using these types of interactive learning techniques in an otolaryngology residency program. Possibilities considered in the current study include standard interactive lecturing, facilitated discussion, brainstorming, small group activities, problem solving, competitive large group exercises, and the use of illustrative cliffhanger and incident cases. The feasibility of these methodologies being effectively incorporated into a residency curriculum is discussed.
ERIC Educational Resources Information Center
Magnussen, Rikke; Zachariassen, Maria; Kharlamov, Nikita; Larsen, Birger
2017-01-01
This paper presents a methodological discussion of the potential and challenges of involving mobile eye tracking technology in studies of knowledge generation and learning in a science centre context. The methodological exploration is based on eye-tracking studies of audience interaction and knowledge generation in the technology-enhanced health…
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
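An illustrative stand-in for the kind of computable sequential rule discussed above is Wald's sequential probability ratio test, shown here deciding between a "no failure" and a "failure" residual distribution. The paper designs different (Bayes-based) suboptimal rules; this sketch, its Gaussian residual model, and its thresholds are assumptions chosen for illustration:

```python
# SPRT on detection-filter residuals: H0 ~ N(0,1) (normal), H1 ~ N(1,1) (failed).
import math
import random

def sprt(residuals, mu0=0.0, mu1=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)    # declare failure above this
    lower = math.log(beta / (1 - alpha))    # declare normal below this
    llr = 0.0
    for t, r in enumerate(residuals, start=1):
        # log-likelihood ratio increment for unit-variance Gaussians
        llr += (mu1 - mu0) * (r - (mu0 + mu1) / 2.0)
        if llr >= upper:
            return "failure", t
        if llr <= lower:
            return "normal", t
    return "undecided", len(residuals)

random.seed(0)
faulty = [random.gauss(1.0, 1.0) for _ in range(200)]   # failed-mode residuals
print(sprt(faulty))   # e.g. ('failure', t) after a handful of samples
```

The trade-off mirrors the one in the abstract: the thresholds fix the error probabilities in advance, and the rule trades detection delay against false-alarm rate rather than computing the full (intractable) Bayes-optimal stopping region.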
Development of Contemporary Problem-Based Learning Projects in Particle Technology
ERIC Educational Resources Information Center
Harris, Andrew T.
2009-01-01
The University of Sydney has offered an undergraduate course in particle technology using a contemporary problem based learning (PBL) methodology since 2005. Student learning is developed through the solution of complex, open-ended problems drawn from modern chemical engineering practice. Two examples are presented; i) zero emission electricity…
The Study of Socio-Biospheric Problems.
ERIC Educational Resources Information Center
Scott, Andrew M.
Concepts, tools, and a methodology are needed which will permit the analysis of emergent socio-biospheric problems and facilitate their effective management. Many contemporary problems may be characterized as socio-biospheric; for example, pollution of the seas, acid rain, the growth of cities, and an atmosphere loaded with carcinogens. However,…