Sample records for main methodological problems

  1. [Methodology of Screening New Antibiotics: Present Status and Prospects].

    PubMed

    Trenin, A S

    2015-01-01

    Due to the widespread distribution of pathogen resistance to available pharmaceuticals and the serious problems encountered in the treatment of various infectious and tumor diseases, new antibiotics are urgently needed. The review considers the basic methodological approaches to the chemical synthesis of antibiotics and to the screening of new antibiotics among natural products, mainly microbial secondary metabolites. Because natural compounds are highly diverse, screening such substances offers a good opportunity to discover antibiotics of various chemical structures and mechanisms of action. This approach, followed by chemical or biological transformation, is capable of providing health care with new effective pharmaceuticals. The review concentrates mainly on the screening of natural products and its methodological problems: isolation of microbial producers from their habitats, cultivation of microorganisms producing appropriate substances, isolation and chemical characterization of microbial metabolites, and identification of the biological activity of the metabolites. Particular attention is paid to the problems of microbial secondary metabolism and to the design of new models for screening biologically active compounds. Recent achievements in the field of antibiotics and the most promising approaches for future investigations are discussed. The main methodological approach, isolation and cultivation of the producers, remains relevant and requires constant improvement. Screening efficiency can be increased by more rapid chemical identification of antibiotics and by the design of new screening models based on the detection of biological activity.

  2. A Social-Medical Approach to Violence in Colombia

    PubMed Central

    Franco, Saul

    2003-01-01

    Violence is the main public health problem in Colombia. Many theoretical and methodological approaches to solving this problem have been attempted from different disciplines. My past work has focused on homicide violence from the perspective of social medicine. In this article I present the main conceptual and methodological aspects and the chief findings of my research over the past 15 years. Findings include a quantitative description of the current situation and the introduction of the category of explanatory contexts as a contribution to the study of Colombian violence. The complexity and severity of this problem demand greater theoretical discussion, more plans for action and a faster transition between the two. Social medicine may make a growing contribution to this field. PMID:14652328

  3. A social-medical approach to violence in Colombia.

    PubMed

    Franco, Saul

    2003-12-01

    Violence is the main public health problem in Colombia. Many theoretical and methodological approaches to solving this problem have been attempted from different disciplines. My past work has focused on homicide violence from the perspective of social medicine. In this article I present the main conceptual and methodological aspects and the chief findings of my research over the past 15 years. Findings include a quantitative description of the current situation and the introduction of the category of explanatory contexts as a contribution to the study of Colombian violence. The complexity and severity of this problem demand greater theoretical discussion, more plans for action and a faster transition between the two. Social medicine may make a growing contribution to this field.

  4. Finite element analysis of steady and transiently moving/rolling nonlinear viscoelastic structure. III - Impact/contact simulations

    NASA Technical Reports Server (NTRS)

    Nakajima, Yukio; Padovan, Joe

    1987-01-01

    In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving problems involving contact/impact-type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered, including the rolling/sliding impact of tires with road obstructions.

  5. Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.

    PubMed

    Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin

    2017-08-16

    The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.

  6. Methodology of Comparative Analysis of Public School Teachers' Continuing Professional Development in Great Britain, Canada and the USA

    ERIC Educational Resources Information Center

    Mukan, Nataliya; Kravets, Svitlana

    2015-01-01

    In the article the methodology of comparative analysis of public school teachers' continuing professional development (CPD) in Great Britain, Canada and the USA has been presented. The main objectives are defined as theoretical analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research;…

  7. A Methodological Approach for Training Analysts of Small Business Problems.

    ERIC Educational Resources Information Center

    Mackness, J. R.

    1986-01-01

    Steps in a small business analysis are discussed: understand how company activities interact internally and with markets and suppliers; know the relative importance of controllable management variables; understand the social atmosphere within the company; analyze the operations of the company; define main problem areas; identify possible actions…

  8. An integrated methodology to assess the benefits of urban green space.

    PubMed

    De Ridder, K; Adamec, V; Bañuelos, A; Bruse, M; Bürger, M; Damsgaard, O; Dufek, J; Hirsch, J; Lefebre, F; Pérez-Lacorzana, J M; Thierry, A; Weber, C

    2004-12-01

    The interrelated issues of urban sprawl, traffic congestion, noise, and air pollution are major socioeconomic problems faced by most European cities. A methodology is currently being developed for evaluating the role of green space and urban form in alleviating the adverse effects of urbanisation, mainly focusing on the environment but also accounting for socioeconomic aspects. The objectives and structure of the methodology are briefly outlined and illustrated with preliminary results obtained from case studies performed on several European cities.

  9. Capturing security requirements for software systems.

    PubMed

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-07-01

    Security is often an afterthought during software development. Addressing security early, especially in the requirements phase, is important so that security problems can be tackled before the process goes further, avoiding rework. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. Its main goal is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have used evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited with this methodology, in addition to the assistance it offers developers in eliciting security requirements in a more systematic way.

  10. Capturing security requirements for software systems

    PubMed Central

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-01-01

    Security is often an afterthought during software development. Addressing security early, especially in the requirements phase, is important so that security problems can be tackled before the process goes further, avoiding rework. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. Its main goal is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have used evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited with this methodology, in addition to the assistance it offers developers in eliciting security requirements in a more systematic way. PMID:25685514

  11. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.

  12. A strategy for developing a launch vehicle system for orbit insertion: Methodological aspects

    NASA Astrophysics Data System (ADS)

    Klyushnikov, V. Yu.; Kuznetsov, I. I.; Osadchenko, A. S.

    2014-12-01

    The article addresses methodological aspects of a development strategy to design a launch vehicle system for orbit insertion. The development and implementation of the strategy are broadly outlined. An analysis is provided of the criterial base and input data needed to define the main requirements for the launch vehicle system. Approaches are suggested for solving individual problems in working out the launch vehicle system development strategy.

  13. Problem-Based Learning on Students' Critical Thinking Skills in Teaching Business Education in Malaysia: A Literature Review

    ERIC Educational Resources Information Center

    Zabit, Mohd Nazir Md

    2010-01-01

    This review forms the background to explore and to gain empirical support among lecturers to improve the students' critical thinking skills in business education courses in Malaysia, in which the main teaching and learning methodology is Problem-Based Learning (PBL). The PBL educational approach is known to have maximum positive impacts in…

  14. Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time.

    PubMed

    Avellar, Gustavo S C; Pereira, Guilherme A S; Pimenta, Luciano C A; Iscold, Paulo

    2015-11-02

    This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem's (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles' maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs.
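The vehicle-count trade-off described in this abstract (each extra UAV adds setup time but splits the flight work) can be illustrated with a toy calculation. This is a hypothetical sketch, not the paper's actual mixed integer linear program; the function name, the even-split assumption, and the sequential-launch assumption are ours.

```python
# Toy sketch: choosing how many UAVs to use when each extra vehicle adds
# setup overhead but splits the coverage work, subject to per-UAV endurance.
# Assumed inputs: total flight time needed by one UAV to cover the area,
# setup time per launch, and the maximum flight time one UAV can sustain.

def best_team_size(total_flight_time, setup_time, max_flight_time, max_uavs=10):
    """Return (k, mission_time) minimizing completion time over feasible k."""
    best = None
    for k in range(1, max_uavs + 1):
        per_uav = total_flight_time / k          # work split evenly (assumption)
        if per_uav > max_flight_time:            # endurance constraint
            continue
        # UAVs are launched one after another, so the last one starts after
        # k setup intervals and then flies its share of the area.
        mission_time = k * setup_time + per_uav
        if best is None or mission_time < best[1]:
            best = (k, mission_time)
    return best

print(best_team_size(total_flight_time=60, setup_time=5, max_flight_time=25))
```

With these numbers, one or two UAVs are infeasible (30 min each exceeds the 25 min endurance), and three UAVs finish the mission fastest.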

  15. Teaching Geomorphology at University

    ERIC Educational Resources Information Center

    Sugden, David; Hamilton, Patrick

    1978-01-01

    Geomorphology courses in British universities emphasize the main landform/process systems rather than more abstract concepts. Recommends a more theoretical focus on fundamental geomorphic processes and methodological problems. Available from: Faculty of Modern Studies, Oxford Polytechnic, Headington, Oxford OX3 OBP, England. (Author/AV)

  16. Risk methodology overview. [for carbon fiber release]

    NASA Technical Reports Server (NTRS)

    Credeur, K. R.

    1979-01-01

    Some considerations of risk estimation, how risk is measured, and how risk analysis decisions are made are discussed. Specific problems of carbon fiber release are discussed by reviewing the objective, describing the main elements, and giving an example of the risk logic and outputs.

  17. [Management of asthma in a context of ambulatory pediatrics: relevance and possibility to avoid the problems. Gruppo de lavoro pediatri dell'Abruzzo Basilicata e Puglia].

    PubMed

    Misticoni, G; Marchetti, F; D'Andrea, N

    1994-01-01

    Forty-one pediatricians agreed to register, on a very simple form, all cases of children affected by bronchial asthma seen in their clinics during October 1993. The data included basic information on the therapy prescribed and its duration, a judgement on the efficacy of symptom control, and the main problems encountered with the children and their families. 237 cases were reported (mean age 4.6 years, range 2 months-13 years). 80% of the children were monitored by the pediatrician; 47% had allergic reactions. The main drug used for prophylaxis was ketotifen, a compound without documented efficacy; the main route of drug administration (especially during acute attacks) was oral rather than aerosol, pointing to gaps in health education on practical skills. Indeed, the main problems encountered by doctors related to communication with patients and families. This survey also represents a research model for involving health care providers and for easily and quickly obtaining a useful, methodologically sound and interesting picture of everyday practice.

  18. [Perinatal mortality research in Brazil: review of methodology and results].

    PubMed

    Fonseca, Sandra Costa; Coutinho, Evandro da Silva Freire

    2004-01-01

    The perinatal mortality rate remains a public health problem, demanding epidemiological studies to describe its magnitude and time trends, identify risk factors, and define adequate interventions. There are still methodological controversies, resulting in heterogeneous studies and possible biases. In Brazil, there has been a growing scientific output on this theme, mainly in the South and Southeast of the country. Twenty-four articles from 1996 to 2003 were reviewed, focusing on definitions and classifications, data sources, study designs, measurement of variables, statistical analysis, and results. The review showed an increasing utilization of data bases (mainly SINASC and SIM), few studies on stillbirth, the incorporation of classification schemes, and disagreement concerning risk factors.

  19. Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time

    PubMed Central

    Avellar, Gustavo S. C.; Pereira, Guilherme A. S.; Pimenta, Luciano C. A.; Iscold, Paulo

    2015-01-01

    This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem’s (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles’ maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs. PMID:26540055

  20. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1985-01-01

    This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided and several outstanding issues such as generic allocation and preference assessment are discussed.
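The noninferior (Pareto-optimal) solution set mentioned in this abstract can be sketched in a few lines: a candidate is noninferior if no other candidate is at least as good in every objective. The two-objective (risk, cost) data below are invented for illustration; the paper's actual PRA model and reliability cost functions are far more involved.

```python
# Minimal sketch of extracting the noninferior (Pareto) set from a list of
# candidate designs, where every objective is minimized.

def noninferior(points):
    """Keep points not dominated by any other point (all objectives minimized)."""
    front = []
    for p in points:
        dominated = any(
            q != p and all(q[i] <= p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (risk, cost) candidates
candidates = [(0.10, 500), (0.05, 800), (0.10, 700), (0.02, 1200), (0.05, 900)]
print(noninferior(candidates))
```

Here (0.10, 700) and (0.05, 900) are dominated and drop out; a decision maker's preferences would then only need to be assessed over the three remaining trade-off points.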

  21. Robust optimization modelling with applications to industry and environmental problems

    NASA Astrophysics Data System (ADS)

    Chaerani, Diah; Dewanto, Stanley P.; Lesmana, Eman

    2017-10-01

    Robust Optimization (RO) modeling is an established methodology for handling data uncertainty in optimization problems. The main challenge in RO is how and when the robust counterpart of an uncertain problem can be reformulated as a computationally tractable optimization problem, or at least approximated by a tractable problem. By definition, the robust counterpart depends strongly on the choice of the uncertainty set; consequently, this challenge can be met only if the set is chosen in a suitable way. RO has developed rapidly: in 2004 a new approach called Adjustable Robust Optimization (ARO) was introduced to handle uncertain problems in which some decision variables must be decided as "wait and see" variables, in contrast to classic RO, which models all decision variables as "here and now". In ARO, an uncertain problem can be treated as a multistage decision problem whose later-stage decision variables are of the wait-and-see type. In this paper we present applications of both RO and ARO, briefly summarizing all results to underline the importance of RO and ARO in many real-life problems.
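As a minimal illustration of a tractable robust counterpart, consider a single linear constraint under interval ("box") uncertainty: if each coefficient a_i may deviate from its nominal value by at most delta_i, the worst case of a.x <= b over the box is sum_i (a_nom_i * x_i + delta_i * |x_i|) <= b, which is linear after splitting |x_i|. This is the textbook box-uncertainty case, not the specific models of the paper; all numbers below are invented.

```python
# Sketch: checking robust feasibility of x for the constraint a.x <= b
# when each a_i lies in the interval [a_nom_i - delta_i, a_nom_i + delta_i].
# The worst case over the box is attained at a_i = a_nom_i + delta_i*sign(x_i).

def robust_feasible(x, a_nom, delta, b):
    worst = sum(an * xi + d * abs(xi) for an, xi, d in zip(a_nom, x, delta))
    return worst <= b

x = [1.0, 2.0]
a_nom = [3.0, 1.0]
delta = [0.5, 0.5]
print(robust_feasible(x, a_nom, delta, b=7.0))  # worst case 6.5, feasible
print(robust_feasible(x, a_nom, delta, b=6.0))  # worst case 6.5, infeasible
```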

  22. An overall index of environmental quality in coal mining areas and energy facilities.

    PubMed

    Vatalis, Konstantinos I; Kaliampakos, Demetrios C

    2006-12-01

    An approach to measuring environmental quality and trends in coal mining and industrial areas was attempted in this work. For this purpose, the establishment of a reference scale characterizing the status of environmental quality is proposed by developing an Environmental Quality Index (EQI). The methodology involves three main components: social research, the opinion of environmental experts, and the combination of new or existing indices. A survey of public opinion was carried out to identify the main environmental problems in the region of interest. Environmental experts carried out a survey, and the weights of specific environmental problems were obtained through a fuzzy Delphi method and pairwise comparison. The weight attributed to each environmental problem was computed, using new or existing indices (subindices) in the relevant literature. The EQI comprises a combination of the subindices with their own weights. The methodology was applied to a heavily industrialized coal basin in northwestern Macedonia, Greece. The results show that the new index may be used as a reliable tool for evaluating environmental quality in different areas. In addition, the study of EQI trends on an interannual basis can provide useful information on the efficiency of environmental policies already implemented by the responsible authorities.
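The weighted combination of subindices behind the EQI described above can be sketched as follows. The subindex names, values, and weights here are invented for illustration; in the study the weights come from a fuzzy Delphi method and pairwise comparison by environmental experts.

```python
# Hedged sketch of a weighted-subindex environmental quality index.

def environmental_quality_index(subindices, weights):
    """Weighted combination of normalized subindices (higher = better quality)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(subindices[k] * weights[k] for k in weights)

# Hypothetical subindex scores (0-100) and expert-derived weights
subindices = {"air": 40.0, "water": 70.0, "noise": 55.0}
weights = {"air": 0.5, "water": 0.3, "noise": 0.2}
print(environmental_quality_index(subindices, weights))
```

Tracking this single number on an interannual basis is what the abstract suggests for evaluating the efficiency of environmental policies.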

  23. [The problems of hearing impairment in the flying staff of commercial aviation in Russia].

    PubMed

    Pankova, V B; Bushmanov, A Iu

    2014-01-01

    The authors discuss the problems pertaining to the growing incidence of hearing impairment in the members of the flying staff employed in commercial aviation of Russia and the main criteria used to elucidate the causes behind occupational diseases of the organs of hearing. Special attention is given to the principal normative documents regulating the methodological basis on which the acoustic factor in the aircraft cockpit is evaluated, peculiarities of occupational sensorineural hearing impairment and the methods for its detection. The main errors in the determination of the relationship between the working conditions and the diseases of the organs of hearing are discussed.

  24. A Methodology for Multidisciplinary Decision Making for a Surface Combatant Main Engine Selection Problem

    DTIC Science & Technology

    2014-06-01

    [Extraction fragment from the report's front matter: an acronym glossary and equation (2.3).] Glossary: ... nautical miles per hour; ONR, Office of Naval Research; OPC, overall propulsive coefficient; RS, repairable at sea; SFC, specific fuel consumption; SHP, shaft horsepower. Equation (2.3) gives the fuel mass flow rate as the product of SFC and SHP. Overall Propulsive Coefficient (OPC): the overall propulsive coefficient is equal to the ratio between the effective horsepower (EHP) and the total installed shaft horsepower (SHP) delivered by the main engine [6].

  25. Artificial Intelligence Methodologies in Flight Related Differential Game, Control and Optimization Problems

    DTIC Science & Technology

    1993-01-31

    [Extraction fragment from the report's table of contents.] Topics covered include: Controllability and Observability; Separation of Learning and Control; Linearization via Transformation of Coordinates and Nonlinear Feedback (Main Result, Discussion); Basic Structure of a NLM; General Structure of NNLM; Linear System.

  26. University Students' Understanding of the Concepts Empirical, Theoretical, Qualitative and Quantitative Research

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2015-01-01

    University research education in many disciplines is frequently confronted by problems with students' weak level of understanding of research concepts. A mind map technique was used to investigate how students understand central methodological concepts of empirical, theoretical, qualitative and quantitative. The main hypothesis was that some…

  27. An integer programming approach to a real-world recyclable waste collection problem in Argentina.

    PubMed

    Braier, Gustavo; Durán, Guillermo; Marenco, Javier; Wesner, Francisco

    2017-05-01

    This article reports on the use of mathematical programming techniques to optimise the routes of a recyclable waste collection system servicing Morón, a large municipality outside Buenos Aires, Argentina. The truck routing problem posed by the system is a particular case of the generalised directed open rural postman problem. An integer programming model is developed with a solving procedure built around a subtour-merging algorithm and the addition of subtour elimination constraints. The route solutions generated by the proposed methodology perform significantly better than the previously used, manually designed routes, the main improvement being that coverage of blocks within the municipality with the model solutions is 100% by construction, whereas with the manual routes as much as 16% of the blocks went unserviced. The model-generated routes were adopted by the municipality in 2014 and the national government is planning to introduce the methodology elsewhere in the country.
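The subtour-merging step in the solving procedure described above can be illustrated with a hypothetical helper: given the arcs chosen by an integer-programming solution (one outgoing arc per visited node), split them into disjoint cycles; any cycle disconnected from the depot would then trigger a new subtour elimination constraint. This is a generic sketch on toy data, not the paper's algorithm or the Morón network.

```python
# Sketch of subtour detection: partition a set of (tail, head) arcs, with
# exactly one outgoing arc per node, into the disjoint cycles they form.
# Each cycle not containing the depot would yield an elimination constraint.

def find_subtours(arcs):
    """Split arcs (one successor per node) into the cycles they form."""
    succ = dict(arcs)
    unvisited = set(succ)
    cycles = []
    while unvisited:
        start = next(iter(unvisited))
        cycle, node = [], start
        while node in unvisited:          # follow successors until we loop
            unvisited.remove(node)
            cycle.append(node)
            node = succ[node]
        cycles.append(cycle)
    return cycles

# Two disjoint cycles in the toy solution: 0->1->2->0 and 3->4->3
print(find_subtours([(0, 1), (1, 2), (2, 0), (3, 4), (4, 3)]))
```

In a cutting-plane loop, the solver would re-solve after adding a constraint forbidding each detected subtour, until one connected route remains.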

  28. Multiobjective Optimization of Atmospheric Plasma Spray Process Parameters to Deposit Yttria-Stabilized Zirconia Coatings Using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Ramachandran, C. S.; Balasubramanian, V.; Ananthapadmanabhan, P. V.

    2011-03-01

    Atmospheric plasma spraying is used extensively to make Thermal Barrier Coatings of 7-8% yttria-stabilized zirconia powders. The main problem faced in the manufacture of yttria-stabilized zirconia coatings by the atmospheric plasma spraying process is the selection of the optimum combination of input variables for achieving the required qualities of coating. This problem can be solved by the development of empirical relationships between the process parameters (input power, primary gas flow rate, stand-off distance, powder feed rate, and carrier gas flow rate) and the coating quality characteristics (deposition efficiency, tensile bond strength, lap shear bond strength, porosity, and hardness) through effective and strategic planning and the execution of experiments by response surface methodology. This article highlights the use of response surface methodology by designing a five-factor five-level central composite rotatable design matrix with full replication for planning, conduction, execution, and development of empirical relationships. Further, response surface methodology was used for the selection of optimum process parameters to achieve desired quality of yttria-stabilized zirconia coating deposits.

  29. Current challenges and problems in teaching pathophysiology in Ukraine - another reaction to Churilov's paper.

    PubMed

    Ataman, Oleksandr V

    2017-12-01

    Pathophysiology in Ukraine has rich traditions and achievements both as a scientific field and as a taught academic discipline. Its history, the main Ukrainian scientific schools, and their famous representatives are briefly described. The content of the existing study program, the main approaches to teaching, and some methodological and organizational problems that need to be solved are characterized. The necessity and usefulness of developing and implementing three separate courses in the discipline (Essential, Clinical and Advanced Pathophysiology) are substantiated. The place of Pathophysiology in the training of physicians for different kinds of future activity is discussed, and the relation of teaching Pathophysiology to Translational and Personalized Medicine is outlined.

  30. Islamic Studies or the Study of Islam?: From Parker to Rammell

    ERIC Educational Resources Information Center

    Dien, Mawil Izzi

    2007-01-01

    The paper reports and discusses some of the practical and contextual difficulties facing the teaching of Islamic studies within the British higher education environment. The main problems in the author's view stem from the haziness surrounding the discipline definition and the methodology employed in teaching it. This is particularly observed when…

  31. [Evaluation on methodological problems in reports concerning quantitative analysis of syndrome differentiation of diabetes mellitus].

    PubMed

    Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu

    2006-01-01

    To evaluate the quality of reports published in China in the past ten years on the quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions, the main medical literature databases in China were searched. Thirty-one articles were included and evaluated according to the principles of clinical epidemiology. These articles showed many mistakes and deficiencies, in areas such as clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability, and statistical methods. It is necessary and important to improve the quality of reports on quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.

  12. Challenges in conducting qualitative research in health: A conceptual paper.

    PubMed

    Khankeh, Hamidreza; Ranjbar, Maryam; Khorasani-Zavareh, Davoud; Zargham-Boroujeni, Ali; Johansson, Eva

    2015-01-01

    Qualitative research focuses on the social world and provides the tools to study health phenomena from the perspective of those experiencing them. Identifying the problem, forming the question, and selecting an appropriate methodology and design are some of the initial challenges that researchers encounter in the early stages of any research project. These problems are particularly common for novices. This article describes the practical challenges of using qualitative inquiry in the field of health, and the challenges of performing interpretive research, based on professional experience as a qualitative researcher and on the available literature. One of the main topics discussed is the nature of qualitative research, its inherent challenges, and how to overcome them. Some of those highlighted here include: identification of the research problem, formation of the research question/aim, and selection of an appropriate methodology and research design, which are the main concerns of qualitative researchers and need to be handled properly. Insights from real-life experiences in conducting qualitative research in health reveal these issues. The paper provides personal comments on the experiences of a researcher in conducting pure qualitative research in the field of health. It offers insights into the practical difficulties encountered when performing qualitative studies and offers solutions and alternatives applied by these authors, which may be of use to others.

  13. Urban simulation and gaming: Preliminary experience and perspectives, appendix F

    NASA Technical Reports Server (NTRS)

    Shostak, A. B.

    1973-01-01

    A three-month summer study of gaming, as applied to urban problems, was conducted. The results of the study are presented along with a series of recommendations aimed at guiding the warranted efforts of others to further explore the application of scientific gaming to the solution of some of America's urban problems. Three main topics are considered and are discussed in depth. These include: (1) gaming and urbanology, (2) methodology and lessons, and (3) reforms in the Cities Game.

  14. A Novel Methodology for Charging Station Deployment

    NASA Astrophysics Data System (ADS)

    Sun, Zhonghao; Zhao, Yunwei; He, Yueying; Li, Mingzhe

    2018-02-01

    Lack of charging stations has been a main obstacle to the promotion of electric vehicles. This paper studies the deployment of charging stations in traffic networks subject to grid constraints, balancing charging demand against grid stability. First, we propose a statistical model for charging demand. Then we combine the charging demand model with power grid constraints and formulate the charging station deployment problem. Finally, we propose a theoretical solution by transforming the problem into a Markov Decision Process.
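    The final step, casting the deployment decision as a Markov Decision Process, can be illustrated with a small value-iteration sketch. Everything below is a hypothetical toy instance (invented demand and grid-cost figures, a single deploy-or-stop action), not the authors' actual formulation:

    ```python
    import numpy as np

    # Toy MDP (hypothetical numbers): the state is how many stations have been
    # deployed (0..3); each step either deploys one more station or stops.
    # Stopping collects a terminal reward: demand served minus grid-stability cost.
    demand_served = np.array([0.0, 5.0, 8.0, 9.5])  # diminishing returns
    grid_cost = np.array([0.0, 1.0, 2.5, 5.0])      # rising burden on the grid
    n_states = len(demand_served)
    gamma = 0.9                                     # discount factor

    V = np.zeros(n_states)
    for _ in range(100):                            # value iteration
        V_new = np.empty_like(V)
        for s in range(n_states):
            stop_value = demand_served[s] - grid_cost[s]
            if s < n_states - 1:
                V_new[s] = max(stop_value, gamma * V[s + 1])
            else:
                V_new[s] = stop_value
        V = V_new

    # Greedy policy: keep deploying while the discounted continuation beats stopping.
    policy = ["deploy" if gamma * V[s + 1] > demand_served[s] - grid_cost[s]
              else "stop" for s in range(n_states - 1)]
    # With these numbers the optimal plan deploys two stations, then stops.
    ```

    The real formulation would of course have a richer state (locations, grid load) and stochastic demand transitions; the fixed-point structure is the same.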

  15. [The centralized mycobacteriological laboratory is a necessary component of a phthisiological service in large towns of Russia].

    PubMed

    Dorozhkova, I R; Freĭman, G E; Moroz, A M

    2007-01-01

    The paper presents the main points of the authors' own concept for centralizing the mycobacteriological service in large towns of the Russian Federation. The step-by-step organizational and methodological measures required to solve this problem are described in detail. The consecutive measures taken to realize the proposed centralization model, launched in January 2004 in the Moscow Eastern Administrative District (population 1,380,000), are described.

  16. Common methodological flaws in economic evaluations.

    PubMed

    Drummond, Michael; Sculpher, Mark

    2005-07-01

    Economic evaluations are increasingly being used by those bodies such as government agencies and managed care groups that make decisions about the reimbursement of health technologies. However, several reviews of economic evaluations point to numerous deficiencies in the methodology of studies or the failure to follow published methodological guidelines. This article, written for healthcare decision-makers and other users of economic evaluations, outlines the common methodological flaws in studies, focussing on those issues that are likely to be most important when deciding on the reimbursement, or guidance for use, of health technologies. The main flaws discussed are: (i) omission of important costs or benefits; (ii) inappropriate selection of alternatives for comparison; (iii) problems in making indirect comparisons; (iv) inadequate representation of the effectiveness data; (v) inappropriate extrapolation beyond the period observed in clinical studies; (vi) excessive use of assumptions rather than data; (vii) inadequate characterization of uncertainty; (viii) problems in aggregation of results; (ix) reporting of average cost-effectiveness ratios; (x) lack of consideration of generalizability issues; and (xi) selective reporting of findings. In each case examples are given from the literature and guidance is offered on how to detect flaws in economic evaluations.

  17. Intercultural Education Series. The Americas and Self-Identification.

    ERIC Educational Resources Information Center

    Jones, Earl, Ed.; Dean, Frances, Ed.

    This is the final monograph in the Programa de Educacion Interamericana resource series on Latin America: SO 001 424 through SO 001 428. Two main sections are contained here: 1) philosophical and methodological approaches to the problems of teaching the social studies, and 2) ammunition in knowing the Americas so they can be taught better. The…

  18. Linguistic and Cultural Effects on the Attainment of Ethnic Minority Students: Some Methodological Considerations

    ERIC Educational Resources Information Center

    Theodosiou-Zipiti, Galatia; Lamprianou, Iasonas

    2016-01-01

    Established literature suggests that language problems lead to lower attainment levels in those subjects that are more language dependent. Also, language has been suggested as a main driver of ethnic minority attainment. We use an original dataset of 2,020 secondary school students to show that ethnic minority students in Cyprus underperform…

  19. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  20. The colloquial approach: An active learning technique

    NASA Astrophysics Data System (ADS)

    Arce, Pedro

    1994-09-01

    This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.

  1. Problems of increased transport load as a result of implementation of projects of high-rise constructions

    NASA Astrophysics Data System (ADS)

    Provotorov, Ivan; Gasilov, Valentin; Anisimova, Nadezhda

    2018-03-01

    A structure for the problems of high-rise construction is suggested, including impact on the environment, design solutions, transportation problems, financial costs of construction and operation, and others. Positive and negative aspects of high-rise construction are considered. One of the basic problems of high-rise construction is increased transport load. Construction of a subway on the basis of a concession mechanism, with unmanned control of the rolling stock, is proposed as the most expedient solution. An evaluation of the effectiveness of this project is presented; it shows quite high performance indicators for a private investor. The main problems that project implementation may face given the lack of scientific and methodological support are outlined.

  2. Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows

    NASA Astrophysics Data System (ADS)

    Jittamai, Phongchai

    This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products through an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum products of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are widely used in logistics and scheduling, are incorporated in this study. The distribution of multiple products through an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and mathematically formulated. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, and the provision of a solution methodology to compute an input schedule that minimizes total violation of the due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. The dissertation also extends the study to examine some operating attributes and the problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used for single-source pipeline problems is introduced; it also runs in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions: only 25% of the tested problems exceeded the optimal values by more than 30%, and approximately 40% were solved optimally by the algorithms.

  3. [Factors conditioning primary care services utilization. Empirical evidence and methodological inconsistencies].

    PubMed

    Sáez, M

    2003-01-01

    In Spain, the degree and characteristics of primary care services utilization have been the subject of analysis since at least the 1980s. One of the main reasons for this interest is to assess the extent to which utilization matches primary care needs. In fact, the provision of an adequate health service for those who most need it is a generally accepted priority. The evidence shows that individual characteristics, mainly health status, are the factors most closely related to primary care utilization. Other personal characteristics, such as gender and age, could act as modulators of health care need. Some family and/or cultural variables, as well as factors related to health care professionals and institutions, could explain some of the observed variability in primary care services utilization. Socioeconomic variables, such as income, reveal a paradox. From an aggregate perspective, income is the main determinant of utilization as well as of health care expenditure. When data are analyzed for individuals, however, income is not related to primary care utilization. The situation is controversial, with methodological implications and, above all, consequences for the assessment of efficiency in primary care utilization. Review of the literature reveals certain methodological inconsistencies that could at least partly explain the disparity of the empirical results. Among others, the following flaws can be highlighted: design problems, measurement errors, misspecification, and misleading statistical methods. Some solutions are, respectively: quasi-experiments and the use of large administrative databases and primary data sources (design problems); differentiation between types of utilization and between units of analysis other than consultations, and correction of measurement errors in the explanatory variables (measurement errors); consideration of relevant explanatory variables (misspecification); and the use of multilevel models (statistical methods).

  4. Rethinking medical humanities.

    PubMed

    Chiapperino, Luca; Boniolo, Giovanni

    2014-12-01

    This paper questions different conceptions of Medical Humanities in order to provide a clearer understanding of what they are and why they matter. Building upon former attempts, we defend a conception of Medical Humanities as a humanistic problem-based approach to medicine aiming at influencing its nature and practice. In particular, we discuss three main conceptual issues regarding the overall nature of this discipline: (i) a problem-driven approach to Medical Humanities; (ii) the need for an integration of Medical Humanities into medicine; (iii) the methodological requirements that could render Medical Humanities an effective framework for medical decision-making.

  5. Challenges in conducting qualitative research in health: A conceptual paper

    PubMed Central

    Khankeh, Hamidreza; Ranjbar, Maryam; Khorasani-Zavareh, Davoud; Zargham-Boroujeni, Ali; Johansson, Eva

    2015-01-01

    Background: Qualitative research focuses on the social world and provides the tools to study health phenomena from the perspective of those experiencing them. Identifying the problem, forming the question, and selecting an appropriate methodology and design are some of the initial challenges that researchers encounter in the early stages of any research project. These problems are particularly common for novices. Materials and Methods: This article describes the practical challenges of using qualitative inquiry in the field of health, and the challenges of performing interpretive research, based on professional experience as a qualitative researcher and on the available literature. Results: One of the main topics discussed is the nature of qualitative research, its inherent challenges, and how to overcome them. Some of those highlighted here include: identification of the research problem, formation of the research question/aim, and selection of an appropriate methodology and research design, which are the main concerns of qualitative researchers and need to be handled properly. Insights from real-life experiences in conducting qualitative research in health reveal these issues. Conclusions: The paper provides personal comments on the experiences of a researcher in conducting pure qualitative research in the field of health. It offers insights into the practical difficulties encountered when performing qualitative studies and offers solutions and alternatives applied by these authors, which may be of use to others. PMID:26793245

  6. Supervised Coursework as a Way of Improving Motivation in the Learning of Digital Electronics

    ERIC Educational Resources Information Center

    Rengel, R.; Martin, M. J.; Vasallo, B. G.

    2012-01-01

    This paper presents a series of activities and educational strategies related to the teaching of digital electronics in computer engineering. The main objective of these methodologies was to develop a final tutored coursework to be carried out by the students in small teams. This coursework was conceived as consisting of advanced problems or small…

  7. Youth, Heroin, Crack: A Review of Recent British Trends

    ERIC Educational Resources Information Center

    Seddon, Toby

    2008-01-01

    Purpose: The purpose of this paper is to review the research evidence on recent British trends in the use of heroin and/or crack-cocaine by young people in order to appraise the scale and nature of the contemporary health problem they pose. Design/methodology/approach: The approach consists of a narrative review of the main current data sources on…

  8. Methodological Advances in Political Gaming: The One-Person Computer Interactive, Quasi-Rigid Rule Game.

    ERIC Educational Resources Information Center

    Shubik, Martin

    The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…

  9. Semicompeting risks in aging research: methods, issues and needs

    PubMed Central

    Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen

    2015-01-01

    A semicompeting risks problem involves two types of events: a nonterminal event and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of the multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (class L models). These advances notwithstanding, semicompeting risks methodology has not penetrated biomedical research in general, or gerontological research in particular. Some possible reasons for this lack of uptake are: the methods are relatively new and sophisticated, the conceptual problems associated with potential failure time models are difficult to overcome, there is a paucity of expository articles aimed at educating practitioners, and readily usable software is not available. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of semicompeting risks methodology by the broader biomedical research community. PMID:24729136

  10. Educational-research laboratory "electric circuits" on the base of digital technologies

    NASA Astrophysics Data System (ADS)

    Koroteyev, V. I.; Florentsev, V. V.; Florentseva, N. I.

    2017-01-01

    The problem of activating trainees' research activity in the educational-research laboratory "Electric Circuits" using innovative methodological solutions and digital technologies is considered. The main task is the creation of a unified experimental-research information-educational environment, "Electrical Engineering". The problems arising during the development and application of modern software and hardware, experimental research stands, and digital control and measuring systems are presented. The paper describes the main stages in the development and creation of the educational-research laboratory "Electric Circuits" at the Department of Electrical Engineering of NRNU MEPhI. The authors also consider analogues of the described research complex offered by various educational institutions and companies, and analyze their strengths and weaknesses, on which the advantages of the proposed solution are based.

  11. Twin studies in psychiatry and psychology: science or pseudoscience?

    PubMed

    Joseph, Jay

    2002-01-01

    Twin studies are frequently cited in support of the influence of genetic factors on a wide range of psychiatric conditions and psychological trait differences. The most common method, known as the classical twin method, compares the concordance rates or correlations of reared-together identical (MZ) vs. reared-together same-sex fraternal (DZ) twins. However, drawing genetic inferences from MZ-DZ comparisons is problematic due to methodological problems and questionable assumptions. It is argued that the main theoretical assumption of the twin method, known as the "equal environment assumption", is not tenable. The twin method is therefore of doubtful value as an indicator of genetic influences. Studies of reared-apart twins are discussed, and it is noted that these studies are also vulnerable to methodological problems and environmental confounds. It is concluded that there is little reason to believe that twin studies provide evidence in favor of genetic influences on psychiatric disorders and human behavioral differences.

  12. The archiving of meteor research information

    NASA Technical Reports Server (NTRS)

    Nechitailenko, V. A.

    1987-01-01

    Rather than reviewing the results obtained under GLOBMET over the past years, this paper discusses some of the problems whose solution will guide the further development of meteor investigation, and of international cooperation in this field, over the near term. The main attention is paid to problems which the meteor community itself can solve, or at least expedite; most of them are connected in some way with the problem of information archiving. Information archiving deals with the methods and techniques for solving two closely connected groups of problems. The first is the analysis of data and information as an integral part of meteor research, which involves the solution of certain methodological problems. The second is the gathering of data and information for designing models of the atmosphere and/or the meteor complex, and their utilization. Approaches to solving these problems are discussed.

  13. Developing authentic problems through lived experiences in nature

    NASA Astrophysics Data System (ADS)

    Gürel, Zeynep

    2017-02-01

    This study's main objective is to develop a theoretical and ontological basis for experimentation in contact with real life, oriented to physics education. Physics is built upon the observation of nature, and experience in nature offers learners who have a background in the essentials of physics an opportunity to engage with science in a natural environment. The Physics in Nature course includes visiting and camping experiences situated in nature, with camps organized for educational purposes. The course integrates indoor and outdoor settings interactively, and authentic problems taken from the outdoor settings are brought into the class without a well-defined structure (ill-structured problems). Over a period of ten years, there was a plethora of events and problems that would provide sufficient material for many researchers, because every problem is an event and has a story. The philosophical event concept of Deleuze and Guattari has been applied to the events of the Physics in Nature courses. A post-qualitative research methodology has been used to show how the relation between physics and nature is constructed and becomes the main problem of physics in nature; this has been the basis of the course and of our academic research.

  14. Neurophenomenology revisited: second-person methods for the study of human consciousness

    PubMed Central

    Olivares, Francisco A.; Vargas, Esteban; Fuentes, Claudio; Martínez-Pernía, David; Canales-Johnson, Andrés

    2015-01-01

    In the study of consciousness, neurophenomenology was originally established as a novel research program attempting to reconcile two apparently irreconcilable methodologies in psychology: qualitative and quantitative methods. Its potential relies on Francisco Varela’s idea of reciprocal constraints, in which first-person accounts and neurophysiological data mutually inform each other. However, since its first conceptualization, neurophenomenology has encountered methodological problems. These problems have emerged mainly because of the difficulty of obtaining and analyzing subjective reports in a systematic manner. More recently, however, several interview techniques for describing subjective accounts have been developed, collectively known as “second-person methods.” Second-person methods refer to interview techniques that solicit both verbal and non-verbal information from participants in order to obtain systematic and detailed subjective reports. Here, we examine the potential for employing second-person methodologies in the neurophenomenological study of consciousness and we propose three practical ideas for developing a second-person neurophenomenological method. Thus, we first describe the second-person methodologies available in the literature for analyzing subjective reports, identifying specific constraints on the status of first-, second- and third-person methods. Second, we analyze two experimental studies that explicitly incorporate second-person methods for traversing the “gap” between phenomenology and neuroscience. Third, we analyze the challenges that second-person accounts face in establishing an objective methodology for comparing results across different participants and interviewers: this is the “validation” problem. Finally, we synthesize the common aspects of the interview methods described above.
In conclusion, our arguments emphasize that second-person methods represent a powerful approach for closing the gap between the experiential and the neurobiological levels of description in the study of human consciousness. PMID:26074839

  15. The intersubjective endeavor of psychopathology research: methodological reflections on a second-person perspective approach

    PubMed Central

    Galbusera, Laura; Fellin, Lisa

    2014-01-01

    Research in psychopathology may be considered as an intersubjective endeavor mainly concerned with understanding other minds. Thus, the way we conceive of social understanding influences how we do research in psychology in the first place. In this paper, we focus on psychopathology research as a paradigmatic case for this methodological issue, since the relation between the researcher and the object of study is characterized by a major component of “otherness.” We critically review different methodologies in psychopathology research, highlighting their relation to different social cognition theories (the third-, first-, and second-person approaches). Hence we outline the methodological implications arising from each theoretical stance. Firstly, we critically discuss the dominant paradigm in psychopathology research, based on the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 2013) and on quantitative methodology, as an example of a third-person methodology. Secondly, we contrast this mainstream view with phenomenological psychopathology which—by rejecting the reductionist view exclusively focused on behavioral symptoms—takes consciousness as its main object of study: it therefore attempts to grasp patients’ first-person experience. But how can we speak about a first-person perspective in psychopathology if the problem at stake is the experience of the other? How is it possible to understand the experience from “within,” if the person who is having this experience is another? By addressing these issues, we critically explore the feasibility and usefulness of a second-person methodology in psychopathology research. Notwithstanding the importance of methodological pluralism, we argue that a second-person perspective should inform the epistemology and methods of research in psychopathology, as it recognizes the fundamental circular and intersubjective construction of knowledge. PMID:25368589

  16. The intersubjective endeavor of psychopathology research: methodological reflections on a second-person perspective approach.

    PubMed

    Galbusera, Laura; Fellin, Lisa

    2014-01-01

    Research in psychopathology may be considered as an intersubjective endeavor mainly concerned with understanding other minds. Thus, the way we conceive of social understanding influences how we do research in psychology in the first place. In this paper, we focus on psychopathology research as a paradigmatic case for this methodological issue, since the relation between the researcher and the object of study is characterized by a major component of "otherness." We critically review different methodologies in psychopathology research, highlighting their relation to different social cognition theories (the third-, first-, and second-person approaches). Hence we outline the methodological implications arising from each theoretical stance. Firstly, we critically discuss the dominant paradigm in psychopathology research, based on the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 2013) and on quantitative methodology, as an example of a third-person methodology. Secondly, we contrast this mainstream view with phenomenological psychopathology, which, by rejecting the reductionist view exclusively focused on behavioral symptoms, takes consciousness as its main object of study: it therefore attempts to grasp patients' first-person experience. But how can we speak about a first-person perspective in psychopathology if the problem at stake is the experience of the other? How is it possible to understand the experience from "within," if the person who is having this experience is another? By addressing these issues, we critically explore the feasibility and usefulness of a second-person methodology in psychopathology research. Notwithstanding the importance of methodological pluralism, we argue that a second-person perspective should inform the epistemology and methods of research in psychopathology, as it recognizes the fundamental circular and intersubjective construction of knowledge.

  17. A methodology to derive Synthetic Design Hydrographs for river flood management

    NASA Astrophysics Data System (ADS)

    Tomirotti, Massimo; Mignosa, Paolo

    2017-12-01

    The design of flood protection measures requires, in many cases, the estimation not only of peak discharges but also of flood volumes and their time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, furnishing respectively the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods in a unique SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, which makes it possible to account in a very convenient way for the variability of the shapes of the observed hydrographs at the local time scale. The methodology is validated with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.
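    For a single observed hydrograph, the empirical statistic behind the FDF reduction curve is simply the maximum average discharge over any window of a given duration (the paper estimates quantiles of this statistic across many floods). A minimal sketch, with an invented hourly hydrograph:

    ```python
    import numpy as np

    # Hypothetical flood hydrograph: hourly discharges in m^3/s.
    q = np.array([10., 40., 120., 300., 220., 150., 90., 50., 30., 20.])

    def max_mean_discharge(q, d):
        """Maximum average discharge over any window of d consecutive samples."""
        c = np.concatenate(([0.0], np.cumsum(q)))   # prefix sums
        means = (c[d:] - c[:-d]) / d                # all window means of length d
        return means.max()

    # One sample point of the FDF reduction curve per duration; the curve
    # decreases with duration, which is what "reduction" refers to.
    curve = {d: max_mean_discharge(q, d) for d in (1, 3, 6)}
    # curve[1] is the peak discharge itself (300 m^3/s here).
    ```

    Fitting a distribution to such values over many events, duration by duration, would give the FDF quantiles the SDH construction relies on.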

  18. GPR image analysis to locate water leaks from buried pipes by applying variance filters

    NASA Astrophysics Data System (ADS)

    Ocaña-Levario, Silvia J.; Carreño-Alvarado, Elizabeth P.; Ayala-Cabrera, David; Izquierdo, Joaquín

    2018-05-01

    Nowadays, there is growing interest in controlling and reducing the amount of water lost through leakage in water supply systems (WSSs). Leakage is, in fact, one of the biggest problems faced by the managers of these utilities. This work addresses the problem of leakage in WSSs by using GPR (Ground Penetrating Radar) as a non-destructive method. The main objective is to identify and extract features from GPR images, such as leaks and components, under controlled laboratory conditions by a methodology based on second-order statistical parameters and, using the obtained features, to create 3D models that allow quick visualization of components and leaks in WSSs from GPR image analysis and subsequent interpretation. This methodology has been used before in other fields and provided promising results. The results obtained with the proposed methodology are presented, analyzed, interpreted and compared with those obtained by a well-established multi-agent-based methodology. They show that the variance filter is capable of highlighting the characteristics of components and anomalies in an intuitive manner, so that they can be identified by non-specialist personnel using the 3D models we develop. This research intends to pave the way towards future intelligent detection systems that enable the automatic detection of leaks in WSSs.
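    As a minimal illustration of the kind of variance filter the abstract refers to, a local second-order statistic can be computed over a sliding window; the filter responds at intensity transitions, which is what highlights pipe reflections and anomalies in a radargram. The image and window size below are toy assumptions, not the paper's GPR data.

```python
import numpy as np

def local_variance(img, k=3):
    """Local variance over a k x k window (edge-padded), via E[x^2] - E[x]^2."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    s = np.zeros(img.shape)
    s2 = np.zeros(img.shape)
    for di in range(k):            # accumulate sums over all window offsets
        for dj in range(k):
            w = p[di:di + img.shape[0], dj:dj + img.shape[1]]
            s += w
            s2 += w * w
    n = k * k
    mean = s / n
    return s2 / n - mean ** 2

# A flat background with one bright "anomaly" column: the variance filter
# lights up around the anomaly and stays zero over homogeneous regions.
img = np.zeros((8, 8))
img[:, 4] = 10.0
v = local_variance(img)
```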

  19. Quality of life assessment in children: a review of conceptual and methodological issues in multidimensional health status measures.

    PubMed Central

    Pal, D K

    1996-01-01

    STUDY OBJECTIVE: To clarify concepts and methodological problems in existing multidimensional health status measures for children. DESIGN: Thematic review of instruments found by computerised and manual searches, 1979-95. SUBJECTS: Nine health status instruments. MAIN RESULTS: Many instruments did not satisfy criteria of being child centered or family focussed; few had sufficient psychometric properties for research or clinical use; underlying conceptual assumptions were rarely explicit. CONCLUSIONS: Quality of life measures should be viewed cautiously. Interdisciplinary discussion is required, as well as discussion with children and parents, to establish constructs that are truly useful. PMID:8882220

  20. Elderly victims of gender violence in Portugal: Invisible and not heard?

    PubMed

    Magalhães, Maria José; Rodríguez Castro, Yolanda; Ruido, Patricia Alonso; Braga Lopez, Rita DeOliveira

    2016-12-01

    In this article, we explore professionals' representations of elderly female victims of gender violence. Semi-structured interviews were used to explore seven professionals' work philosophies and intervention methodologies in their work with elderly female victims of violence, their main problems and difficulties, and their perspectives regarding shelters for elderly women. Results show that there are no specific philosophies and methodologies to intervene with these victims. There is a tendency to homogenize all the victims of gender violence, regardless of their age and specific needs. The professionals also tended to trivialize gender violence against elderly female victims, considering that these women tolerate violence.

  1. Composition of pyrolysis gas from oil shale at various stages of heating

    NASA Astrophysics Data System (ADS)

    Martemyanov, S. M.; Bukharkin, A. A.; Koryashov, I. A.; Ivanov, A. A.

    2017-05-01

    Underground pyrolytic conversion of oil shale may in the near future become an alternative source of fuel gas and synthetic oil. The main scientific problem in designing this technology is to provide a methodology for determining the optimal mode of heating the subterranean formation. Such a methodology must allow prediction of the composition of the pyrolysis products and of the energy consumption at a given heating rate of the subterranean formation. The paper describes the results of heating oil shale fragments under conditions similar to those underground. The dynamics of the composition of the gaseous products of pyrolysis are presented and analyzed.

  2. Methodology for vocational psychodiagnostics of senior schoolchildren using information technologies

    NASA Astrophysics Data System (ADS)

    Bogdanovskaya, I. M.; Kosheleva, A. N.; Kiselev, P. B.; Davydova, Yu. A.

    2017-01-01

    The article identifies the role and main problems of vocational psychodiagnostics in modern socio-cultural conditions. It analyzes the potential of information technologies in the vocational psychodiagnostics of senior schoolchildren, and describes the theoretical and methodological grounds, content and diagnostic potential of the computerized method. The computerized method includes three blocks of sub-tests to identify intellectual potential, personal qualities, professional interests and values, and career orientations, as well as sub-tests to analyze the specific life experience of senior schoolchildren. The results of diagnostics allow the development of an integrated psychodiagnostic conclusion with recommendations. The article also presents software architecture options for the given method.

  3. Restoration of a single superresolution image from several blurred, noisy, and undersampled measured images.

    PubMed

    Elad, M; Feuer, A

    1997-01-01

    The three main tools in the single image restoration theory are the maximum likelihood (ML) estimator, the maximum a posteriori probability (MAP) estimator, and the set theoretic approach using projection onto convex sets (POCS). This paper utilizes the above known tools to propose a unified methodology toward the more complicated problem of superresolution restoration. In the superresolution restoration problem, an improved resolution image is restored from several geometrically warped, blurred, noisy and downsampled measured images. The superresolution restoration problem is modeled and analyzed from the ML, the MAP, and POCS points of view, yielding a generalization of the known superresolution restoration methods. The proposed restoration approach is general but assumes explicit knowledge of the linear space- and time-variant blur, the (additive Gaussian) noise, the different measured resolutions, and the (smooth) motion characteristics. A hybrid method combining the simplicity of the ML and the incorporation of nonellipsoid constraints is presented, giving improved restoration performance, compared with the ML and the POCS approaches. The hybrid method is shown to converge to the unique optimal solution of a new definition of the optimization problem. Superresolution restoration from motionless measurements is also discussed. Simulations demonstrate the power of the proposed methodology.
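    The ML view of superresolution restoration described above (minimize the sum of squared residuals over all measured images) can be sketched with a toy observation operator that only downsamples; the blur, warp, and noise of the full model are omitted for brevity, and all names are illustrative rather than the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = rng.normal(size=8)                 # unknown high-resolution signal

def decimate(x, offset):
    """Toy observation operator H_k: 2:1 downsampling at a given phase."""
    return x[offset::2]

def adjoint(r, offset, n):
    """H_k^T: zero-interleaved upsampling of a low-resolution residual."""
    up = np.zeros(n)
    up[offset::2] = r
    return up

# Two low-resolution measurements at complementary sampling phases.
ys = [decimate(x_true, k) for k in (0, 1)]

# ML (least-squares) restoration by steepest descent on
#   sum_k || H_k x - y_k ||^2
x = np.zeros(8)
for _ in range(200):
    grad = sum(adjoint(decimate(x, k) - ys[k], k, 8) for k in (0, 1))
    x -= 0.5 * grad
```

    With two complementary phases every high-resolution sample is observed, so the iteration recovers the signal exactly; with blur and noise it would converge to the least-squares estimate instead.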

  4. Geography in Italian Schools (An Example of a Cross-Curricular Project Using Geospatial Technologies for a Practical Contribution to Educators)

    ERIC Educational Resources Information Center

    De Vecchis, Gino; Pasquinelli D'Allegra, Daniela; Pesaresi, Cristiano

    2011-01-01

    During the last few years the Italian school system has seen significant changes, but geography continues to be considered a boring and unhelpful discipline by public institutions. The main problem is widespread geographic illiteracy and the fact that very often people do not know the objectives, methodology and tools of geographical studies.…

  5. What we know and don't know about mental health problems among immigrants in Norway.

    PubMed

    Abebe, Dawit Shawel; Lien, Lars; Hjelde, Karin Harsløf

    2014-02-01

    Mental health problems have been regarded as one of the main public health challenges among immigrants in several countries. Understanding and generating research-based knowledge on immigrant health problems is highly relevant for planning preventive interventions, as well as for guiding social and policy actions. This review aims to map the available knowledge on immigrants' mental health status and its associated risk factors in Norway. The reviewed literature on mental health problems among immigrant populations in Norway was found through databases such as PubMed, EMBASE, PsycINFO and MEDLINE. About 41 peer-reviewed original articles published since the 1990s were included. In the majority of the studies, immigrant populations, specifically adult immigrants from low- and middle-income countries, were found to have a higher degree of mental health problems compared with Norwegians and the general population. Increased risk of mental illness is primarily linked to a higher risk of acculturative stress, poor social support, deprived socioeconomic conditions, multiple negative life events, experiences of discrimination and traumatic pre-migration experiences. However, research in this field has been confronted by a number of gaps and methodological challenges. The available knowledge indicates a need for preventive interventions, and correspondingly a comprehensive research program that addresses these gaps and methodological challenges is strongly recommended.

  6. Overall equipment efficiency of Flexographic Printing process: A case study

    NASA Astrophysics Data System (ADS)

    Zahoor, S.; Shehzad, A.; Mufti, NA; Zahoor, Z.; Saeed, U.

    2017-12-01

    This paper reports the efficiency improvement of a flexographic printing machine by reducing breakdown time with the help of a total productive maintenance measure called overall equipment efficiency (OEE). The methodology comprises calculating the OEE of the machine before and after identifying the causes of the problems. A Pareto diagram is used to prioritize the main problem areas and the 5-whys analysis approach is used to identify the root causes of these problems. The OEE of the process improved from 34% to 40.2% over a 30-day period. It is concluded that OEE and 5-whys analysis are useful both for improving the effectiveness of the equipment and for continuous process improvement.
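    OEE is conventionally the product of availability, performance and quality. A minimal sketch with hypothetical shift figures (not the paper's data):

```python
def oee(availability, performance, quality):
    """Overall Equipment Efficiency as the product of the three factors."""
    return availability * performance * quality

# Hypothetical shift: 8 h planned, 1.5 h downtime, 370 parts produced at an
# ideal rate of 60 parts/h, 350 of them good.
planned_h, downtime_h = 8.0, 1.5
run_h = planned_h - downtime_h
availability = run_h / planned_h          # share of planned time actually run
performance = 370 / (run_h * 60)          # actual vs ideal output while running
quality = 350 / 370                       # good-parts ratio

print(round(oee(availability, performance, quality), 3))   # → 0.729
```

    Note that the three ratios multiply out to good parts divided by the theoretical maximum for the planned time (350 / 480 here), which is why OEE values are typically well below any single factor.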

  7. Common Methodological Problems in Research on the Addictions.

    ERIC Educational Resources Information Center

    Nathan, Peter E.; Lansky, David

    1978-01-01

    Identifies common problems in research on the addictions and offers suggestions for remediating these methodological problems. The addictions considered include alcoholism and drug dependencies. Problems considered are those arising from inadequate, incomplete, or biased reviews of relevant literatures and methodological shortcomings of subject…

  8. Engine dynamic analysis with general nonlinear finite element codes. II - Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.

    1982-01-01

    Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady-state response of the rotor-bearing-stator structure associated with gas turbine engines are outlined. The two main areas aim at (1) implanting the squeeze film damper element into a general-purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach to FE-generated rotor-bearing-stator simulations are determined, including benchmarking, comparison of explicit vs. implicit direct integration methodologies, and demonstration problems.

  9. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decomposition was mainly used to design and implement software solutions. In functional decomposition, functions and data are introduced as two separate entities during the design phase and are treated as such in the implementation phase. Functional decomposition makes use of refactoring by optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all of this is done during the implementation phase. This paper advocates the use of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionality. The combined use of object-oriented methodologies and design patterns to refactor should also reduce overall software life-cycle cost through improved software.

  10. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    PubMed

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus improve the quality of the whole process. Fuzzy logic, a well-established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Here we carefully refine and formalize that methodology, which includes six stages: the first three work with crisp rules, whereas the last three operate on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture following a component-based approach. The architecture is described in detail and a summary inspection of the main components in terms of UML diagrams is given as well. A first implementation of the architecture has then been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology as implemented in the proposed architecture.

  11. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications

    PubMed Central

    2013-01-01

    Background The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus improve the quality of the whole process. Fuzzy logic, a well-established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Methods We carefully refine and formalize our methodology, which includes six stages: the first three work with crisp rules, whereas the last three operate on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. Results The methodology is designed and implemented in the form of a modular and portable software architecture following a component-based approach. The architecture is described in detail and a summary inspection of the main components in terms of UML diagrams is given as well. A first implementation of the architecture has then been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions The results prove the feasibility of the whole methodology as implemented in the proposed architecture. PMID:23368970

  12. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of experimental combinations that are time-, cost- and labor-intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling essentially relates the signal-to-noise ratio to the control variables in a 'main effects only' approach, which enables both multiple-response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. The methodology has been widely applied in many industrial sectors; however, its application in the biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology is emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
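    The signal-to-noise ratio at the core of a Taguchi analysis is a direct computation. The sketch below uses the standard 'larger the better' form with made-up replicate yields (not data from the review); a setting with a higher SN ratio is both high and consistent.

```python
import math

def sn_larger_the_better(values):
    """Taguchi signal-to-noise ratio for 'larger the better' responses:
    SN = -10 * log10( mean(1 / y_i^2) ), in dB."""
    n = len(values)
    return -10 * math.log10(sum(1.0 / y ** 2 for y in values) / n)

# Hypothetical replicate yields (g/L) for two factor settings of a
# fermentation run: setting A is slightly lower on average but far steadier.
setting_a = [12.1, 11.8, 12.4]
setting_b = [13.0, 9.5, 12.8]

print(sn_larger_the_better(setting_a) > sn_larger_the_better(setting_b))
```

    Here the steadier setting wins on SN ratio despite a similar mean, which is exactly the robustness-to-noise emphasis the abstract describes.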

  13. Innovation and Integrity in Intervention Research: Conceptual Issues, Methodology, and Knowledge Translation.

    PubMed

    Malti, Tina; Beelmann, Andreas; Noam, Gil G; Sommer, Simon

    2018-04-01

    In this article, we introduce the special issue entitled Innovation and Integrity in Intervention Science. Its focus is on essential problems and prospects for intervention research examining two related topics, i.e., methodological issues and research integrity, and challenges in the transfer of research knowledge into practice and policy. The main aims are to identify how to advance methodology in order to improve research quality, examine scientific integrity in the field of intervention science, and discuss future steps to enhance the transfer of knowledge about evidence-based intervention principles into sustained practice, routine activities, and policy decisions. Themes of the special issue are twofold. The first includes questions about research methodology in intervention science, both in terms of research design and methods, as well as data analyses and the reporting of findings. Second, the issue tackles questions surrounding the types of knowledge translation frameworks that might be beneficial to mobilize the transfer of research-based knowledge into practice and public policies. The issue argues that innovations in methodology and thoughtful approaches to knowledge translation can enable transparency, quality, and sustainability of intervention research.

  14. Spatial prediction of water quality variables along a main river channel, in presence of pollution hotspots.

    PubMed

    Rizo-Decelis, L D; Pardo-Igúzquiza, E; Andreo, B

    2017-12-15

    In order to treat and evaluate the available water quality data and fully exploit monitoring results (e.g. to characterize regional patterns, optimize monitoring networks, or infer conditions at unmonitored locations), it is crucial to develop improved and efficient methodologies. Accordingly, estimation of water quality along fluvial ecosystems is a frequent task in environmental studies. In this work, a particular case of this problem is examined, namely the estimation of water quality along the main stem of a large basin (where most anthropic activity takes place) from observational data measured along the river channel. We adapted topological kriging to this case, in which each watershed contains all the watersheds of the upstream observation points (the "nested support effect"). The analysis was additionally extended by taking into account the upstream distance to the closest contamination hotspot as an external drift. We propose choosing the best estimation method by cross-validation. This approach to spatial variability modeling may be used to optimize the water quality monitoring of a given watercourse. The methodology is applied to 28 water quality variables measured along the Santiago River in Western Mexico. Copyright © 2017 Elsevier B.V. All rights reserved.
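    The choose-by-cross-validation step can be illustrated generically with leave-one-out validation of two simple spatial estimators along a river chainage. The stations, values and estimators below are toy assumptions for illustration, not the paper's kriging variants or Santiago River data.

```python
import math

# Stations along the main stem: (chainage in km, synthetic concentration).
stations = [(0, 2.0), (10, 2.4), (25, 3.9), (40, 3.5), (60, 5.1), (80, 5.0)]

def nearest(train, x):
    """Nearest-neighbour estimator along the channel."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

def idw(train, x, power=2):
    """Inverse-distance-weighted estimator along the channel."""
    num = den = 0.0
    for xi, zi in train:
        w = 1.0 / abs(xi - x) ** power
        num += w * zi
        den += w
    return num / den

def loo_rmse(estimator):
    """Leave-one-out cross-validation error: drop each station in turn,
    re-estimate it from the others, and accumulate the squared error."""
    errs = []
    for i, (x, z) in enumerate(stations):
        train = stations[:i] + stations[i + 1:]
        errs.append((estimator(train, x) - z) ** 2)
    return math.sqrt(sum(errs) / len(errs))

best = min([nearest, idw], key=loo_rmse)   # pick the estimator with lowest RMSE
```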

  15. Speciation of adsorbates on surface of solids by infrared spectroscopy and chemometrics.

    PubMed

    Vilmin, Franck; Bazin, Philippe; Thibault-Starzyk, Frédéric; Travert, Arnaud

    2015-09-03

    Speciation, i.e. identification and quantification, of surface species on heterogeneous surfaces by infrared spectroscopy is important in many fields but remains a challenging task when facing strongly overlapping spectra of multiple adspecies. Here, we propose a new methodology combining state-of-the-art instrumental developments for quantitative infrared spectroscopy of adspecies with chemometric tools, mainly a novel data-processing algorithm called SORB-MCR (SOft modeling by Recursive Based-Multivariate Curve Resolution) and multivariate calibration. After a formal transposition of the general linear mixture model to adsorption spectral data, the main issues, i.e. the validity of the Beer-Lambert law and rank-deficiency problems, are discussed theoretically. The methodology is then demonstrated through two case studies, each characterized by a specific type of rank deficiency: (i) speciation of physisorbed water species over a hydrated silica surface, and (ii) speciation (chemisorption and physisorption) of a silane probe molecule over a dehydrated silica surface. In both cases, we demonstrate the relevance of this approach, which leads to a thorough surface speciation based on comprehensive and fully interpretable multivariate quantitative models. Limitations and drawbacks of the methodology are also underlined. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, verification of the system design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil system requirements. Using the SysML language, we propose a Model-Based Systems Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSN requirements, structure and behaviour, and then translates the SysML elements into an analytic model, specifically a Deterministic Stochastic Petri Net. The proposed approach makes it possible to design WSNs and study their behaviour and energy performance.

  17. Large scale nonlinear programming for the optimization of spacecraft trajectories

    NASA Astrophysics Data System (ADS)

    Arrieta-Camacho, Juan Jose

    Despite the availability of high-fidelity mathematical models, the computation of accurate optimal spacecraft trajectories has never been an easy task. While simplified models of spacecraft motion can provide useful estimates of energy requirements, sizing, and cost, the actual launch window and maneuver scheduling must rely on more accurate representations. We propose an alternative for the computation of optimal transfers that uses an accurate representation of the spacecraft dynamics. Like other methodologies for trajectory optimization, this alternative is able to consider all major disturbances. In contrast, it can handle equality and inequality constraints explicitly throughout the trajectory, and it requires neither the derivation of costate equations nor the identification of the constrained arcs. The alternative consists of two steps: (1) discretizing the dynamic model using high-order collocation at Radau points, which has numerical advantages, and (2) solving the resulting Nonlinear Programming (NLP) problem using an interior point method, which does not suffer from the performance bottleneck associated with identifying the active set, as required by sequential quadratic programming methods. In this way the methodology exploits the availability of sound numerical methods and next-generation NLP solvers. In practice the methodology is versatile; it can be applied to a variety of aerospace problems such as homing, guidance, and aircraft collision avoidance, and it is particularly well suited to low-thrust spacecraft trajectory optimization. Examples are presented which consider the optimization of a low-thrust orbit transfer subject to the main disturbances due to Earth's gravity field together with lunar and solar attraction. Another example considers the optimization of a multiple-asteroid rendezvous problem. In both cases, the ability of the proposed methodology to consider non-standard objective functions and constraints is illustrated.
Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such a research pursuit. The collocation scheme and nonlinear programming algorithm presented in this work complement other existing methodologies by providing reliable and efficient numerical methods able to handle large-scale, nonlinear dynamic models.

  18. An extended validation of the last generation of particle finite element method for free surface flows

    NASA Astrophysics Data System (ADS)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as the Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This variant, previously described in the literature as PFEM-2, is able to use larger time steps than other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems, and the main topic of this paper is a deeper validation of the methodology for a wider range of flows. To accomplish this task, improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected so that a wide variety of Froude numbers, density ratios and dominant dissipative cases are reported, with the intention of presenting a general methodology, not restricted to a particular range of parameters, that is capable of using large time steps. The results of the different free-surface problems solved, which include the Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared with well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.

  19. A methodology model for quality management in a general hospital.

    PubMed

    Stern, Z; Naveh, E

    1997-01-01

    A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, reveals the main intrinsic characteristics that create problems in applying industrial quality management approaches to medical care. Examples include the complexity of the relationship between the medical action and the result obtained, the client's non-acceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors: measures of quality of care, cost of medical care, quality of service, and human resources. These are based on the trilogy of quality planning, quality control, and quality improvement. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in the formulation of appropriate concepts.

  20. Normalization of hydrocarbon emissions in Germany

    NASA Astrophysics Data System (ADS)

    Levitin, R. E.

    2018-05-01

    In connection with the integration of the Russian Federation into the European space, many technical regulations and methodologies are being revised. This work deals with German legislation in the field of determining hydrocarbon emissions and with the methodology for determining emissions of oil products from vertical steel tanks. In German law, the Emission Protection Act establishes only basic requirements; the technical details of practical importance are mainly regulated in numerous Orders on the Procedure for the Implementation of the Law (German abbr. BImSchV). A step below on the hierarchical ladder of legislative and regulatory documentation are the documents referred to by the Technical Manual on the Maintenance of Clean Air, a set represented by numerous DIN standards and VDI guidelines. The article considers the methodology from guidance document VDI 3479. The shortcomings and problems of applying this method in Russia are shown.

  1. An Investigation of the Repair Cycle for H-53 and H-60 Helicopter Main Gearboxes - Physical Movement and Information Flows

    DTIC Science & Technology

    1989-09-01

    [OCR fragments of the report's table of contents and text] Contents include: The Pipeline; The Repair Cycle; The Order Cycle and Order Processing; Summary; III. Methodology. The literature is found in business logistics books, business logistics periodicals, and military periodicals. … differences and problems previously outlined (1:19). The Order Cycle and Order Processing: the nature of the order cycle and order processing.

  2. Soft Computing Methods for Disulfide Connectivity Prediction.

    PubMed

    Márquez-Chamorro, Alfonso E; Aguilar-Ruiz, Jesús S

    2015-01-01

    The problem of protein structure prediction (PSP) is one of the main challenges in structural bioinformatics. To tackle it, PSP can be divided into several subproblems, one of which is the prediction of disulfide bonds. The disulfide connectivity prediction problem consists in identifying, from all possible candidates, which non-adjacent cysteines will be cross-linked. Determining the disulfide bond connectivity between the cysteines of a protein is desirable as a preliminary step of 3D PSP, since it greatly reduces the protein conformational search space. The most representative soft computing approaches of the last decade for the disulfide connectivity prediction problem are summarized in this paper. Aspects such as the underlying methodology (artificial neural network or support vector machine) and the features used by the algorithms serve to classify these methods.
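    Once pairwise cysteine scores are available from a trained predictor (an ANN or SVM, as surveyed above), connectivity prediction reduces to choosing the highest-scoring way of pairing up the cysteines, i.e. a maximum-weight perfect matching. A brute-force sketch with made-up scores for four cysteines:

```python
# Toy pairwise bonding scores for four cysteines (indices 0-3); in practice
# these would come from a trained soft-computing predictor.
score = {(0, 1): 0.2, (0, 2): 0.9, (0, 3): 0.3,
         (1, 2): 0.4, (1, 3): 0.8, (2, 3): 0.1}

def perfect_matchings(cys):
    """Enumerate all ways to pair up an even-sized list of cysteines."""
    if not cys:
        yield []
        return
    first, rest = cys[0], cys[1:]
    for i, partner in enumerate(rest):
        for sub in perfect_matchings(rest[:i] + rest[i + 1:]):
            yield [(first, partner)] + sub

def best_connectivity(cys):
    """Pick the pairing whose bond scores sum highest."""
    return max(perfect_matchings(cys),
               key=lambda m: sum(score[p] for p in m))

print(best_connectivity([0, 1, 2, 3]))   # → [(0, 2), (1, 3)]
```

    Enumeration is exponential in the number of cysteines, which is why practical tools restrict candidates or use graph-matching algorithms; the sketch only shows the combinatorial core of the problem.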

  3. Vector data structure conversion at the EROS Data Center

    USGS Publications Warehouse

    van Roessel, Jan W.; Doescher, S.W.

    1986-01-01

    With the increasing prevalence of GIS systems and the processing of spatial data, conversion of data from one system to another has become a more serious problem. This report describes the approach taken to arrive at a solution at the EROS Data Center. The report consists of a main section and a number of appendices: the methodology is described in the main section, while the appendices contain system-specific descriptions. The overall approach is based on a central conversion hub consisting of a relational database manager and associated tools, with a standard data structure for the transfer of spatial data. This approach is the best compromise between the two goals of reducing the overall interfacing effort and producing efficient system interfaces, and the tools can be used to arrive at a progression of interface sophistication ranging from toolbench to smooth flow. The appendices provide detailed information on a number of spatial data handling systems, data structures, and existing interfaces, as well as interfaces developed with the described methodology.
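    The hub idea can be sketched in a few lines: each format supplies only a reader into the standard (hub) structure and a writer out of it, so N formats need 2N converters instead of N*(N-1) pairwise ones. The formats and converter functions here are invented toy examples, not the Center's actual systems:

```python
# Hypothetical hub-and-spoke conversion registry.
to_hub = {}
from_hub = {}

def register(fmt, reader, writer):
    """Register a format's converters to and from the hub structure."""
    to_hub[fmt] = reader
    from_hub[fmt] = writer

def convert(data, src, dst):
    """Convert src-format data to dst format via the hub structure."""
    hub_record = to_hub[src](data)
    return from_hub[dst](hub_record)

# Toy formats: 'csvish' stores "x,y" text; 'pair' stores an (x, y) tuple.
register('csvish', lambda s: tuple(map(float, s.split(','))),
                   lambda t: f"{t[0]},{t[1]}")
register('pair',   lambda t: t,
                   lambda t: t)

print(convert("3,4", 'csvish', 'pair'))      # (3.0, 4.0)
print(convert((3.0, 4.0), 'pair', 'csvish'))
```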

  4. Problem solving using soft systems methodology.

    PubMed

    Land, L

    This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.

  5. GeomarCD project; an educational CD-Rom about marine geophysics

    NASA Astrophysics Data System (ADS)

    Diaz, J.; Rubio, E.; Gómez, M.; Gallart, J.

    2009-04-01

    This project aims to introduce the main aspects of marine geophysics experiments to high school students. We have chosen to present the information in the form of an interactive game in which, taking into account the scientific objectives and the technological and logistic resources available, the player must find the best strategy to carry out one of the three research projects proposed. Throughout the game, the player is introduced to the main aspects of plate tectonics theory and crustal structure, as well as to the main methodologies available (seismics, potential fields, cores). Rather than being based on theoretical aspects, which are largely covered by other outreach projects, this work focuses on how a realistic problem can be solved through a field experiment. The game takes place at the researcher's desk and on an oceanographic vessel such as the BIO Hesperides, and includes the choice of the research project, the design and development of the field work, and the interpretation of the results. At the end, the player must complete a questionnaire to elaborate the final report. The correct choice of the appropriate methodologies and their interpretation is necessary to succeed. CD copies in Spanish are freely available upon request.

  6. Cause-and-effect analysis of risk management files to assess patient care in the emergency department.

    PubMed

    White, Andrew A; Wright, Seth W; Blanco, Roberto; Lemonds, Brent; Sisco, Janice; Bledsoe, Sandy; Irwin, Cindy; Isenhour, Jennifer; Pichert, James W

    2004-10-01

    Identifying the etiologies of adverse outcomes is an important first step in improving patient safety and reducing malpractice risks. However, relatively little is known about the causes of emergency department-related adverse outcomes. The objective was to describe a method for identification of common causes of adverse outcomes in an emergency department. This methodology potentially can suggest ways to improve care and might provide a model for identification of factors associated with adverse outcomes. This was a retrospective analysis of 74 consecutive files opened by a malpractice insurer between 1995 and 2000. Each risk-management file was analyzed to identify potential causes of adverse outcomes. The main outcomes were rater-assigned codes for alleged problems with care (e.g., failures of communication or problems related to diagnosis). About 50% of cases were related to injuries or abdominal complaints. A contributing cause was found in 92% of cases, and most had more than one contributing cause. The most frequent contributing categories included failure to diagnose (45%), supervision problems (31%), communication problems (30%), patient behavior (24%), administrative problems (20%), and documentation (20%). Specific relating factors within these categories, such as lack of timely resident supervision and failure to follow policies and procedures, were identified. This project documented that an aggregate analysis of risk-management files has the potential to identify shared causes related to real or perceived adverse outcomes. Several potentially correctable systems problems were identified using this methodology. These simple, descriptive management tools may be useful in identifying issues for problem solving and can be easily learned by physicians and managers.

  7. Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching

    NASA Astrophysics Data System (ADS)

    Shen, Kaiming; Yu, Wei

    2018-05-01

    This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application to continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP, but the proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
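    As a hedged scalar illustration of the Part I quadratic transform that this part builds on (the functions and grid are invented and far simpler than the paper's multi-cell scheduling problem): maximizing a ratio A(x)/B(x) is replaced by alternately maximizing 2*y*sqrt(A(x)) - y^2*B(x) over x and updating the auxiliary variable y in closed form:

```python
import math

# Toy ratio problem: maximize A(x)/B(x) on a grid, with
# A(x) = log(1 + x) (a rate-like numerator) and B(x) = 1 + x
# (a power-like denominator). Both functions are illustrative only.
A = lambda t: math.log(1.0 + t)
B = lambda t: 1.0 + t
grid = [i / 1000.0 for i in range(1, 5001)]   # x in (0, 5]

y = 1.0
for _ in range(20):
    # Step 1: with y fixed, maximize the quadratic-transform surrogate.
    x = max(grid, key=lambda t: 2*y*math.sqrt(A(t)) - y*y*B(t))
    # Step 2: with x fixed, the optimal auxiliary variable is closed-form.
    y = math.sqrt(A(x)) / B(x)

direct = max(grid, key=lambda t: A(t) / B(t))
print(x, direct)  # the two maximizers agree (near x = e - 1)
```

    Each alternation cannot decrease the objective, which is what makes the transform attractive when the direct ratio problem is hard to handle.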

  8. Participatory ergonomics among female cashiers from a department store.

    PubMed

    Cristancho, María Yanire León

    2012-01-01

    The objective of this paper was to control ergonomic risks among female cashiers working in a department store belonging to the retail market. This study was conducted between May and November 2010. Participatory ergonomics was applied through knowing and understanding how the company works, establishing the work team (Ergo group), training the team in ergonomics-related topics, and making decisions and interventions. The sample comprised 71 participants, mostly female cashiers, all of whom reported musculoskeletal complaints: pain or discomfort mainly in the neck, lower back, right wrist and shoulders. Among other findings, the following problems were identified: postural overload, repetitive work, manual load handling, mental fatigue, environmental discomfort, variable work schedules, extended working days, and absence of breaks. In the intervention, the main implemented changes were the redesign of the workstations, complete replacement of chairs and keyboards, and the implementation of a rotation system, as well as breaks for compensatory exercises. Afterwards, an evident improvement in the identified problems was observed; it can therefore be concluded that participatory ergonomics is an attractive, appropriate and efficient methodology for solving and controlling ergonomic risks and problems.

  9. Experience of Teaching Drawing in German Schools by A. Ažbe and S. Hollósy (On the Example of the Image of Human Head)

    ERIC Educational Resources Information Center

    Melnikova, Svetlana

    2017-01-01

    The main aim of the paper is to analyze and disclose the methods for teaching drawing of the human head in foreign schools at the end of the 19th and beginning of the 20th centuries for further application in modern Russian methodology of art education. The relevance of the problem under investigation is due to the structuring and disclosure of…

  10. Estimating air emissions from ships: Meta-analysis of modelling approaches and available data sources

    NASA Astrophysics Data System (ADS)

    Miola, Apollonia; Ciuffo, Biagio

    2011-04-01

    Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the available ship emission modelling approaches and data sources, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of the results from the different methods, mainly due to the uncertainty in the sources of information used as inputs to the different studies. This paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information, and this paper presents an alternative methodology based on this approach. As a final remark, new approaches to the problem, together with more reliable data sources over the coming years, could give more impetus to the debate on the global impact of maritime traffic on the environment, a debate which at present has reached agreement only via the "consensus" estimates provided by IMO (2009).
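    A bottom-up activity estimate of the kind classified here typically multiplies installed engine power, load factor, operating hours and an emission factor. A minimal sketch with made-up figures (these are placeholders, not IMO inventory values):

```python
def bottom_up_emissions(power_kw, load_factor, hours, ef_g_per_kwh):
    """Return emitted mass in tonnes: power * load factor * hours * EF."""
    grams = power_kw * load_factor * hours * ef_g_per_kwh
    return grams / 1e6  # grams -> tonnes

# One hypothetical vessel phase: 10 MW main engine at 80% load,
# 4000 h/year at sea, NOx emission factor of 14 g/kWh.
print(bottom_up_emissions(10_000, 0.8, 4_000, 14.0), "t NOx/yr")  # ~448 t
```

    A fleet-level inventory sums such terms over vessels and operating phases (cruising, manoeuvring, hotelling), each with its own load factor and emission factor; the uncertainty discussed above enters through every one of those inputs.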

  11. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives

    PubMed Central

    Nichio, Bruno T. L.; Marchaukoski, Jeroniza Nunes; Raittz, Roberto Tadeu

    2017-01-01

    Nowadays defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, genome annotation, and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools in order to select the most promising ones for the prediction of orthologous groups. The literature in this field describes the problems that most available tools exhibit, such as those encountered in accuracy, the time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques) and the automation of the process without manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We list the main computational tools created between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST "all-against-all" methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, such as OrthoVenn (which uses a Venn diagram to show the relationships of the ortholog groups generated by its algorithm), proteinOrtho (which improves the accuracy of ortholog groups), ReMark (which integrates the pipeline to make the entry process automatic), OrthAgogue (which uses algorithms developed to minimize processing time), and proteinOrtho (developed for dealing with large amounts of biological data). We compared the main features of four tools and tested them on four prokaryotic genomes. We hope that our review can be useful for researchers and will help them in selecting the most appropriate tool for their work in the field of orthology. PMID:29163633

  13. Numerical approach to constructing the lunar physical libration: results of the initial stage

    NASA Astrophysics Data System (ADS)

    Zagidullin, A.; Petrova, N.; Nefediev, Yu.; Usanin, V.; Glushkov, M.

    2015-10-01

    The so-called "main problem" is taken as a model for developing the numerical approach to the theory of lunar physical libration. For the chosen model there are both a good methodological basis and results obtained at Kazan University from the construction of the analytical theory. Results of the first stage of the numerical approach are presented in this report. Three main limitations define the main problem: independent treatment of the orbital and rotational motion of the Moon; a rigid-body model of the Moon, whose dynamical figure is described by the inertia ellipsoid specifying the mass distribution inside the Moon; and consideration of gravitational interaction with the Earth and the Sun only. At this stage the expansion of the selenopotential is limited to the second harmonic; inclusion of the 3rd- and 4th-order harmonics is the next task. The full solution of the libration problem consists of removing the limitations specified above: accounting for the fine effects caused by planetary perturbations, by the visco-elastic properties of the lunar body, by the presence of a two-layer lunar core, by the Earth's obliquity, and by the rotation of the ecliptic, if it is taken as the reference plane.

  14. Improved modeling of two-dimensional transitions in dense phases on crystalline surfaces. Krypton–graphite system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ustinov, E. A., E-mail: eustinov@mail.wplus.net

    This paper presents a refined technique to describe two-dimensional phase transitions in dense fluids adsorbed on a crystalline surface. Prediction of the parameters of 2D liquid–solid equilibrium is known to be an extremely challenging problem, mainly because of the small difference in the thermodynamic functions of the coexisting phases and the limited accuracy of numerical experiments at high density; this seriously limits the various attempts to circumvent the problem. To improve this situation, a new methodology based on the kinetic Monte Carlo method was applied. The methodology involves analysis of equilibrium gas–liquid and gas–solid systems undergoing an external potential, which allows the parameters of the phase coexistence to be shifted gradually. The interrelation of the chemical potential and tangential pressure for each system is then treated with the Gibbs–Duhem equation to obtain the point of intersection corresponding to the liquid/solid–solid equilibrium coexistence. The methodology is demonstrated on the krypton–graphite system below and above the 2D critical temperature. Using experimental data on the liquid–solid and the commensurate–incommensurate transitions in the krypton monolayer derived from adsorption isotherms, the Kr–graphite Lennard–Jones parameters have been corrected, resulting in a higher periodic potential modulation.

  15. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
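    The Weibull size effect mentioned above can be sketched numerically: under a two-parameter Weibull strength model, the stress corresponding to a fixed failure probability scales with specimen volume (or area) as V^(-1/m). The parameters below are illustrative only, not MEMS or CMC data:

```python
import math

def failure_probability(sigma, volume, m=10.0, sigma0=1.0, V0=1.0):
    """Weibull failure probability: 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
    return 1.0 - math.exp(-(volume / V0) * (sigma / sigma0) ** m)

def characteristic_strength(volume, m=10.0, sigma0=1.0, V0=1.0):
    """Stress giving P_f = 1 - 1/e for a specimen of the given volume."""
    return sigma0 * (V0 / volume) ** (1.0 / m)

# A specimen 1000x larger is noticeably weaker at the same P_f:
print(characteristic_strength(1.0))      # 1.0
print(characteristic_strength(1000.0))   # ~0.5, i.e. 1000**(-1/10)
```

    Confirming this volume (or area) scaling experimentally, e.g. with bulge-test specimens of different sizes, is what establishes that a Weibull-based probabilistic design methodology applies.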

  16. A quality evaluation methodology of health web-pages for non-professionals.

    PubMed

    Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro

    2004-06-01

    We propose an evaluation methodology for determining the quality of healthcare web sites that disseminate medical information to non-professionals. Three macro factors are considered in the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature, the problem of whether or not to introduce a weighting function has been investigated. The methodology was validated on a specialized topic, sore throat, chosen for the large interest it enjoys with the target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub-factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions derived from the literature allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score to medical web sites oriented to non-professionals.
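    The weighting question investigated here can be illustrated with a toy aggregation over the three macro factors; the item scores and weights below are invented, not the paper's actual instrument:

```python
def aggregate(scores, weights=None):
    """Unweighted mean if weights is None, else a normalized weighted mean."""
    if weights is None:
        return sum(scores.values()) / len(scores)
    total_w = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_w

# Hypothetical macro-factor scores for one site, each in [0, 1]:
site = {'contents': 0.8, 'accountability': 0.6, 'usability': 0.9}

print(aggregate(site))  # unweighted average, ~0.767
# A weighting that emphasizes medical contents twice as much:
print(aggregate(site, {'contents': 2, 'accountability': 1, 'usability': 1}))
```

    Comparing the ranks that the two aggregation functions induce over a sample of sites is essentially the comparison the paper carries out statistically.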

  17. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  18. New methodology for fast prediction of wheel wear evolution

    NASA Astrophysics Data System (ADS)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics over time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is the computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. It is based on two main steps. The first is to replace calculations over the whole network with the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is to replace the dynamic (time-integration) calculation with a quasi-static calculation, i.e. solving the quasi-static state of the vehicle at a given point, which amounts to neglecting the acceleration terms in the dynamic equations. These simplifications allow a significant reduction in computational cost while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained allow concluding that the methodology is valid for an arbitrary vehicle running on an arbitrary track layout.
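    A loose sketch of the two simplifications (all numbers and the linear wear law below are invented, not the paper's model): wear is accumulated only at a few characteristic points, each weighted by the share of the network it represents, with the contact quantities assumed to come from a quasi-static solution:

```python
# Hypothetical wear coefficient: mm of wear per unit of wear number per km.
WEAR_COEFF = 1e-5

# (wear_number, share_of_network) for three characteristic points,
# e.g. straight track, a mild curve, and a sharp curve. The wear numbers
# stand in for quasi-static contact results; the shares sum to 1.
characteristic_points = [(40.0, 0.70), (120.0, 0.25), (300.0, 0.05)]

def wear_after(km):
    """Network-averaged wear depth (mm) accumulated after `km` kilometres."""
    rate = sum(w * share for w, share in characteristic_points)
    return WEAR_COEFF * rate * km

print(round(wear_after(100_000.0), 2), "mm after 100,000 km")
```

    In the actual methodology the profile would be updated iteratively and the quasi-static contact re-solved as the wheel shape changes; the point of the sketch is only how few evaluations the characteristic-point weighting requires compared with simulating the whole network.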

  19. Finite-time H∞ control for linear continuous system with norm-bounded disturbance

    NASA Astrophysics Data System (ADS)

    Meng, Qingyi; Shen, Yanjun

    2009-04-01

    In this paper, the definition of finite-time H∞ control is presented. The system under consideration is subject to a time-varying norm-bounded exogenous disturbance. The main aim of this paper is the design of a state feedback controller which ensures that the closed-loop system is finite-time bounded (FTB) and reduces the effect of the disturbance input on the controlled output to a prescribed level. A sufficient condition for the solvability of this problem is presented, which can be reduced to a feasibility problem involving linear matrix inequalities (LMIs). A detailed solving method is proposed for the restricted linear matrix inequalities. Finally, examples are given to show the validity of the methodology.

  20. A promise kept.

    PubMed

    Harper, W James

    2010-01-01

    This article is largely biographical and relates to my experiences of the past 67 years in research and teaching, both of equal importance in my life. I was fortunate to start at the beginning of the development of instrumental methods of analysis and have eagerly embraced each new methodology as it became available. This paper is dedicated to all those students and colleagues who taught me much and whose efforts are mainly responsible for what has been accomplished in our work with food science and technology. The research focused primarily on trying to find out the "why" behind the problems that food, and especially the dairy products area, encountered over the past 65 years. The teaching has tried to foster thinking and problem solving.

  1. Evaluation of the HARDMAN comparability methodology for manpower, personnel and training

    NASA Technical Reports Server (NTRS)

    Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.

    1984-01-01

    The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.

  2. Twelfth International Symposium on Methodologies for Intelligent Systems (ISMIS 2000) Held in Charlotte, North Carolina on October 11-14, 2000

    DTIC Science & Technology

    2000-10-14

    without any knowledge of the problem area. Therefore, Darwinian-type evolutionary computation has found a very wide range of applications, including many ...the author examined many biomedical studies that included literature searches. The Science Citation Index (SCI) Abstracts of these studies...yield many records that are non-relevant to the main technical themes of the study. In summary, these types of simple limited queries can result in two

  3. [Discovery-based teaching and learning strategies in health: problematization and problem-based learning].

    PubMed

    Cyrino, Eliana Goldfarb; Toralles-Pereira, Maria Lúcia

    2004-01-01

    Considering the changes in teaching in the health field and the demand for new ways of dealing with knowledge in higher learning, the article discusses two innovative methodological approaches: problem-based learning (PBL) and problematization. Describing the two methods' theoretical roots, the article attempts to identify their main foundations. As distinct proposals, both contribute to a review of the teaching and learning process: problematization, focused on knowledge construction in the context of the formation of a critical awareness; PBL, focused on cognitive aspects in the construction of concepts and appropriation of basic mechanisms in science. Both problematization and PBL lead to breaks with the traditional way of teaching and learning, stimulating participatory management by actors in the experience and reorganization of the relationship between theory and practice. The critique of each proposal's possibilities and limits using the analysis of their theoretical and methodological foundations leads us to conclude that pedagogical experiences based on PBL and/or problematization can represent an innovative trend in the context of health education, fostering breaks and more sweeping changes.

  4. Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies

    PubMed Central

    El-Baz, Ayman; Beache, Garth M.; Gimel'farb, Georgy; Suzuki, Kenji; Okada, Kazunori; Elnakib, Ahmed; Soliman, Ahmed; Abdollahi, Behnoush

    2013-01-01

    This paper overviews one of the most important, interesting, and challenging problems in oncology, the problem of lung cancer diagnosis. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance and can increase the patient's chance of survival. For this reason, CAD systems for lung cancer have been investigated in a huge number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. This paper overviews the current state-of-the-art techniques that have been developed to implement each of these CAD processing steps. For each technique, various aspects of technical issues, implemented methodologies, training and testing databases, and validation methods, as well as achieved performances, are described. In addition, the paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems. PMID:23431282
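    As a toy stand-in for the "detect nodules inside the lung fields" step of such a pipeline (real CAD systems use far richer image features and learned classifiers), bright pixels in a synthetic intensity grid can be thresholded and grouped into connected components as candidate nodules:

```python
def find_candidates(grid, threshold):
    """Return connected components of pixels at or above the threshold."""
    rows, cols = len(grid), len(grid[0])
    seen, components = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and (r, c) not in seen:
                stack, comp = [(r, c)], []
                seen.add((r, c))
                while stack:                      # flood fill, 4-connected
                    y, x = stack.pop()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                components.append(comp)
    return components

# A made-up 4x5 "slice" with two bright blobs:
slice_ = [[0, 0, 0, 0, 0],
          [0, 9, 9, 0, 0],
          [0, 9, 0, 0, 8],
          [0, 0, 0, 0, 8]]
cands = find_candidates(slice_, threshold=5)
print(len(cands), "candidate blobs")  # 2
```

    Each candidate would then be passed to the later pipeline stages, nodule segmentation and benign/malignant classification, which is where the bulk of the reviewed techniques operate.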

  5. A novel anti-windup framework for cascade control systems: an application to underactuated mechanical systems.

    PubMed

    Mehdi, Niaz; Rehan, Muhammad; Malik, Fahad Mumtaz; Bhatti, Aamer Iqbal; Tufail, Muhammad

    2014-05-01

    This paper describes anti-windup compensator (AWC) design methodologies for stable and unstable cascade plants whose cascade controllers face actuator saturation. Two novel full-order decoupling AWC architectures, based on equivalence of the overall closed-loop system, are developed to deal with windup effects. The decoupled architectures are developed by assuring equivalence of the coupled and decoupled architectures, rather than relying on an analogy, in order to formulate the AWC synthesis problem for cascade control systems. A comparison of both AWC architectures from an application point of view consolidates their utilities: one architecture is better in terms of computational complexity for implementation, while the other is suitable for unstable cascade systems. On the basis of these architectures, for cascade systems facing stability and performance degradation in the event of actuator saturation, global AWC design methodologies utilizing linear matrix inequalities (LMIs) are developed. These LMIs are synthesized by application of Lyapunov theory, the global sector condition, and the ℒ2 gain reduction of the uncertain decoupled nonlinear component of the decoupled architecture. Further, an LMI-based local AWC design methodology is derived by utilizing a local sector condition by means of a quadratic Lyapunov function, to resolve the windup problem for unstable cascade plants under saturation. To demonstrate the effectiveness of the proposed AWC schemes, an underactuated mechanical system, the ball-and-beam system, is considered, and details of the simulation and practical implementation results are described. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
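    The decoupled LMI syntheses in this record are too involved for a short sketch, but the windup phenomenon they address can be illustrated with classical back-calculation anti-windup, a different and much simpler AWC technique; the plant, gains, and saturation limit below are hypothetical.

```python
def saturate(u, limit):
    """Clamp the actuator command to +/- limit."""
    return max(-limit, min(limit, u))

def simulate(kp, ki, kt, limit, setpoint=1.0, dt=0.01, steps=2000):
    """PI control of a toy first-order plant dx/dt = -x + u_sat.

    kt is the back-calculation gain: kt > 0 bleeds the saturation
    excess back into the integrator (anti-windup); kt = 0 gives a
    plain PI controller whose integrator winds up while saturated.
    """
    x, integ = 0.0, 0.0
    trace = []
    for _ in range(steps):
        e = setpoint - x
        u = kp * e + ki * integ
        u_sat = saturate(u, limit)
        # back-calculation: feed (u_sat - u) back into the integrator
        integ += dt * (e + kt * (u_sat - u))
        x += dt * (-x + u_sat)
        trace.append(x)
    return trace

plain = simulate(kp=4.0, ki=8.0, kt=0.0, limit=1.5)
awc = simulate(kp=4.0, ki=8.0, kt=5.0, limit=1.5)
print(max(plain), max(awc))  # the compensated run overshoots less
```

Both runs settle at the setpoint; only the windup transient differs, which is the degradation the paper's architectures are designed to remove.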

  6. Faux-Pas Test: A Proposal of a Standardized Short Version.

    PubMed

    Fernández-Modamio, Mar; Arrieta-Rodríguez, Marta; Bengochea-Seco, Rosario; Santacoloma-Cabero, Iciar; Gómez de Tojeiro-Roce, Juan; García-Polavieja, Bárbara; González-Fraile, Eduardo; Martín-Carrasco, Manuel; Griffin, Kim; Gil-Sanz, David

    2018-06-26

    Previous research on theory of mind suggests that people with schizophrenia have difficulties with complex mentalization tasks that involve the integration of cognitive and affective mental states. One of the tools most commonly used to assess theory of mind is the Faux-Pas Test. However, it presents two main methodological problems: 1) the lack of a standard scoring system; 2) the different versions are not comparable, due to a lack of information on the stories used. These methodological problems make it difficult to draw conclusions about performance on this test by people with schizophrenia. The aim of this study was to develop a reduced version of the Faux-Pas Test with adequate psychometric properties. The test was administered to control and clinical groups. Interrater and test-retest reliability were analyzed for each story in order to select the set of 10 stories included in the final reduced version. The shortened version showed good psychometric properties for controls and patients, respectively: test-retest reliabilities of 0.97 and 0.78, inter-rater reliabilities of 0.95 and 0.87, and Cronbach's alphas of 0.82 and 0.72.
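    Internal-consistency coefficients like those reported above can be computed from raw item scores; a minimal sketch of Cronbach's alpha with invented story scores (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a k-item scale.

    items: list of k lists, each holding one item's scores for the
    same n respondents. Toy data below, not the study's.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    item_var = sum(var(col) for col in items)
    return k / (k - 1) * (1 - item_var / var(totals))

# three hypothetical story items scored 0-2 for five respondents
scores = [[2, 1, 0, 2, 1],
          [2, 1, 1, 2, 0],
          [1, 2, 0, 2, 1]]
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # → 0.77
```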

  7. [Production chain supply management for public hospitals: a logistical approach to healthcare].

    PubMed

    Infante, Maria; dos Santos, Maria Angélica Borges

    2007-01-01

    Despite their importance for hospital operations, discussions of healthcare organization logistics and supply and materials management are notably lacking in Brazilian literature. This paper describes a methodology for organizing the supply of medical materials in public hospitals, based on an action-research approach. Interventions were based on the assumption that a significant portion of problems in Brazil's National Health System (SUS) facilities derive from the fact that their clinical and administrative departments do not see themselves as belonging to the same production chain - neither the hospital nor the supply department is aware of what the other produces. The development of the methodology and its main steps are presented and discussed, against a background of recent literature and total quality and supply chain management concepts.

  9. An outer approximation method for the road network design problem.

    PubMed

    Asadi Bagloee, Saeed; Sarvi, Majid

    2018-01-01

    Best investment in road infrastructure, or network design, is perceived as a fundamental and benchmark problem in transportation. Given a set of candidate road projects with associated costs, finding the best subset with respect to a limited budget is known as the bilevel Discrete Network Design Problem (DNDP), of NP-hard computational complexity. We tackle this complexity with a hybrid exact-heuristic methodology based on a two-stage relaxation, as follows: (i) the bilevel feature is relaxed to a single-level problem by taking the network performance function of the upper level into the user-equilibrium traffic assignment problem (UE-TAP) in the lower level as a constraint, which results in a mixed-integer nonlinear programming (MINLP) problem solved using the Outer Approximation (OA) algorithm; (ii) we further relax the multi-commodity UE-TAP to a single-commodity MILP problem, that is, the multiple OD pairs are aggregated to a single OD pair. This methodology has two main advantages: (i) it proves highly efficient at solving the DNDP for the large-sized network of Winnipeg, Canada; the results suggest that within a limited number of iterations (the termination criterion), global optimum solutions are quickly reached in most cases, and otherwise good solutions (close to the global optimum) are found in early iterations. Comparative analysis on the Gao and Sioux-Falls networks shows that, for such a non-exact method, global optimum solutions are found in fewer iterations than in some analytically exact algorithms in the literature. (ii) Integration of the objective function among the constraints provides a commensurate capability to tackle the multi-objective (or multi-criteria) DNDP as well.
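    The upper level of the DNDP, choosing a budget-feasible subset of candidate projects, can be sketched by exhaustive search. The lower-level user-equilibrium assignment is replaced here by a hypothetical benefit callable, so this is only a toy illustration of the problem structure, not the OA algorithm of the paper.

```python
from itertools import combinations

def best_subset(projects, budget, network_benefit):
    """Exhaustive upper-level search of a discrete network design
    problem: pick the subset of candidate road projects that
    maximises a network performance function under a budget cap.

    network_benefit is a stand-in for the (expensive) lower-level
    equilibrium evaluation -- an assumption of this sketch.
    """
    names = list(projects)
    best, best_val = (), float("-inf")
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(projects[p] for p in subset)
            if cost > budget:
                continue          # infeasible under the budget
            val = network_benefit(subset)
            if val > best_val:
                best, best_val = subset, val
    return best, best_val

# hypothetical candidate projects with construction costs
projects = {"bypass": 4.0, "bridge": 3.0, "widening": 2.0, "ramp": 1.0}
# toy benefit: diminishing returns on total invested cost
benefit = lambda s: sum(projects[p] for p in s) ** 0.5
chosen, value = best_subset(projects, budget=6.0, network_benefit=benefit)
print(chosen, round(value, 2))
```

Exhaustive search is exponential in the number of projects, which is exactly why the paper resorts to the relaxations and the OA algorithm for realistic networks.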

  10. Applications of decision analysis and related techniques to industrial engineering problems at KSC

    NASA Technical Reports Server (NTRS)

    Evans, Gerald W.

    1995-01-01

    This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).

  11. Estimation of the limit of detection in semiconductor gas sensors through linearized calibration models.

    PubMed

    Burgués, Javier; Jiménez-Soto, Juan Manuel; Marco, Santiago

    2018-07-12

    The limit of detection (LOD) is a key figure of merit in chemical sensing. However, its estimation is hindered by the non-linear calibration curves characteristic of semiconductor gas sensor technologies such as metal oxide (MOX) sensors, gasFETs or thermoelectric sensors. Additionally, chemical sensors suffer from cross-sensitivities and temporal stability problems. The application of the International Union of Pure and Applied Chemistry (IUPAC) recommendations for univariate LOD estimation to non-linear semiconductor gas sensors is not straightforward, due to the strong statistical requirements of the IUPAC methodology (linearity, homoscedasticity, normality). Here, we propose a methodological approach to LOD estimation through linearized calibration models. As an example, the methodology is applied to the detection of low concentrations of carbon monoxide using MOX gas sensors in a scenario where the main source of error is the presence of uncontrolled levels of humidity. Copyright © 2018 Elsevier B.V. All rights reserved.
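    A minimal sketch of univariate LOD estimation once the calibration has been linearized, using the common 3.3·σ/slope convention; the CO-response data are invented and the procedure is a generic illustration, not the authors' exact method.

```python
def linear_fit(x, y):
    """Ordinary least squares y = a + b*x; returns (a, b, s_res),
    where s_res is the residual standard deviation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return a, b, s_res

# hypothetical linearized MOX calibration: x = CO concentration (ppm),
# y = sensor response after the linearizing transform
x = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
y = [0.05, 1.98, 4.10, 5.95, 8.02, 10.1]
a, b, s_res = linear_fit(x, y)
lod = 3.3 * s_res / b   # common 3.3*sigma/slope convention
print(round(lod, 2))
```

The residual standard deviation stands in for the blank standard deviation here; which noise estimate is appropriate is precisely the kind of question the linearized-model methodology addresses.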

  12. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces detected (blood, instruments and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the quantitative methodology proposed for this equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  13. A Reference Model for Monitoring IoT WSN-Based Applications.

    PubMed

    Capella, Juan Vicente; Campelo, José Carlos; Bonastre, Alberto; Ors, Rafael

    2016-10-30

    The Internet of Things (IoT) is, at this moment, one of the most promising technologies to have arisen in decades. Wireless Sensor Networks (WSNs) are one of the main pillars of many IoT applications, insofar as these require context-awareness information. The literature, however, reports many difficulties in their real implementation that have prevented their massive deployment. Additionally, in IoT environments where data producers and data consumers are not directly related, compatibility and certification issues become fundamental. Both problems would profit from accurate knowledge of the internal behavior of WSNs, which must be obtained through appropriate tools. There are many ad-hoc proposals, with no common structure or methodology, each intended to monitor a particular WSN. To overcome this problem, this paper proposes a structured three-layer reference model for WSN Monitoring Platforms (WSN-MP), which offers a standard environment for the design of new monitoring platforms to debug, verify and certify a WSN's behavior and performance, and which is applicable to every WSN. The model also allows comparative analysis of current proposals for monitoring the operation of WSNs. Following this methodology, it is possible to achieve a standardization of WSN-MP, promoting new research areas in order to solve the problems of each layer.

  14. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, Regulatory Agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); and (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
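    The non-dominated (Pareto) filtering at the heart of such a multi-objective solution can be sketched in a few lines; the cost and PM10-indicator pairs below are invented.

```python
def non_dominated(scenarios):
    """Keep the scenarios not dominated in (cost, air-quality index),
    both to be minimised: a scenario is dropped if another one is at
    least as good on both objectives and strictly better on one.
    """
    front = []
    for c, aq in scenarios:
        if any(c2 <= c and aq2 <= aq and (c2 < c or aq2 < aq)
               for c2, aq2 in scenarios):
            continue  # dominated: some scenario beats it outright
        front.append((c, aq))
    return sorted(front)

# hypothetical (emission-reduction cost, PM10 indicator) scenarios
scenarios = [(10, 40), (20, 25), (30, 24), (15, 30), (25, 24), (12, 45)]
print(non_dominated(scenarios))  # → [(10, 40), (15, 30), (20, 25), (25, 24)]
```

The sorted front is exactly the "efficient trade-off" handed to the decision maker: moving along it, every further air-quality gain costs more.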

  15. Evaluation of counseling outcomes at a university counseling center: the impact of clinically significant change on problem resolution and academic functioning.

    PubMed

    Choi, Keum-Hyeong; Buskey, Wendy; Johnson, Bonita

    2010-07-01

    The main purpose of this study was to investigate how receiving personal counseling at a university counseling center helps students deal with their personal problems and facilitates academic functioning. To that end, this study used both clinical and academic outcome measures that are relevant to the practice of counseling provided at a counseling center and its unique function in an institution of higher education. In addition, this study used the clinical significance methodology (N. S. Jacobson & P. Truax, 1991) that takes into account clients' differences in making clinically reliable and significant change. Pre-intake and post-termination surveys, including the Outcome Questionnaire (M. J. Lambert, K. Lunnen, V. Umphress, N. Hansen, & G. Burlingame, 1994), were completed by 78 clients, and the responses were analyzed using clinical significance methodology. The results revealed that those who made clinically reliable and significant change (i.e., the recovered group) reported the highest level of improvement in academic commitment to their educational goals and problem resolution, compared with those who did not make clinically significant change. The implications of the findings on practice for counseling at university counseling centers and for administrators in higher education institutions are discussed. (c) 2010 APA, all rights reserved.
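    The clinical significance methodology cited (N. S. Jacobson & P. Truax, 1991) rests on the reliable change index; a sketch with hypothetical Outcome Questionnaire numbers, not the study's data:

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson & Truax (1991) reliable change index.

    RCI = (post - pre) / S_diff, with S_diff = sqrt(2) * SE and
    SE = SD_pre * sqrt(1 - r_xx). |RCI| > 1.96 marks change that is
    unlikely (p < .05) to be mere measurement error.
    """
    se = sd_pre * math.sqrt(1.0 - reliability)
    s_diff = math.sqrt(2.0) * se
    return (post - pre) / s_diff

# hypothetical pre/post scores on a distress scale (lower = better)
rci = reliable_change_index(pre=85, post=60, sd_pre=18.0, reliability=0.93)
print(round(rci, 2))  # → -3.71, i.e. a reliable improvement
```

Classifying a client as "recovered" additionally requires crossing a clinical cutoff, which is how the study's recovered group is distinguished from merely improved clients.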

  16. [Research on psychosomatic disease. Various theoretical and methodologic aspects].

    PubMed

    Barbosa, A; Castanheira, J L; Cordeiro, J C

    1992-07-01

    This article outlines the present main lines of psychosomatic research, whether concerning the elimination of the concept of psychosomatic illness, its etiological understanding, or the peculiar ways of therapeutic approach. We specify some methodological problems resulting from using several instruments to collect and measure data. We analyse the theoretical relevance of the constructs of depressive equivalents and, especially, of alexithymia. Starting from the consensual phenomenological description of this construct, we explain its psychodynamic understanding, its neurophysiologic basis and its sociocultural determination. We question the relationship between alexithymia and psychosomatic illness. We point out the pertinence of its use as a risk or maintenance factor and the possibility of its modelling by environmental factors. We clarify the main heuristic contributions of this construct to psychosomatic investigation and we analyse, critically and concisely, the validity and reliability of some instruments built to measure it. Priority attention must be paid to psychosomatic investigation in the health area. We propose lines of investigation to be developed in our country from a multidisciplinary perspective.

  17. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
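    The first of the two problem classes mentioned, solving a system of nonlinear equations, can be illustrated with a plain Newton iteration on a toy two-equation system; nothing below reflects the framework's actual trajectory models.

```python
def newton_2d(f, jac, x0, tol=1e-10, max_iter=50):
    """Newton's method for a two-equation nonlinear system f(x) = 0,
    the kind of root-finding problem such consolidated design
    frameworks reduce to (toy equations below, not real dynamics).
    """
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        a, b, c, d = jac(x, y)        # 2x2 Jacobian, row-major
        det = a * d - b * c
        dx = (d * f1 - b * f2) / det  # solve J * delta = f by Cramer
        dy = (a * f2 - c * f1) / det
        x, y = x - dx, y - dy
        if abs(f1) < tol and abs(f2) < tol:
            break
    return x, y

# toy system: intersection of a circle (radius 2) and the line x = y
f = lambda x, y: (x * x + y * y - 4.0, x - y)
jac = lambda x, y: (2 * x, 2 * y, 1.0, -1.0)
x, y = newton_2d(f, jac, (1.0, 0.5))
print(round(x, 6), round(y, 6))  # converges to (sqrt(2), sqrt(2))
```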

  18. Decision-making regarding organ donation in Korean adults: A grounded-theory study.

    PubMed

    Yeun, Eun Ja; Kwon, Young Mi; Kim, Jung A

    2015-06-01

    The aim of this study was to identify the hidden patterns of behavior leading toward the decision to donate organs. Thirteen registrants at the Association for Organ Sharing in Korea were recruited. Data were collected using in-depth interviews, and the interview transcripts were analyzed using Glaserian grounded-theory methodology. The main problem of participants was "body attachment" and the core category (management process) was determined to be "pursuing life." The theme consisted of four phases: "hesitating," "investigating," "releasing," and "re-discovering." Therefore, to increase organ donations, it is important to find a strategy that will create positive attitudes about organ donation through education and public relations. These results explain and provide a deeper understanding of the main problem that Korean people have about organ donation and their management of decision-making processes. These findings can help care providers to facilitate the decision-making process and respond to public needs while taking into account the sociocultural context within which decisions are made. © 2014 Wiley Publishing Asia Pty Ltd.

  19. Decision-problem state analysis methodology

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1980-01-01

    A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but these appear to be one of the major causes of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered, as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.

  20. Adaptive multiresolution modeling of groundwater flow in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Malenica, Luka; Gotovac, Hrvoje; Srzic, Veljko; Andric, Ivo

    2016-04-01

    The proposed methodology was originally developed by our scientific team in Split, who designed a multiresolution approach for analyzing flow and transport processes in highly heterogeneous porous media. The main properties of the adaptive Fup multiresolution approach are: 1) the computational capabilities of Fup basis functions with compact support, able to resolve all spatial and temporal scales; 2) multiresolution presentation of the heterogeneity as well as of all other input and output variables; 3) an accurate, adaptive and efficient strategy; and 4) semi-analytical properties which increase our understanding of usually complex flow and transport processes in porous media. The main computational idea behind this approach is to separately find the minimum number of basis functions and resolution levels necessary to describe each flow and transport variable with the desired accuracy on a particular adaptive grid. Each variable is therefore analyzed separately, and the adaptive and multi-scale nature of the methodology enables not only computational efficiency and accuracy but also a description of subsurface processes closely tied to their physical interpretation. The methodology inherently supports a mesh-free procedure, avoiding classical numerical integration, and yields continuous velocity and flux fields, which is vitally important for flow and transport simulations. In this paper, we show recent improvements within the proposed methodology. Since the state-of-the-art multiresolution approach usually uses the method of lines and only a spatially adaptive procedure, the temporal approximation has rarely been treated as multiscale. Therefore, a novel adaptive implicit Fup integration scheme is developed, resolving all time scales within each global time step: the algorithm uses smaller time steps only in lines where solution changes are intensive. Application of Fup basis functions enables continuous time approximation, simple interpolation calculations across different temporal lines, and local time stepping control. A critical aspect of time-integration accuracy is the construction of the spatial stencil for the accurate calculation of spatial derivatives. Whereas the common approach applied for wavelets and splines uses a finite-difference operator, we developed here a collocation operator including solution values and the differential operator. In this way, the new improved algorithm is adaptive in space and time, enabling accurate solutions for groundwater flow problems, especially in highly heterogeneous porous media with large lnK variances and different correlation length scales. In addition, differences between the collocation and finite-volume approaches are discussed. Finally, results show application of the methodology to groundwater flow problems in highly heterogeneous confined and unconfined aquifers.
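    The local time-stepping idea, taking small steps only where the solution changes rapidly, can be illustrated with a far simpler step-doubling error control on explicit Euler; the paper's implicit Fup scheme is much more elaborate, and the equation and tolerances here are invented.

```python
def adaptive_euler(f, y0, t_end, tol=1e-4, dt0=0.5):
    """Local-error-controlled explicit Euler (a toy illustration of
    adaptive time stepping, not the Fup scheme).

    Step-doubling estimates the local error: one full step is
    compared against two half steps; the step is halved when the
    error exceeds tol and cautiously doubled when well below it.
    """
    t, y, dt = 0.0, y0, dt0
    steps = 0
    while t < t_end - 1e-12:
        dt = min(dt, t_end - t)
        full = y + dt * f(t, y)
        half = y + 0.5 * dt * f(t, y)
        two_half = half + 0.5 * dt * f(t + 0.5 * dt, half)
        err = abs(two_half - full)
        if err > tol:
            dt *= 0.5            # reject: solution changes too fast here
            continue
        t, y = t + dt, two_half  # accept the more accurate estimate
        steps += 1
        if err < 0.25 * tol:
            dt *= 2.0            # coarsen where the solution is smooth
    return y, steps

# decay dy/dt = -4y: small steps early, larger steps once it flattens
y, steps = adaptive_euler(lambda t, y: -4.0 * y, y0=1.0, t_end=2.0)
print(y, steps)
```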

  1. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on the application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach, and the numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  2. Bruxism is unlikely to cause damage to the periodontium: findings from a systematic literature assessment.

    PubMed

    Manfredini, Daniele; Ahlberg, Jari; Mura, Rossano; Lobbezoo, Frank

    2015-04-01

    This paper systematically reviews the MEDLINE and SCOPUS literature to answer the following question: Is there any evidence that bruxism may cause periodontal damage per se? Clinical studies on humans assessing the potential relationship between bruxism and periodontal lesions (i.e., decreased attachment level, bone loss, tooth mobility/migration, altered periodontal perception) were eligible. Methodologic shortcomings were identified through the Critical Appraisal Skills Program quality assessment, mainly concerning the internal validity of findings resting on unspecific bruxism diagnoses. The six included articles covered a wide variety of topics, without multiple papers on the same topic. Findings showed that the only effect of bruxism on periodontal structures was an increase in periodontal sensation, whereas a relationship with periodontal lesions was absent. Based on the analysis of the Hill criteria, the validity of causation conclusions was limited, mainly owing to the absence of a longitudinal evaluation of the temporal relationship and dose-response effects between bruxism and periodontal lesions. Despite the scarce quantity and quality of the literature, which prevent sound conclusions on the causal link between bruxism and the periodontal problems assessed in this review, it seems reasonable to suggest that bruxism cannot cause periodontal damage per se. It is also important to emphasize, however, that because of methodologic problems, particularly regarding sleep bruxism assessment, more high-quality studies (e.g., randomized controlled trials) are needed to further clarify this issue.

  3. Soft Systems Methodology and Problem Framing: Development of an Environmental Problem Solving Model Respecting a New Emergent Reflexive Paradigm.

    ERIC Educational Resources Information Center

    Gauthier, Benoit; And Others

    1997-01-01

    Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)

  4. Protein sequencing via nanopore based devices: a nanofluidics perspective

    NASA Astrophysics Data System (ADS)

    Chinappi, Mauro; Cecconi, Fabio

    2018-05-01

    Proteins perform a huge number of central functions in living organisms; thus, new techniques allowing their precise, fast and accurate characterization at the single-molecule level represent a breakthrough in proteomics, with important biomedical impact. In this review, we describe recent progress in the development of nanopore-based devices for protein sequencing. We start with a critical analysis of the main technical requirements for nanopore protein sequencing, summarizing some ideas and methodologies that have recently appeared in the literature. In the last sections, we focus on the physical modelling of the transport phenomena occurring in nanopore-based devices. The multiscale nature of the problem is discussed and, in this respect, some of the main possible computational approaches are illustrated.

  5. Optimization and Technological Development Strategies of an Antimicrobial Extract from Achyrocline alata Assisted by Statistical Design

    PubMed Central

    Demarque, Daniel P.; Fitts, Sonia Maria F.; Boaretto, Amanda G.; da Silva, Júlio César Leite; Vieira, Maria C.; Franco, Vanessa N. P.; Teixeira, Caroline B.; Toffoli-Kadri, Mônica C.; Carollo, Carlos A.

    2015-01-01

    Achyrocline alata, known as Jateí-ka-há, is traditionally used to treat several health problems, including inflammations and infections. This study aimed to optimize an active extract against Streptococcus mutans, the main bacterium causing caries. The extract was developed using accelerated solvent extraction and chemometric calculations. Factorial design and response surface methodologies were used to determine the most important variables, such as active compound selectivity. The standardized extraction recovered 99% of the four main compounds, gnaphaliin, helipyrone, obtusifolin and lepidissipyrone, which represent 44% of the extract. The optimized extract of A. alata has a MIC of 62.5 μg/mL against S. mutans and could be used in mouth care products. PMID:25710523

  6. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
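    Ingredient (2) above, identifying coherent objects in a gridded field, can be sketched with simple connected-component labelling, one possible stand-in for the statistical clustering the authors use; the precipitation field below is invented.

```python
from collections import deque

def label_objects(field, threshold):
    """Identify spatially coherent forecast objects: threshold the
    gridded field, then 4-connected flood fill groups the exceeding
    cells into labelled objects.
    """
    rows, cols = len(field), len(field[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for i in range(rows):
        for j in range(cols):
            if field[i][j] < threshold or labels[i][j]:
                continue
            current += 1                      # start a new object
            queue = deque([(i, j)])
            labels[i][j] = current
            while queue:                      # breadth-first flood fill
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and field[nr][nc] >= threshold
                            and not labels[nr][nc]):
                        labels[nr][nc] = current
                        queue.append((nr, nc))
    return labels, current

# toy precipitation field (mm) containing two separate rain objects
field = [[0, 5, 6, 0, 0],
         [0, 4, 0, 0, 7],
         [0, 0, 0, 8, 9]]
labels, n_objects = label_objects(field, threshold=3)
print(n_objects)  # → 2
```

Once cells are grouped, per-object features (location, size, intensity, shape) can be summarised from each label, which is what the regression step of the methodology then relates to the model parameters.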

  7. A new approach to the concept of "relevance" in information retrieval (IR).

    PubMed

    Kagolovsky, Y; Möhr, J R

    2001-01-01

    The concept of "relevance" is fundamental to information science in general and to information retrieval in particular. Although "relevance" is extensively used in the evaluation of information retrieval, there are considerable problems associated with reaching agreement on its definition, meaning, evaluation, and application in information retrieval. There are a number of different views on "relevance" and its use for evaluation. Based on a review of the literature, the main problems associated with the concept of "relevance" in information retrieval are identified. The authors argue that a solution to these problems can be based on a conceptual IR framework built using a systems-analytic approach to IR. Using this framework, different kinds of "relevance" relationships in the IR process are identified, and a methodology for evaluating "relevance" based on methods of semantics capturing and comparison is proposed.

  8. Facing the Language-Memory Problem in the Study of Autobiographical Memory.

    PubMed

    Bartoli, Eleonora; Smorti, Andrea

    2018-05-16

    This paper discusses the role of language in autobiographical memory, a problem that is barely considered in studies of autobiographical memories and narratives. As a matter of fact, most current studies of autobiographical memory conflate memory and narrative. The present paper focuses on two main issues. Firstly, it debates how narratives contribute to the construction of autobiographical memories through self-other communication. Secondly, it reflects on how language and communication should be manipulated in studies of autobiographical memory. The paper comprises three sections: the first discusses the role of language, particularly in the form of narrative, as a social tool by which autobiographical memories can be organised into a life story; the second examines previous methods of investigation used in the study of autobiographical memories; finally, the third proposes different methodological alternatives to overcome the problems emerging from our analysis of the literature.

  9. Connectivity-Preserving Approach for Distributed Adaptive Synchronized Tracking of Networked Uncertain Nonholonomic Mobile Robots.

    PubMed

    Yoo, Sung Jin; Park, Bong Seok

    2017-09-06

    This paper addresses a distributed connectivity-preserving synchronized tracking problem of multiple uncertain nonholonomic mobile robots with limited communication ranges. The information of the time-varying leader robot is assumed to be accessible to only a small fraction of follower robots. The main contribution of this paper is to introduce a new distributed nonlinear error surface for dealing with both the synchronized tracking and the preservation of the initial connectivity patterns among nonholonomic robots. Based on this nonlinear error surface, the recursive design methodology is presented to construct the approximation-based local adaptive tracking scheme at the robot dynamic level. Furthermore, a technical lemma is established to analyze the stability and the connectivity preservation of the total closed-loop control system in the Lyapunov sense. An example is provided to illustrate the effectiveness of the proposed methodology.

  10. Economic evaluation of health promotion interventions for older people: do applied economic studies meet the methodological challenges?

    PubMed

    Huter, Kai; Dubas-Jakóbczyk, Katarzyna; Kocot, Ewa; Kissimova-Skarbek, Katarzyna; Rothgang, Heinz

    2018-01-01

    In light of demographic developments, health promotion interventions for older people are gaining importance. In addition to the methodological challenges arising from the economic evaluation of health promotion interventions in general, there are specific methodological problems for the particular target group of older people. Four main methodological challenges in particular are discussed in the literature. They concern the measurement and valuation of informal caregiving, accounting for productivity costs, the effects of unrelated costs in added life years and the inclusion of 'beyond-health' benefits. This paper focuses on the question of whether and to what extent specific methodological requirements are actually met in applied health economic evaluations. Following a systematic review of pertinent health economic evaluations, the included studies are analysed on the basis of four assessment criteria derived from methodological debates on the economic evaluation of health promotion interventions in general and economic evaluations targeting older people in particular. Of the 37 studies included in the systematic review, only very few include cost and outcome categories discussed as being of specific relevance to the assessment of health promotion interventions for older people. The few studies that consider these aspects use very heterogeneous methods, so there is no common methodological standard. There is a strong need for the development of guidelines to achieve better comparability and to include cost categories and outcomes that are relevant for older people. Disregarding these methodological obstacles could implicitly lead to discrimination against the elderly in terms of health promotion and disease prevention and, hence, to an age-based rationing of public health care.

  11. Generation of three-dimensional body-fitted grids by solving hyperbolic partial differential equations

    NASA Technical Reports Server (NTRS)

    Steger, Joseph L.

    1989-01-01

    Hyperbolic grid generation procedures are described which have been used in external flow simulations about complex configurations. For many practical applications a single well-ordered (i.e., structured) grid can be used to mesh an entire configuration; in other problems, composite or unstructured grid procedures are needed. Although the hyperbolic partial differential equation grid generation procedure has mainly been utilized to generate structured grids, an extension of the procedure to semiunstructured grids is briefly described. Extensions of the methodology are also described using two-dimensional equations.
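
    As a much-simplified illustration of the marching character of hyperbolic grid generation, the sketch below grows a body-fitted structured grid outward from a circular "body" along local surface normals. This algebraic normal-marching is only an analogue of integrating the hyperbolic grid equations, and all values are invented.

```python
import numpy as np

# Body surface: a unit circle discretised into 64 points (invented example).
n_s, n_layers, step = 64, 10, 0.1
theta = np.linspace(0.0, 2.0 * np.pi, n_s, endpoint=False)
layer = np.column_stack([np.cos(theta), np.sin(theta)])

levels = [layer]
for _ in range(n_layers):
    # Tangent along the surface index direction via centred differences.
    tang = np.roll(layer, -1, axis=0) - np.roll(layer, 1, axis=0)
    # Outward normal: rotate the tangent by -90 degrees and normalise.
    normal = np.column_stack([tang[:, 1], -tang[:, 0]])
    normal /= np.linalg.norm(normal, axis=1, keepdims=True)
    layer = layer + step * normal     # march one grid layer outward
    levels.append(layer)

grid = np.stack(levels)               # structured grid, shape (11, 64, 2)
```

    A true hyperbolic scheme would additionally enforce orthogonality and cell-volume constraints at each marching step, which is what keeps such grids well behaved around concave and convex corners of complex configurations.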

  12. Generation of three-dimensional body-fitted grids by solving hyperbolic and parabolic partial differential equations

    NASA Technical Reports Server (NTRS)

    Steger, Joseph L.

    1989-01-01

    Hyperbolic grid generation procedures are described which have been used in external flow simulations about complex configurations. For many practical applications a single well-ordered (i.e., structured) grid can be used to mesh an entire configuration; in other problems, composite or unstructured grid procedures are needed. Although the hyperbolic partial differential equation grid generation procedure has mainly been utilized to generate structured grids, extension of the procedure to semiunstructured grids is briefly described. Extensions of the methodology are also described using two-dimensional equations.

  13. Community access to health information in Ireland.

    PubMed

    Macdougall, J

    1999-06-01

    This paper is based on a research project conducted on consumer health information (CHI) in the Republic of Ireland, the results of which were published in a report entitled Well Read: Developing Consumer Health Information in Ireland. The paper describes the research methodology and the Irish experience in relation to CHI followed by a discussion of access problems, illustrated with examples from the special needs and primary care sectors. The role of information providers in relation to primary healthcare and libraries is examined briefly, and finally the main research conclusions and recommendations are highlighted.

  14. Proceedings of the NATO-Advanced Study Institute on Computer Aided Analysis of Rigid and Flexible Mechanical Systems Held in Troia, Portugal on June 27-July 9, 1993. Volume 1. Main Lectures

    DTIC Science & Technology

    1993-07-09

    real-time simulation capabilities, highly non-linear control devices, work space path planning, active control of machine flexibilities and reliability...P.M., "The Information Capacity of the Human Motor System in Controlling the Amplitude of Movement," Journal of Experimental Psychology, Vol 47, No...driven many research groups in the challenging problem of flexible systems with an increasing interaction with finite element methodologies. Basic

  15. Typology of person-environment fit constellations: a platform addressing accessibility problems in the built environment for people with functional limitations.

    PubMed

    Slaug, Björn; Schilling, Oliver; Iwarsson, Susanne; Carlsson, Gunilla

    2015-09-02

    Making the built environment accessible for all, regardless of functional capacity, is an important goal for public health efforts. Considerable impediments to achieving this goal suggest the need for valid measurements of accessibility and for greater attention to the complexity of person-environment fit issues. To address these needs, this study aimed to provide a methodological platform useful for further research and instrument development within accessibility research. This was accomplished by the construction of a typology of problematic person-environment fit constellations, utilizing an existing methodology developed to assess and analyze accessibility problems in the built environment. By means of qualitative review and statistical methods, we classified the person-environment fit components covered by an existing application which targets housing accessibility: the Housing Enabler (HE) instrument. The International Classification of Functioning, Disability and Health (ICF) was used as a conceptual framework. Qualitative classification principles were based on conceptual similarities, and for quantitative analysis of similarities, Principal Component Analysis was carried out. We present a typology of problematic person-environment fit constellations classified along three dimensions: 1) accessibility problem range and severity, 2) aspects of functioning, and 3) environmental context. As a result of the classification of the HE components, 48 typical person-environment fit constellations were recognised. The main contribution of this study is the proposed typology of person-environment fit constellations. The typology provides a methodological platform for the identification and quantification of problematic person-environment fit constellations. Its link to the globally accepted ICF classification system facilitates communication within the scientific and health care practice communities. The typology also highlights how relations between aspects of functioning and physical environmental barriers generate typical accessibility problems, and thereby furnishes a reference point for research oriented to how the built environment may be designed to be supportive of activity, participation and health.
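
    The Principal Component Analysis step used above for the quantitative similarity analysis can be sketched with a plain SVD; the data below are invented stand-ins for HE component scores, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented case-by-component data: 200 assessed cases scored on 6
# person-environment fit components driven by 2 underlying factors.
latent = rng.standard_normal((200, 2))
loadings = rng.standard_normal((2, 6))
data = latent @ loadings + 0.05 * rng.standard_normal((200, 6))

# PCA via singular value decomposition of the column-centred data matrix.
centred = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / (s**2).sum()   # variance share of each principal component
scores = centred @ Vt.T           # case coordinates in principal-component space
```

    Components with similar loadings on the leading principal axes would then be candidates for grouping into the same typological dimension.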

  16. Eco-analytical Methodology in Environmental Problems Monitoring

    NASA Astrophysics Data System (ADS)

    Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.

    2017-01-01

    Among the problems common to all mankind, whose solutions influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists develops quite spontaneously, as a complement to productive activities. Therefore, the challenge that environmental problems pose for science is the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity. The aim is thus to find the optimal trajectory of industrial development to prevent irreversible problems in the biosphere that could stop the progress of civilization.

  17. Learning Science in the 21st century - a shared experience between schools

    NASA Astrophysics Data System (ADS)

    Pinto, Tânia; Soares, Rosa; Ruas, Fátima

    2015-04-01

    Problem Based Learning (PBL) is considered an innovative inquiry-based teaching and learning methodology that is student centred, focused on the resolution of an authentic problem, and in which the teacher acts as a facilitator of work in small groups. In this process, it is expected that students develop attitudinal, procedural and communication skills, in addition to the cognitive skills typically valued. PBL implementation also allows the use of multiple educational strategies, such as laboratory experiments, analogue modelling or ICT (video animations, electronic presentations or software simulations, for instance), which can foster a more interactive environment in the classroom. In this study, conducted in three schools in the north of Portugal as a result of cooperation between three science teachers, with a sample of 75 individuals, students' opinions were examined about the main difficulties and strengths of the PBL methodology, having as a common denominator the use of a laboratory experiment followed by suitable digital software as an educational resource to interpret the obtained results and to make predictions (e.g. EarthQuake, Virtual Quake, Stellarium). The data collection methods were based on direct observation and questionnaires. The results globally show that this educational approach motivates students towards science, helping them to solve problems from daily life, and that the use of software was relevant, as was collaborative work. The cognitive strand continues to be the most valued by pupils.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuzmina, L.K.

    The research deals with different aspects of mathematical modelling and the analysis of complex dynamic non-linear systems arising from applied problems in mechanics (in particular those for gyrosystems, stabilization and orientation systems, and control systems of movable objects, including aviation and aerospace systems). The non-linearity, multi-connectedness and high dimensionality of the dynamical problems that occur in the initial full statement lead to the need to narrow the problem and to decompose the full model, while preserving its main properties and qualitative equivalence. The elaboration of regular methods for modelling problems in dynamics and the generalization of the reduction principle are the main aims of the investigations. Here, a uniform methodology based on Lyapunov's methods, founded by N.G. Chetayev, is developed. The objects of the investigations are considered from a particular position, as systems of the singularly perturbed class, treated as ones with singular parametrical perturbations. This is the natural extension of the statements of N.G. Chetayev and P.A. Kuzmin on parametrical stability. In the paper, systematic procedures for the construction of correct simplified (comparison) models are developed, the validity conditions of the transition are determined, appraisals are obtained, and regular algorithms of engineering level are derived. As applied to stabilization and orientation systems with gyroscopic controlling subsystems, these methods enable building a hierarchical sequence of admissible simplified models and determining the conditions of their correctness.

  19. Dynamic Decision Making under Uncertainty and Partial Information

    DTIC Science & Technology

    2017-01-30

    order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial...information. In the course of this research, we developed and studied efficient simulation-based methodologies for dynamic decision making under...uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those

  20. Size Distributions of Solar Proton Events: Methodological and Physical Restrictions

    NASA Astrophysics Data System (ADS)

    Miroshnichenko, L. I.; Yanke, V. G.

    2016-12-01

    Based on the new catalogue of solar proton events (SPEs) for the period 1997 - 2009 (Solar Cycle 23), we revisit the long-studied problem of event-size distributions in the context of those constructed for other solar-flare parameters. Recent results on the problem of size distributions of solar flares and proton events are briefly reviewed. Even a cursory acquaintance with this research field reveals a rather mixed and controversial picture. We concentrate on three main issues: i) the SPE size distribution for > 10 MeV protons in Solar Cycle 23; ii) the size distribution of > 1 GV proton events in 1942 - 2014; iii) variations of annual numbers of > 10 MeV proton events on long time scales (1955 - 2015). Different results are critically compared; most of the studies in this field are shown to suffer from vastly different input datasets as well as from insufficient knowledge of the underlying physical processes in the SPEs under consideration. New studies in this field should be made on more distinct physical and methodological bases. It is important to note the evident similarity in the size distributions of solar flares and of superflares in Sun-like stars.
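
    Event-size distributions of this kind are commonly summarised by a power-law index. A minimal, hedged sketch of the standard continuous maximum-likelihood estimator, applied here to invented synthetic data rather than to the SPE catalogue used above, is:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented "event sizes" drawn from a pure power law p(x) ~ x**(-alpha)
# for x >= xmin, via inverse-transform sampling.
alpha_true, xmin, n = 2.5, 10.0, 5000
u = rng.random(n)
sizes = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

# Continuous maximum-likelihood estimate of the power-law index
# (the standard Hill-type estimator, assuming xmin is known).
alpha_hat = 1.0 + n / np.sum(np.log(sizes / xmin))
```

    The sensitivity of such fits to the choice of xmin and to the size of the input dataset is one reason the studies compared above reach such different conclusions.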

  1. Hospital Management Between The Modern Image And Aging

    NASA Astrophysics Data System (ADS)

    Dadulescu, Ana-Maria

    2015-09-01

    Hospital management has experienced significant progress with the evolution of the Romanian health system reform: strides have been made in terms of resource allocation and cost control, and new systems for classification, evaluation and monitoring (DRGs, SIUI, CaPeSaRo) have been implemented, some taken from other countries and adapted to local conditions, but not always integrated with the other components and sometimes incompletely implemented and developed. This material does not offer definitive solutions to current problems. It only briefly addresses the main aspects of hospital activity and points out some failures with which hospital managers are presently faced. Once the problems are identified, the prerequisites for solving them are created, opening channels for research and for the development of new methodologies or the correction of existing deficient workflows.

  2. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  3. Infrared stereo calibration for unmanned ground vehicle navigation

    NASA Astrophysics Data System (ADS)

    Harguess, Josh; Strange, Shawn

    2014-06-01

    The problem of calibrating two color cameras as a stereo pair has been heavily researched, and many off-the-shelf software packages, such as Robot Operating System and OpenCV, include calibration routines that work in most cases. However, the problem of calibrating two infrared (IR) cameras for the purposes of sensor fusion and point cloud generation is relatively new, and many challenges exist. We present a comparison of color camera and IR camera stereo calibration using data from an unmanned ground vehicle. There are two main challenges in IR stereo calibration: the calibration board (material, design, etc.) and the accuracy of calibration pattern detection. We present our analysis of these challenges along with our IR stereo calibration methodology. Finally, we present our results both visually and analytically with computed reprojection errors.
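
    The reprojection-error metric mentioned above can be illustrated with a minimal pinhole-camera model in plain NumPy. The intrinsics, pose, and noise level below are invented, and this is only a sketch of the metric, not a substitute for the OpenCV calibration routines.

```python
import numpy as np

rng = np.random.default_rng(3)

def project(points, K, R, t):
    """Pinhole projection of 3-D world points (N, 3) to pixels (N, 2)."""
    cam = points @ R.T + t            # world frame -> camera frame
    uvw = cam @ K.T                   # apply the intrinsic matrix
    return uvw[:, :2] / uvw[:, 2:3]   # perspective division

# Invented intrinsics and pose (not from a real calibration).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

# A flat 6x4 calibration-board grid with 30 mm square spacing.
gx, gy = np.meshgrid(np.arange(6), np.arange(4))
board = np.column_stack([gx.ravel() * 0.03, gy.ravel() * 0.03,
                         np.zeros(gx.size)])

# Simulated corner detections: ideal projections plus 0.5 px detection noise.
observed = project(board, K, R, t) + 0.5 * rng.standard_normal((board.shape[0], 2))
reprojected = project(board, K, R, t)
rms_error = float(np.sqrt(np.mean(np.sum((observed - reprojected) ** 2, axis=1))))
```

    In a real calibration the noise term is replaced by the actual detected corners, so the RMS reprojection error directly measures how well the estimated camera model explains the detections; noisier IR pattern detection inflates it.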

  4. A new diagnostic of stratospheric polar vortices

    NASA Astrophysics Data System (ADS)

    Gimeno, Luis; de La Torre, Laura; Nieto, Raquel; Gallego, David; Ribera, Pedro; García-Herrera, Ricardo

    2007-11-01

    We studied the main climatological features of the Arctic and Antarctic stratospheric vortices, using a new approach based on defining the vortex edge as the 50 hPa geostrophic streamline of maximum average velocity in each hemisphere. Given the use of NCAR-NCEP reanalysis data, it was thought advisable to limit the study to the periods 1958-2004 for the Northern Hemisphere (NH) and 1979-2004 for the Southern Hemisphere (SH). After describing the method and testing sample results against those from other approaches, we analysed the climatological means and trends of the four most distinctive characteristics of the vortices: average latitude, strength, area, and temperature. In general terms, our results confirm most of what is already known about the stratospheric vortices from previous studies that used different data and approaches. In addition, the new methodology provides some interesting new quantifications of the dominant wavenumber and its interannual variability, as well as of the principal variability modes, through an empirical orthogonal function analysis performed directly on the vortex trajectories. The main drawbacks of the methodology, such as noticeable problems in characterising highly disturbed stratospheric structures such as multiple or off-pole vortices, are also identified.

  5. Info-Gap robustness pathway method for transitioning of urban drainage systems under deep uncertainties.

    PubMed

    Zischg, Jonatan; Goncalves, Mariana L R; Bacchin, Taneha Kuzniecow; Leonhardt, Günther; Viklander, Maria; van Timmeren, Arjan; Rauch, Wolfgang; Sitzenfrei, Robert

    2017-09-01

    In the urban water cycle, there are different ways of handling stormwater runoff. Traditional systems mainly rely on underground piped infrastructure, sometimes termed 'gray' infrastructure. Newer, so-called 'green/blue' approaches aim to treat and convey the runoff at the surface. Such concepts are mainly based on ground infiltration and temporary storage. In this work, a methodology to create and compare different planning alternatives for stormwater handling on their pathways to a desired system state is presented. Investigations are made to assess the system performance and robustness when facing the deeply uncertain spatial and temporal developments of the future urban fabric, including impacts caused by climate change, urbanization and other disruptive events, such as shifts in the network layout and interactions of 'gray' and 'green/blue' structures. With the Info-Gap robustness pathway method, three planning alternatives are evaluated to identify critical performance levels at different stages over time. This novel methodology is applied to a real case study in which a city relocation process will take place during the coming decades. In this case study it is shown that hybrid systems including green infrastructure are more robust with respect to future uncertainties than traditional network designs.
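
    The Info-Gap robustness idea referenced above can be sketched with a toy one-parameter model: robustness is the largest horizon of uncertainty for which the worst-case performance still meets a critical requirement. All numbers below are invented.

```python
import numpy as np

# Invented performance model: a drainage system of capacity q must carry an
# uncertain runoff load u; the performance q - u must stay above r_crit.
q, u_nominal, r_crit = 12.0, 8.0, 1.0

def worst_case_performance(alpha):
    """Worst performance over the uncertainty set |u - u_nominal| <= alpha."""
    return q - (u_nominal + alpha)

# Info-Gap robustness: the largest horizon of uncertainty alpha for which
# even the worst case still meets the critical requirement.
alphas = np.linspace(0.0, 10.0, 10001)
feasible = alphas[worst_case_performance(alphas) >= r_crit]
robustness = float(feasible.max())   # analytically q - u_nominal - r_crit = 3.0
```

    Comparing planning alternatives then amounts to comparing their robustness curves: an alternative whose robustness stays higher across the required performance levels, as the hybrid green/gray systems do in the case study, is preferred under deep uncertainty.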

  6. Genomic and metagenomic technologies to explore the antibiotic resistance mobilome.

    PubMed

    Martínez, José L; Coque, Teresa M; Lanza, Val F; de la Cruz, Fernando; Baquero, Fernando

    2017-01-01

    Antibiotic resistance is a relevant problem for human health that requires global approaches to establish a deep understanding of the processes of acquisition, stabilization, and spread of resistance among human bacterial pathogens. Since natural (nonclinical) ecosystems are reservoirs of resistance genes, a health-integrated study of the epidemiology of antibiotic resistance requires the exploration of such ecosystems with the aim of determining the role they may play in the selection, evolution, and spread of antibiotic resistance genes, involving the so-called resistance mobilome. High-throughput sequencing techniques allow an unprecedented opportunity to describe the genetic composition of a given microbiome without the need to subculture the organisms present inside. However, bioinformatic methods for analyzing this bulk of data, mainly with respect to binning each resistance gene with the organism hosting it, are still in their infancy. Here, we discuss how current genomic methodologies can serve to analyze the resistance mobilome and its linkage with different bacterial genomes and metagenomes. In addition, we describe the drawbacks of current methodologies for analyzing the resistance mobilome, mainly in cases of complex microbiotas, and discuss the possibility of implementing novel tools to improve our current metagenomic toolbox. © 2016 New York Academy of Sciences.

  7. A guided tour of current research in synovial joints with reference to wavelet methodology

    NASA Astrophysics Data System (ADS)

    Agarwal, Ruchi; Salimath, C. S.; Alam, Khursheed

    2017-10-01

    The main aim of this article is to provide a comprehensive overview of the biomechanical aspects of the synovial joints of the human body. It can be considered part of the continued research carried out by various authors over a period of time. Almost every person has suffered from joint disease at some point in life; this has triggered intensive investigation into various biomechanical aspects of synovial joints. It has also resulted in an increase in arthroplasty, with the introduction of various clinical trials. Over the last few decades, improvements and ideas for new technologies have been introduced to decrease the incidence of joint problems. In this paper, a literature survey of recent advances in, developments in and recognition of wear and tear of the human joint is presented. The wavelet method in computational fluid dynamics (CFD) is a relatively new research field, and this review aims to provide a glimpse of wavelet methodology in CFD. Wavelet methodology has played a vital role in the solution of the governing equation of synovial fluid flow in synovial joints, represented by the Reynolds equation and its modified version.

  8. The use of repetitive transcranial magnetic stimulation for modulating craving and addictive behaviours: a critical literature review of efficacy, technical and methodological considerations.

    PubMed

    Grall-Bronnec, M; Sauvaget, A

    2014-11-01

    Repetitive transcranial magnetic stimulation (rTMS) is a potential therapeutic intervention for the treatment of addiction. This critical review aims to summarise recent developments with respect to the efficacy of rTMS for all types of addiction and related disorders (including eating disorders), and concentrates on the associated methodological and technical issues. The bibliographic search consisted of a computerised screening of the Medline and ScienceDirect databases up to December 2013. Criteria for inclusion were that the target problem was an addiction, a related disorder, or craving; that the intervention was performed using rTMS; and that the study was a clinical trial. Of the potential 638 articles, 18 met the criteria for inclusion. Most of these (11 of the 18) supported the efficacy of rTMS, especially in the short term. In most cases, the main assessment criterion was the measurement of craving using a Visual Analogue Scale. The results are discussed with respect to the study limitations and, in particular, the many methodological and technical discrepancies that were identified. Key recommendations are provided.

  9. Ergonomic assessment methodologies in manual handling of loads--opportunities in organizations.

    PubMed

    Pires, Claudia

    2012-01-01

    The present study was developed based on the analysis of workplaces in the engineering industry, particularly in automotive companies. The main objectives of the study were to evaluate workplace activities involving manual handling, using the assessment methodologies of the NIOSH ergonomic equation [1] and Manual Material Handling [2], presented in ISO 11228 [3-4], and to consider the possibility of developing musculoskeletal injuries associated with these activities, an issue of great concern in all industrial sectors. The suitability of each method to the task concerned was also shown. The study was conducted in three steps. The first step was to collect images and information about the target tasks. The second step was the analysis: determining the method to use and evaluating the activities. Finally, the results obtained were reviewed and acted on accordingly. The study identified situations requiring urgent action, according to the methodologies used, and solutions were developed to solve the problems identified, eliminating and/or minimizing situations that were awkward and harmful to employees.

  10. Diverse and participative learning methodologies: a remedial teaching intervention for low marks dental students in Chile.

    PubMed

    Alcota, Marcela; Muñoz, Andrea; González, Fermín E

    2011-10-01

    The purpose of this educational intervention was to diagnose the learning styles of a group of low-mark (i.e., low-grade) dental students in Chile and improve their academic achievement by means of remedial teaching. The intervention group was composed of ten students in endodontics and eleven in pedodontics with low marks. These two groups were mutually exclusive. The Kolb test of learning styles was applied to the low-mark students and to the rest of the class (n=72). Diverse methodologies were applied to the low-mark students, such as seminars, case-based and problem-based learning, directed study, plenary discussions and debate, integration and questions, and web-based learning, in an effort to cover all learning styles. Students' perceptions of the educational intervention were assessed by means of a questionnaire. The learning styles of the low-mark group were mainly divergent (52.4 percent) and convergent (19 percent); accommodators and assimilators were 14.3 percent each. The rest of the class showed a very different frequency distribution: divergent 18 percent, convergent 20 percent, accommodators 28 percent, and assimilators 34 percent. After the educational intervention, the mean of the scores obtained by the intervention group in formal evaluations was higher than the average scores obtained before the intervention for both courses. Students perceived the activities as effective for their learning process (76 percent) and the teaching methodologies as useful mainly for clarifying concepts and contents from both courses (82 percent). We can conclude that the use of diverse and participative teaching methodologies in a remedial teaching intervention, covering all the different learning styles of the students, contributes to improving their marks in formal evaluations.

  11. Interventions to improve cultural competency in health care for Indigenous peoples of Australia, New Zealand, Canada and the USA: a systematic review.

    PubMed

    Clifford, Anton; McCalman, Janya; Bainbridge, Roxanne; Tsey, Komla

    2015-04-01

    This article describes the characteristics and reviews the methodological quality of interventions designed to improve cultural competency in health care for Indigenous peoples of Australia, New Zealand, Canada and the USA. A total of 17 electronic databases and 13 websites were searched for the period 2002-13. Studies were included if they evaluated an intervention strategy designed to improve cultural competency in health care for Indigenous peoples of Australia, New Zealand, the USA or Canada. Information on the characteristics and methodological quality of included studies was extracted using standardized assessment tools. Sixteen published evaluations of interventions to improve cultural competency in health care for Indigenous peoples were identified: 11 for Indigenous peoples of the USA and 5 for Indigenous Australians. The main types of intervention strategies were education and training of the health workforce, culturally specific health programs and recruitment of an Indigenous health workforce. The main positive outcomes reported were improvements in health professionals' confidence and in patients' satisfaction with and access to health care. The methodological quality of evaluations and the reporting of key methodological criteria were variable. Particular problems included weak study designs, low or no reporting of consent rates, confounding and non-validated measurement instruments. There is a lack of evidence from rigorous evaluations on the effectiveness of interventions for improving cultural competency in health care for Indigenous peoples. Future evaluations should employ more rigorous study designs and extend their measurement of outcomes beyond those relating to health professionals to those relating to the health of Indigenous peoples. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  12. Meeting EHR security requirements: SeAAS approach.

    PubMed

    Katt, Basel; Trojer, Thomas; Breu, Ruth; Schabetsberger, Thomas; Wozak, Florian

    2010-01-01

    In the last few years, Electronic Health Record (EHR) systems have received great attention in the literature, as well as in industry. They are expected to lead to health care savings, increase health care quality and reduce medical errors. This interest has been accompanied by the development of different standards and frameworks to meet EHR challenges. One of the most important initiatives developed to solve problems of EHR is IHE (Integrating the Healthcare Enterprise), which adopts a distributed approach to storing and managing healthcare data. IHE aims at standardizing the way healthcare systems exchange information in distributed environments. For this purpose it defines several so-called Integration Profiles that specify the interactions and the interfaces (Transactions) between various healthcare systems (Actors) or entities. Security was also considered in a few profiles that tackle the main security requirements, mainly authentication and audit trails. The security profiles of IHE currently suffer from two drawbacks. First, they apply an end-point security methodology, which has recently been proven to be insufficient and cumbersome in distributed and heterogeneous environments. Second, the current security profiles for more complex security requirements are oversimplified, vague and do not consider architectural design. This has recently changed to some extent, e.g., with the introduction of newly published white papers regarding privacy [5] and access control [9]. In order to solve the first problem, we utilize results of previous studies conducted in the area of security-aware IHE-based systems and the state-of-the-art Security-as-a-Service approach as a convenient methodology to group domain-wide security needs and overcome the end-point security shortcomings.

  13. Potential of neuro-fuzzy methodology to estimate noise level of wind turbines

    NASA Astrophysics Data System (ADS)

    Nikolić, Vlastimir; Petković, Dalibor; Por, Lip Yee; Shamshirband, Shahaboddin; Zamani, Mazdak; Ćojbašić, Žarko; Motamedi, Shervin

    2016-01-01

    Wind turbine noise has become a significant problem as the number of wind farms grows and renewable energy becomes one of the most influential energy sources. However, wind turbine noise generation and propagation are not understood in all aspects. Mechanical noise of wind turbines can be neglected, since the aerodynamic noise of the blades is the main source of noise generation. Numerical simulation of the noise effects of a wind turbine can be a very challenging task; therefore, in this article a soft computing method is used to evaluate the noise level of wind turbines. The main goal of the study is to estimate wind turbine noise with regard to wind speed at different heights and for different sound frequencies. An adaptive neuro-fuzzy inference system (ANFIS) is used to estimate the wind turbine noise levels.
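    As a rough illustration of the Sugeno-type inference that underlies ANFIS, the following minimal Python sketch evaluates two hypothetical fuzzy rules mapping wind speed to a noise level. The membership centers and rule coefficients are illustrative stand-ins, not values fitted in the study:

```python
import numpy as np

def gauss_mf(x, c, s):
    """Gaussian membership function with center c and width s."""
    return np.exp(-((x - c) ** 2) / (2.0 * s ** 2))

def sugeno_noise_estimate(wind_speed):
    """First-order Sugeno inference with two hypothetical rules:
    IF wind is LOW  THEN noise = 0.8*speed + 30
    IF wind is HIGH THEN noise = 1.5*speed + 35
    (These are the consequent layers ANFIS would train from data.)"""
    w_low = gauss_mf(wind_speed, 4.0, 3.0)    # membership in "low wind"
    w_high = gauss_mf(wind_speed, 14.0, 3.0)  # membership in "high wind"
    z_low = 0.8 * wind_speed + 30.0
    z_high = 1.5 * wind_speed + 35.0
    # Weighted-average defuzzification of the rule outputs
    return (w_low * z_low + w_high * z_high) / (w_low + w_high)
```

    In a real ANFIS, the membership parameters and linear consequents would be adjusted by hybrid least-squares/backpropagation training against measured noise data.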

  14. Knowledge based system and decision making methodologies in materials selection for aircraft cabin metallic structures

    NASA Astrophysics Data System (ADS)

    Adhikari, Pashupati Raj

    Materials selection processes have been among the most important aspects of product design and development. A knowledge-based system (KBS) and some of the methodologies used in materials selection for the design of aircraft cabin metallic structures are discussed. Overall aircraft weight reduction means substantially less fuel consumption; part of the solution to this problem is to find a way to reduce the overall weight of metallic structures inside the cabin. Among various methodologies of materials selection using Multi Criterion Decision Making (MCDM) techniques, a few are demonstrated with examples, and the results are compared with those obtained using Ashby's approach to materials selection. Pre-defined constraint values, mainly mechanical properties, are employed as relevant attributes in the process. Aluminum alloys with high strength-to-weight ratio have been second to none in most aircraft parts manufacturing. Magnesium alloys that are much lighter in weight, as alternatives to the Al-alloys currently in use in the structures, are tested using the methodologies, and the ranked results are compared. Each material attribute considered in the design is categorized as a benefit or non-benefit attribute. Using Ashby's approach, material indices that are required to be maximized for optimum performance are determined, and materials are ranked based on the average of consolidated index rankings. Ranking results are compared for any disparity among the methodologies.
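    To illustrate the kind of benefit/non-benefit ranking an MCDM technique performs, here is a minimal simple-additive-weighting sketch in Python. The alloys, attribute values and weights are hypothetical examples chosen for illustration, not the data used in this work:

```python
import numpy as np

# Hypothetical decision matrix: rows = candidate alloys, columns = attributes.
# Attributes: [yield strength (MPa), density (g/cm^3)]; density is non-benefit.
materials = ["Al-2024", "Al-7075", "Mg-AZ31"]
X = np.array([
    [324.0, 2.78],
    [503.0, 2.81],
    [200.0, 1.77],
])
weights = np.array([0.6, 0.4])       # illustrative attribute weights
benefit = np.array([True, False])    # maximize strength, minimize density

# Simple additive weighting: normalize benefit columns by the column maximum,
# and non-benefit columns by (column minimum / value) so smaller is better.
norm = X / X.max(axis=0)
norm[:, ~benefit] = X[:, ~benefit].min(axis=0) / X[:, ~benefit]
scores = norm @ weights
ranking = [materials[i] for i in np.argsort(-scores)]
```

    With these illustrative weights the high-strength alloy ranks first; changing the weight on density shifts the ranking toward the lighter magnesium alloy, which is exactly the trade-off the abstract describes.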

  15. Methodological issues in the study of violence against women

    PubMed Central

    Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen

    2007-01-01

    The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113

  16. Performance Parameters Analysis of an XD3P Peugeot Engine Using Artificial Neural Networks (ANN) Concept in MATLAB

    NASA Astrophysics Data System (ADS)

    Rangaswamy, T.; Vidhyashankar, S.; Madhusudan, M.; Bharath Shekar, H. R.

    2015-04-01

    The current trends of engineering follow the basic rule of innovation in mechanical engineering aspects. For engineers to be efficient, problem solving needs to be viewed from a multidimensional perspective. One such methodology is the fusion of technologies from other disciplines in order to solve problems. This paper deals with the application of neural networks to analyze the performance parameters of an XD3P Peugeot engine (used in the Ministry of Defence). The work is divided into two main stages. In the former stage, experimentation on an IC engine is carried out in order to obtain the primary data. In the latter stage, the primary database formed is used to design and implement a predictive neural network to analyze the variation of the output parameters with respect to each other. A mathematical governing equation for the neural network is obtained; the resulting polynomial equation describes the characteristic behavior of the built neural network system. Finally, a comparative study of the results is carried out.
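    A minimal sketch of the second stage: training a small feedforward network by gradient descent on a one-dimensional stand-in for the engine data. The data, architecture and hyperparameters here are illustrative, not the XD3P database or network from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for engine test data: normalized load (x) vs.
# a smooth performance curve (y); real data would come from the test rig.
x = np.linspace(0.0, 1.0, 40).reshape(-1, 1)
y = np.sin(2.0 * np.pi * x) * 0.5 + 0.5

# One hidden layer of 8 tanh units, linear output; full-batch gradient descent.
W1 = rng.normal(0.0, 1.0, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(10000):
    h = np.tanh(x @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # gradient of 0.5*mean squared error
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)    # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
```

    A fitted polynomial of the network's response surface, as described in the abstract, could then be obtained by regressing the trained network's predictions on powers of the inputs.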

  17. Compressed learning and its applications to subcellular localization.

    PubMed

    Zheng, Zhong-Long; Guo, Li; Jia, Jiong; Xie, Chen-Mao; Zeng, Wen-Cai; Yang, Jie

    2011-09-01

    One of the main challenges faced by biological applications is to predict protein subcellular localization automatically and accurately. To achieve this, a wide variety of machine learning methods have been proposed in recent years. Most of them focus on finding the optimal classification scheme, and fewer take into account simplifying the complexity of biological systems. Traditionally, such bio-data are analyzed by first performing feature selection before classification. Motivated by CS (Compressed Sensing) theory, we propose a methodology that performs compressed learning with a sparseness criterion such that feature selection and dimension reduction are merged into one analysis. The proposed methodology decreases the complexity of the biological system while increasing protein subcellular localization accuracy. Experimental results are quite encouraging, indicating that the aforementioned sparse methods are quite promising in dealing with complicated biological problems, such as predicting the subcellular localization of Gram-negative bacterial proteins.
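    The idea of merging feature selection into learning via a sparseness criterion can be sketched with an L1-regularized solver (ISTA, iterative soft-thresholding). The synthetic design matrix below stands in for protein feature data; this is a generic sparse-recovery sketch, not the paper's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 60, 100, 4            # samples, features, true informative features
A = rng.normal(size=(n, p)) / np.sqrt(n)   # stand-in feature matrix
w_true = np.zeros(p); w_true[:k] = [3.0, -2.0, 1.5, 2.5]
y = A @ w_true                  # noiseless targets, for clarity

lam = 0.05                      # L1 penalty weight
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L for the smooth part
w = np.zeros(p)
for _ in range(1000):
    g = A.T @ (A @ w - y)                  # gradient of 0.5*||Aw - y||^2
    z = w - step * g
    # Soft-thresholding: the proximal step that zeroes out weak features
    w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

support = np.flatnonzero(np.abs(w) > 0.2)  # selected features
```

    The nonzero entries of `w` identify the selected features and simultaneously give the predictive weights, so selection and learning happen in one analysis, which is the essence of the compressed-learning viewpoint.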

  18. [Research on the Application of Fuzzy Logic to Systems Analysis and Control

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Research conducted with the support of NASA Grant NCC2-275 has focused mainly on the development of fuzzy logic and soft computing methodologies and their applications to systems analysis and control, with emphasis on problem areas which are of relevance to NASA's missions. One of the principal results of our research has been the development of a new methodology called Computing with Words (CW). Basically, in CW words drawn from a natural language are employed in place of numbers for computing and reasoning. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers, and second, when there is a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost, and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW.

  19. Probabilistic self-organizing maps for continuous data.

    PubMed

    Lopez-Rubio, Ezequiel

    2010-10-01

    The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
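    For contrast with the probabilistic variants surveyed, a minimal classic (non-probabilistic) SOM update rule can be written in a few lines. The map size, data and schedules below are toy choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.uniform(0.0, 1.0, size=(500, 2))   # toy 2-D input distribution

# A 1-D map of 10 units; weights start near the center of the input space.
codebook = rng.uniform(0.4, 0.6, size=(10, 2))
idx = np.arange(10)
for t, x in enumerate(data):
    lr = 0.5 * (1 - t / len(data))                 # decaying learning rate
    sigma = 3.0 * (1 - t / len(data)) + 0.5        # shrinking neighborhood
    bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))  # best-matching unit
    h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))     # neighborhood kernel
    codebook += lr * h[:, None] * (x - codebook)
```

    Probabilistic SOMs replace this heuristic winner-take-most update with a density model on the input space, typically fitted by expectation maximization or stochastic approximation as the review describes.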

  20. Oropharyngeal dysphagia in myotonic dystrophy type 1: a systematic review.

    PubMed

    Pilz, Walmari; Baijens, Laura W J; Kremer, Bernd

    2014-06-01

    A systematic review was conducted to investigate the pathophysiology of and diagnostic procedures for oropharyngeal dysphagia in myotonic dystrophy (MD). The electronic databases Embase, PubMed, and The Cochrane Library were used. The search was limited to English, Dutch, French, German, Spanish, and Portuguese publications. Sixteen studies met the inclusion criteria. Two independent reviewers assessed the methodological quality of the included articles. Swallowing assessment tools, the corresponding protocols, the studies' outcome measurements, and main findings are summarized and presented. The body of literature on pathophysiology of swallowing in dysphagic patients with MD type 1 remains scant. The included studies are heterogeneous with respect to design and outcome measures and hence are not directly comparable. More importantly, most studies had methodological problems. These are discussed in detail and recommendations for further research on diagnostic examinations for swallowing disorders in patients with MD type 1 are provided.

  1. Training effectiveness assessment: Methodological problems and issues

    NASA Technical Reports Server (NTRS)

    Cross, Kenneth D.

    1992-01-01

    The U.S. military uses a large number of simulators to train and sustain the flying skills of helicopter pilots. Despite the enormous resources required to purchase, maintain, and use those simulators, little effort has been expended in assessing their training effectiveness. One reason for this is the lack of an evaluation methodology that yields comprehensive and valid data at a practical cost. Some of the methodological problems and issues that arise in assessing simulator training effectiveness, as well as problems with the classical transfer-of-learning paradigm, are discussed.

  2. A systematic review of suicide prevention interventions targeting indigenous peoples in Australia, United States, Canada and New Zealand

    PubMed Central

    2013-01-01

    Background Indigenous peoples of Australia, Canada, United States and New Zealand experience disproportionately high rates of suicide. As such, the methodological quality of evaluations of suicide prevention interventions targeting these Indigenous populations should be rigorously examined, in order to determine the extent to which they are effective for reducing rates of Indigenous suicide and suicidal behaviours. This systematic review aims to: 1) identify published evaluations of suicide prevention interventions targeting Indigenous peoples in Australia, Canada, United States and New Zealand; 2) critique their methodological quality; and 3) describe their main characteristics. Methods A systematic search of 17 electronic databases and 13 websites for the period 1981–2012 (inclusive) was undertaken. The reference lists of reviews of suicide prevention interventions were hand-searched for additional relevant studies not identified by the electronic and web search. The methodological quality of evaluations of suicide prevention interventions was assessed using a standardised assessment tool. Results Nine evaluations of suicide prevention interventions were identified: five targeting Native Americans; three targeting Aboriginal Australians; and one targeting First Nations Canadians. The main intervention strategies employed included: Community Prevention, Gatekeeper Training, and Education. Only three of the nine evaluations measured changes in rates of suicide or suicidal behaviour, all of which reported significant improvements. The methodological quality of evaluations was variable. Particular problems included weak study designs, reliance on self-report measures, highly variable consent and follow-up rates, and the absence of economic or cost analyses.
Conclusions There is an urgent need for an increase in the number of evaluations of preventive interventions targeting reductions in Indigenous suicide using methodologically rigorous study designs across geographically and culturally diverse Indigenous populations. Combining and tailoring best evidence and culturally-specific individual strategies into one coherent suicide prevention program for delivery to whole Indigenous communities and/or population groups at high risk of suicide offers considerable promise. PMID:23663493

  3. A systematic review of suicide prevention interventions targeting indigenous peoples in Australia, United States, Canada and New Zealand.

    PubMed

    Clifford, Anton C; Doran, Christopher M; Tsey, Komla

    2013-05-13

    Indigenous peoples of Australia, Canada, United States and New Zealand experience disproportionately high rates of suicide. As such, the methodological quality of evaluations of suicide prevention interventions targeting these Indigenous populations should be rigorously examined, in order to determine the extent to which they are effective for reducing rates of Indigenous suicide and suicidal behaviours. This systematic review aims to: 1) identify published evaluations of suicide prevention interventions targeting Indigenous peoples in Australia, Canada, United States and New Zealand; 2) critique their methodological quality; and 3) describe their main characteristics. A systematic search of 17 electronic databases and 13 websites for the period 1981-2012 (inclusive) was undertaken. The reference lists of reviews of suicide prevention interventions were hand-searched for additional relevant studies not identified by the electronic and web search. The methodological quality of evaluations of suicide prevention interventions was assessed using a standardised assessment tool. Nine evaluations of suicide prevention interventions were identified: five targeting Native Americans; three targeting Aboriginal Australians; and one targeting First Nations Canadians. The main intervention strategies employed included: Community Prevention, Gatekeeper Training, and Education. Only three of the nine evaluations measured changes in rates of suicide or suicidal behaviour, all of which reported significant improvements. The methodological quality of evaluations was variable. Particular problems included weak study designs, reliance on self-report measures, highly variable consent and follow-up rates, and the absence of economic or cost analyses.
There is an urgent need for an increase in the number of evaluations of preventive interventions targeting reductions in Indigenous suicide using methodologically rigorous study designs across geographically and culturally diverse Indigenous populations. Combining and tailoring best evidence and culturally-specific individual strategies into one coherent suicide prevention program for delivery to whole Indigenous communities and/or population groups at high risk of suicide offers considerable promise.

  4. Ethical and Legal Implications of the Methodological Crisis in Neuroimaging.

    PubMed

    Kellmeyer, Philipp

    2017-10-01

    Currently, many scientific fields such as psychology or biomedicine face a methodological crisis concerning the reproducibility, replicability, and validity of their research. In neuroimaging, similar methodological concerns have taken hold of the field, and researchers are working frantically toward finding solutions for the methodological problems specific to neuroimaging. This article examines some ethical and legal implications of this methodological crisis in neuroimaging. With respect to ethical challenges, the article discusses the impact of flawed methods in neuroimaging research in cognitive and clinical neuroscience, particularly with respect to faulty brain-based models of human cognition, behavior, and personality. Specifically examined is whether such faulty models, when they are applied to neurological or psychiatric diseases, could put patients at risk, and whether this places special obligations on researchers using neuroimaging. In the legal domain, the actual use of neuroimaging as evidence in United States courtrooms is surveyed, followed by an examination of ways that the methodological problems may create challenges for the criminal justice system. Finally, the article reviews and promotes some promising ideas and initiatives from within the neuroimaging community for addressing the methodological problems.

  5. An LMI approach for the Integral Sliding Mode and H∞ State Feedback Control Problem

    NASA Astrophysics Data System (ADS)

    Bezzaoucha, Souad; Henry, David

    2015-11-01

    This paper deals with the state feedback control problem for linear uncertain systems subject to both matched and unmatched perturbations. The proposed control law is based on the Integral Sliding Mode Control (ISMC) approach to tackle matched perturbations, as well as the H∞ paradigm for robustness against unmatched perturbations. The proposed method parallels the work presented in [1], which addressed the same problem and proposed a solution involving an Algebraic Riccati Equation (ARE)-based formulation. The contribution of this paper is the establishment of a Linear Matrix Inequality (LMI)-based solution, which offers the possibility to consider other types of constraints such as 𝓓-stability constraints (pole assignment-like constraints). The proposed methodology is applied to a pilot three-tank system, and experimental results illustrate its feasibility. Note that real experiments have rarely been considered with SMC in the past, due to the highly energetic behaviour of the control signal. It is important to outline that the paper does not aim at proposing an LMI formulation of an ARE. This has been done since 1971 [2] and is further discussed in [3], where the link between AREs and ARIs (algebraic Riccati inequalities) is established for the H∞ control problem. The main contribution of this paper is to establish an adequate LMI-based methodology (changes of matrix variables) so that the ARE corresponding to the particular structure of the mixed ISMC/H∞ scheme proposed by [1] can be reformulated within the LMI paradigm.
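    As a minimal sketch of the ARE side of such syntheses, the following solves a continuous-time ARE for a toy double-integrator plant with illustrative weights using SciPy. This is a plain LQR-type Riccati solve, not the mixed ISMC/H∞ design of the paper:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy double-integrator plant; weights Q, R are illustrative choices.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weight
R = np.array([[1.0]])  # input weight

# Solves A'P + P A - P B R^{-1} B' P + Q = 0 for the stabilizing P
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)    # state-feedback gain, u = -K x
closed_loop = A - B @ K
eigs = np.linalg.eigvals(closed_loop)
```

    The LMI route discussed in the abstract would instead pose a matrix-inequality feasibility problem for P, which makes it possible to append extra convex constraints (such as pole-region, i.e. 𝓓-stability, constraints) that an ARE solver cannot accommodate.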

  6. Virtual-pulse time integral methodology: A new explicit approach for computational dynamics - Theoretical developments for general nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives and a new methodology of development, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
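    As a point of reference for the explicit-versus-implicit comparison, here is a minimal explicit central-difference (velocity-Verlet) integration of a hardening spring model. The parameters are illustrative, and this is a standard explicit scheme, not the virtual-pulse method itself:

```python
import numpy as np

# Hardening spring: m*u'' + k*u + k3*u^3 = 0 (undamped, unforced)
m, k, k3 = 1.0, 1.0, 0.5
dt, steps = 0.01, 2000

u, v = 1.0, 0.0                       # initial displacement and velocity
a = -(k * u + k3 * u ** 3) / m        # initial acceleration
for _ in range(steps):
    # Explicit central-difference update in velocity-Verlet form
    v_half = v + 0.5 * dt * a
    u = u + dt * v_half
    a = -(k * u + k3 * u ** 3) / m
    v = v_half + 0.5 * dt * a

# Total energy should stay near its initial value 0.625 for this scheme
energy = 0.5 * m * v ** 2 + 0.5 * k * u ** 2 + 0.25 * k3 * u ** 4
```

    The implicit trapezoidal (Newmark) rule used as the comparison baseline in the paper requires a nonlinear solve per step for such models, which is the cost an explicit methodology like the virtual-pulse approach avoids.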

  7. Aeroacoustics of Flight Vehicles: Theory and Practice. Volume 1: Noise Sources

    NASA Technical Reports Server (NTRS)

    Hubbard, Harvey H. (Editor)

    1991-01-01

    Methodology recommended to evaluate aeroacoustic related problems is provided, and approaches to their solutions are suggested without extensive tables, nomographs, and derivations. Orientation is toward flight vehicles and emphasis is on underlying physical concepts. Theoretical, experimental, and applied aspects are covered, including the main formulations and comparisons of theory and experiment. The topics covered include: propeller and propfan noise, rotor noise, turbomachinery noise, jet noise classical theory and experiments, noise from turbulent shear flows, jet noise generated by large-scale coherent motion, airframe noise, propulsive lift noise, combustion and core noise, and sonic booms.

  8. Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence

    PubMed Central

    Han, Paul K. J.

    2014-01-01

    The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891

  9. [Methodological problems in the use of information technologies in physical education].

    PubMed

    Martirosov, E G; Zaĭtseva, G A

    2000-01-01

    The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.

  10. 2D Inversion of Transient Electromagnetic Method (TEM)

    NASA Astrophysics Data System (ADS)

    Bortolozo, Cassiano Antonio; Luís Porsani, Jorge; Acácio Monteiro dos Santos, Fernando

    2017-04-01

    A new methodology was developed for 2D inversion of the Transient Electromagnetic Method (TEM). The methodology consists of a set of routines in Matlab code for modeling and inversion of TEM data and the determination of the most efficient field array for the problem. In this research, the 2D TEM modeling uses a finite-difference discretization. To solve the inverse problem, an algorithm based on the Marquardt technique, also known as ridge regression, was applied. The algorithm is stable and efficient, and it is widely used in geoelectrical inversion problems. The main advantage of a 1D survey is rapid data acquisition over a large area, but in regions with two-dimensional structures, or where more detail is needed, it is essential to use two-dimensional interpretation methodologies. For efficient field acquisition we used the fixed-loop array in an innovative form, with a square transmitter loop (200m x 200m) and 25m spacing between the sounding points. The TEM soundings were conducted only inside the transmitter loop, in order to avoid negative apparent resistivity values. Although it is possible to model the negative values, they make the inversion convergence more difficult. The methodology described above was therefore developed to achieve maximum optimization of data acquisition, since only one transmitter loop layout on the surface is needed for each series of soundings inside the loop. The algorithms were tested with synthetic data, and the results were essential to the interpretation of the real data and will be useful in future situations. A 2D TEM inversion of real data acquired over the Paraná Sedimentary Basin (PSB) was successfully realized. The results indicate a robust geoelectrical characterization of the sedimentary and crystalline aquifers in the PSB. 
Therefore, using a new and relevant approach for 2D TEM inversion, this research effectively contributed to mapping the most promising regions for groundwater exploration. In addition, new geophysical software was developed that can be applied as an important tool for many geological/hydrogeological applications and educational purposes.
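    The Marquardt (ridge-regression) iteration at the core of such inversions can be sketched on a toy exponential-decay model standing in for the TEM forward problem; the model, damping value and parameters below are hypothetical:

```python
import numpy as np

# Toy nonlinear model standing in for the TEM forward response:
# d(t) = p0 * exp(-p1 * t); recover p from noiseless "observations".
t = np.linspace(0.1, 2.0, 20)
p_true = np.array([2.0, 1.5])
d_obs = p_true[0] * np.exp(-p_true[1] * t)

def forward(p):
    return p[0] * np.exp(-p[1] * t)

def jacobian(p):
    e = np.exp(-p[1] * t)
    return np.column_stack([e, -p[0] * t * e])

p = np.array([1.0, 1.0])        # starting model
lam = 1e-2                      # Marquardt damping (the "ridge" term)
for _ in range(50):
    r = d_obs - forward(p)
    J = jacobian(p)
    # Damped Gauss-Newton step: (J'J + lam*I) dp = J'r
    dp = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)
    p = p + dp

misfit = float(np.linalg.norm(d_obs - forward(p)))
```

    The damping term stabilizes the normal equations when the Jacobian is poorly conditioned, which is why the technique is the workhorse of geoelectrical inversion; a full 2D TEM inversion replaces the toy forward model with the finite-difference solver.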

  11. Comparison of algebraic and analytical approaches to the formulation of the statistical model-based reconstruction problem for X-ray computed tomography.

    PubMed

    Cierniak, Robert; Lorent, Anna

    2016-09-01

    The main aim of this paper is to investigate the conditioning-related properties of our originally formulated statistical model-based iterative approach to the problem of image reconstruction from projections and, in this manner, to demonstrate the superiority of this approach over those recently used by other authors. The reconstruction algorithm based on this conception uses maximum likelihood estimation with an objective adjusted to the probability distribution of measured signals obtained from an X-ray computed tomography system with parallel beam geometry. The analysis and experimental results presented here show that our analytical approach outperforms the referential algebraic methodology, which is explored widely in the literature and exploited in various commercial implementations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Helping the decision maker effectively promote various experts' views into various optimal solutions to China's institutional problem of health care provider selection through the organization of a pilot health care provider research system.

    PubMed

    Tang, Liyang

    2013-04-04

    The main aim of China's Health Care System Reform was to help the decision maker find the optimal solution to China's institutional problem of health care provider selection. A pilot health care provider research system was recently organized in China's health care system, which could efficiently collect data from various experts for determining the optimal solution to this problem. The purpose of this study was therefore to apply an optimal implementation methodology to help the decision maker effectively promote various experts' views into various optimal solutions to this problem with the support of this pilot system. After the general framework of China's institutional problem of health care provider selection was established, this study collaborated with the National Bureau of Statistics of China to commission a large-scale 2009 to 2010 national expert survey (n = 3,914) through the organization of a pilot health care provider research system, for the first time in China, and the analytic network process (ANP) implementation methodology was adopted to analyze the dataset from this survey. The market-oriented health care provider approach was the optimal solution from the doctors' point of view; the traditional government regulation-oriented approach was the optimal solution from the points of view of pharmacists, hospital administrators, and health officials in health administration departments; and the public-private partnership (PPP) approach was the optimal solution from the points of view of nurses, officials in medical insurance agencies, and health care researchers. 
The data collected through the pilot health care provider research system in the 2009 to 2010 national expert survey could thus help the decision maker effectively promote various experts' views into various optimal solutions to China's institutional problem of health care provider selection.

  13. Reconstruction and restoration of historical buildings of transport infrastructure

    NASA Astrophysics Data System (ADS)

    Kareeva, Daria; Glazkova, Valeriya

    2017-10-01

The aim of this article is to identify the main problems in the restoration of historical objects. To this end, it is rational to collect and analyze the existing world experience of restoration. The information gathered shows that some problems are common and can be solved. In addition, the committees for the protection of Monuments of Culture and Architecture always complicate the restoration and reconstruction of historical buildings. The examples of Germany, Italy and Russia show that there are problems in organization, economy, planning and control. Engineers should devise and justify a methodology for organizing and monitoring the restoration of historical buildings. As a second solution, time and financial costs can be minimized through a favorable financial and legal environment for investors and through the creation of a system for organizing restoration work. Finally, to speed up the restoration process, simulation programs should be optimized for the research and selection of technological and economic reconstruction methods.

  14. Optimizing Multi-Product Multi-Constraint Inventory Control Systems with Stochastic Replenishments

    NASA Astrophysics Data System (ADS)

    Allah Taleizadeh, Ata; Aryanezhad, Mir-Bahador; Niaki, Seyed Taghi Akhavan

Multi-periodic inventory control problems are mainly studied under one of two assumptions. The first is continuous review, where, depending on the inventory level, orders can be placed at any time; the second is periodic review, where orders can only be placed at the beginning of each period. In this study, we relax these assumptions and assume that the periodic replenishments are stochastic in nature. Furthermore, we assume that the periods between two replenishments are independent and identically distributed random variables. For the problem at hand, the decision variables are integer-valued, and there are two kinds of constraints for each product: space and service level. We develop a model of the problem in which a combination of back-orders and lost sales is considered for the shortages. We then show that the model is of the integer-nonlinear-programming type, so a search algorithm is required to solve it. We employ a simulated annealing approach and provide a numerical example to demonstrate the applicability of the proposed methodology.
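A simulated annealing search of the kind the record describes can be sketched as follows. The two-product instance below, with its demand figures, space coefficients, and penalty weight, is a hypothetical illustration and not data or a model from the study; constrained shortages are handled here with a simple penalty term.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=100.0, cooling=0.95,
                        iters=2000, seed=0):
    """Generic simulated annealing minimizer over integer solutions."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Hypothetical 2-product instance: integer order quantities, holding and
# shortage costs, and a warehouse-space constraint handled as a penalty.
def cost(q):
    demand = (12, 8)                         # assumed mean demands
    space = 2 * q[0] + 3 * q[1]              # assumed space usage per unit
    penalty = 1000 * max(0, space - 60)      # space-constraint violation
    shortage = sum(max(0, d - qi) for d, qi in zip(demand, q))
    holding = sum(max(0, qi - d) for d, qi in zip(demand, q))
    return holding + 5 * shortage + penalty

def neighbor(q, rng):
    """Perturb one product's order quantity by +/-1, staying non-negative."""
    i = rng.randrange(len(q))
    out = list(q)
    out[i] = max(0, out[i] + rng.choice((-1, 1)))
    return tuple(out)

best, fbest = simulated_annealing(cost, neighbor, (0, 0))
```

The neighbor move and cooling schedule are the usual tuning knobs; a real instance would evaluate the cost by simulating the stochastic replenishment process rather than with the deterministic stand-in above.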

  15. Conjugate gradient based projection - A new explicit methodology for frictional contact

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

With special attention to applicability to parallel computation and vectorization, a new and effective explicit approach involving a conjugate gradient based projection methodology is proposed in this study for linear complementarity formulations of contact problems with Coulomb friction. The overall objective is to provide an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementarity formulations stems from an established search direction which is projected onto a feasible region determined by the non-negativity constraint; this direction is then applied within the Fletcher-Reeves conjugate gradient method. The result is a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics and fast computational speed, and is relatively simple to implement for contact problems involving Coulomb friction.
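A minimal sketch of a projection-plus-Fletcher-Reeves scheme in this spirit, applied to a bound-constrained quadratic as a stand-in for the linear complementarity formulation; the test matrix, restart rule, and stopping test are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def projected_fletcher_reeves(A, b, x0, tol=1e-8, max_iter=200):
    """Minimize 0.5*x^T A x - b^T x subject to x >= 0 using Fletcher-Reeves
    conjugate gradient directions, projecting each iterate onto the feasible
    region and restarting whenever the projection becomes active."""
    x = np.maximum(x0, 0.0)
    g = A @ x - b                            # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        # Projected-gradient optimality check (KKT for the bound constraint).
        if np.linalg.norm(np.where((x <= 0.0) & (g > 0.0), 0.0, g)) < tol:
            break
        denom = d @ A @ d
        if denom <= 0.0:
            d, denom = -g, g @ A @ g         # fall back to steepest descent
        alpha = -(g @ d) / denom             # exact line search for a quadratic
        step = x + alpha * d
        x_new = np.maximum(step, 0.0)        # project onto x >= 0
        g_new = A @ x_new - b
        if np.any(step < 0.0):
            d = -g_new                       # active set changed: restart
        else:
            beta = (g_new @ g_new) / (g @ g) # Fletcher-Reeves coefficient
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative 2x2 symmetric positive-definite system (not from the paper).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = projected_fletcher_reeves(A, b, np.zeros(2))
```

When no bound is active the iteration reduces to plain conjugate gradients, which is what makes the approach attractive for explicit, vectorizable computation.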

  16. The Impact of Various Parental Mental Disorders on Children's Diagnoses: A Systematic Review.

    PubMed

    van Santvoort, Floor; Hosman, Clemens M H; Janssens, Jan M A M; van Doesum, Karin T M; Reupert, Andrea; van Loon, Linda M A

    2015-12-01

    Children of mentally ill parents are at high risk of developing problems themselves. They are often identified and approached as a homogeneous group, despite diversity in parental diagnoses. Some studies demonstrate evidence for transgenerational equifinality (children of parents with various disorders are at risk of similar problems) and multifinality (children are at risk of a broad spectrum of problems). At the same time, other studies indicate transgenerational specificity (child problems are specifically related to the parent's diagnosis) and concordance (children are mainly at risk of the same disorder as their parent). Better insight into the similarities and differences between children of parents with various mental disorders is needed and may inform the development and evaluation of future preventive interventions for children and their families. Accordingly, we systematically compared 76 studies on diagnoses in children of parents with the most prevalent axis I disorders: unipolar depression, bipolar disorder, and anxiety disorders. Methodological characteristics of the studies were compared, and outcomes were analyzed for the presence of transgenerational equifinality, multifinality, specificity, and concordance. Also, the strengths of the relationships between child and parent diagnoses were investigated. This review showed that multifinality and equifinality appear to be more of a characteristic of children of unipolar and bipolar parents than of children of anxious parents, whose risk is mainly restricted to developing anxiety disorders. For all children, risk transmission is assumed to be partly specific since the studies indicate a strong tendency for children to develop the same disorder as their parent.

  17. Spinal Cord Injury-Induced Dysautonomia via Plasticity in Paravertebral Sympathetic Postganglionic

    DTIC Science & Technology

    2017-10-01

...their near anatomical inaccessibility. We have solved the accessibility problem with a strategic methodological advance. We will determine the extent to which paravertebral...

  18. Human Prenatal Effects: Methodological Problems and Some Suggested Solutions

    ERIC Educational Resources Information Center

    Copans, Stuart A.

    1974-01-01

Briefly reviews the relevant literature on human prenatal effects, describes some possible designs for such studies, and discusses some of the methodological problem areas: sample choice, measurement of prenatal variables, monitoring of labor and delivery, and neonatal assessment. (CS)

  19. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
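As context for the record above, a plain global-best PSO (the building block, without the consensus and Trust-Tech stages of the proposed methodology) can be sketched as follows; the parameter values and the sphere test function are illustrative assumptions:

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, seed=0,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Global-best particle swarm optimization (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()          # global best
    g_val = pbest_val.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + cognitive pull toward pbest + social pull toward gbest.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        if pbest_val.min() < g_val:
            g_val = pbest_val.min()
            g = pbest[pbest_val.argmin()].copy()
    return g, g_val

sphere = lambda z: float(np.sum(z * z))
best, best_val = pso(sphere, dim=5)
```

The methodology in the record replaces the single global best with consensus dynamics and hands the swarm's candidate regions to Trust-Tech and local solvers; the sketch shows only the swarm update itself.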

  20. Hybrid Fourier pseudospectral/discontinuous Galerkin time-domain method for wave propagation

    NASA Astrophysics Data System (ADS)

    Pagán Muñoz, Raúl; Hornikx, Maarten

    2017-11-01

    The Fourier Pseudospectral time-domain (Fourier PSTD) method was shown to be an efficient way of modelling acoustic propagation problems as described by the linearized Euler equations (LEE), but is limited to real-valued frequency independent boundary conditions and predominantly staircase-like boundary shapes. This paper presents a hybrid approach to solve the LEE, coupling Fourier PSTD with a nodal Discontinuous Galerkin (DG) method. DG exhibits almost no restrictions with respect to geometrical complexity or boundary conditions. The aim of this novel method is to allow the computation of complex geometries and to be a step towards the implementation of frequency dependent boundary conditions by using the benefits of DG at the boundaries, while keeping the efficient Fourier PSTD in the bulk of the domain. The hybridization approach is based on conformal meshes to avoid spatial interpolation of the DG solutions when transferring values from DG to Fourier PSTD, while the data transfer from Fourier PSTD to DG is done utilizing spectral interpolation of the Fourier PSTD solutions. The accuracy of the hybrid approach is presented for one- and two-dimensional acoustic problems and the main sources of error are investigated. It is concluded that the hybrid methodology does not introduce significant errors compared to the Fourier PSTD stand-alone solver. An example of a cylinder scattering problem is presented and accurate results have been obtained when using the proposed approach. Finally, no instabilities were found during long-time calculation using the current hybrid methodology on a two-dimensional domain.

  1. [A problem-posing educational methodology for the prevention of HIV / AIDS].

    PubMed

Magana, J R; Ferreira-Pinto, J B; Blair, M; Mata, A

    1992-01-01

The problem-posing methodology of Brazilian educator Paulo Freire, using the reading-circle approach previously deployed in successful literacy campaigns in developing countries, is introduced for application in AIDS information programs. The basis of this educational process is dialogue, through which those to be educated resolve their problems by evaluating information critically, capturing concepts by codification and decodification, and transmitting information by creating relevant educational materials. Health circles are organized with women as educators to impart knowledge about AIDS and HIV: definitions, epidemiological components (sex, age, and risk behavior), means of transmission, stages of the progression of AIDS, prevention of HIV infection, and tests for detecting HIV antibodies. The dialogue explores knowledge and feelings about AIDS and how it affects life in the community, reveals personal experiences and accounts of knowing someone who was HIV-positive, and develops action plans to minimize AIDS cases in the community. The Latin population of California, mainly of Mexican origin, with low levels of education, income, and acculturation and a high incidence of AIDS, is an appropriate target for such an intervention. In 1980, there were 12.3 million people of Hispanic origin in the US. In August 1990, there were 143,280 persons diagnosed with AIDS according to the Centers for Disease Control; 78,878 of these (55%) were Anglos, and 21,752 (15%) were Hispanics. Among Anglos the incidence was 300 per million inhabitants, while among Hispanics it was 1,059 per million, a more than 3-fold higher rate.

  2. Conceptual design and multidisciplinary optimization of in-plane morphing wing structures

    NASA Astrophysics Data System (ADS)

    Inoyama, Daisaku; Sanders, Brian P.; Joo, James J.

    2006-03-01

In this paper, the topology optimization methodology for the synthesis of a distributed actuation system, with specific application to the morphing air vehicle, is discussed. The main emphasis is placed on the topology optimization problem formulations and the development of computational modeling concepts. For demonstration purposes, the in-plane morphing wing model is presented. The analysis model is developed to meet several important criteria: it must allow large rigid-body displacements, as well as variation in planform area, with minimum strain on structural members, while retaining acceptable numerical stability for finite element analysis. Preliminary work has indicated that the proposed modeling concept meets these criteria and may be suitable for the purpose. Topology optimization is performed on the ground structure based on this modeling concept, with design variables that control the system configuration. In other words, the states of each element in the model are design variables, to be determined through the optimization process. In effect, the optimization process assigns morphing members as 'soft' elements, non-morphing load-bearing members as 'stiff' elements, and non-existent members as 'voids.' In addition, the optimization process determines the location and relative force intensities of the distributed actuators, which are represented computationally as equal and opposite nodal forces with soft axial stiffness. Several different optimization problem formulations are investigated to understand their potential benefits in solution quality, as well as the meaningfulness of the formulations themselves. Sample in-plane morphing problems are solved to demonstrate the potential capability of the methodology introduced in this paper.

  3. The implementation of problem-based learning in health service management training programs.

    PubMed

    Stankunas, Mindaugas; Czabanowska, Katarzyna; Avery, Mark; Kalediene, Ramune; Babich, Suzanne Marie

    2016-10-03

    Purpose Strengthening management capacity within the health care sector could have a significant impact on population health. However, many training programs in this area are still delivered using a classic lecture-based approach. The purpose of this paper is to evaluate and better understand the feasibility of using a problem-based learning (PBL) approach in health services management training programs. Design/methodology/approach A PBL teaching approach (based on the Maastricht University model) was tested with second-year postgraduate students from the Master in Public Health Management program at the Lithuanian University of Health Sciences. Students' opinions about PBL were investigated using a questionnaire with eight open-ended questions. Thematic content analysis was chosen to reflect the search for patterns across the data. Findings Respondents stated that the main advantage of PBL was that it was a more interesting and effective way of learning: "It is easier to remember, when you study by yourself and discuss with all peers". In addition, it was mentioned that PBL initiated a rapid exchange of ideas and sharing of personal experience. Students stressed that PBL was a good tool for developing other skills as well, such as "public speaking, communication, logic thinking". All students recommended delivering all other courses in the health services management program using PBL methodologies. Originality/value Findings from our study suggest that PBL may be an effective approach to teaching health services management. Potential problems in implementation are noted.

  4. Image Restoration in Cryo-electron Microscopy

    PubMed Central

    Penczek, Pawel A.

    2011-01-01

Image restoration techniques are used to obtain, given experimental measurements, the best possible approximation of the original object within the limits imposed by instrumental conditions and noise level in the data. In molecular electron microscopy, we are mainly interested in linear methods that preserve the respective relationships between mass densities within the restored map. Here, we describe the methodology of image restoration in structural electron microscopy, and more specifically, we will focus on the problem of the optimum recovery of Fourier amplitudes given electron microscope data collected under various defocus settings. We discuss in detail two classes of commonly used linear methods, the first of which consists of methods based on pseudoinverse restoration, and which is further subdivided into mean-square error, chi-square error, and constraint-based restorations, where the methods in the latter two subclasses explicitly incorporate the non-white distribution of noise in the data. The second class of methods is based on the Wiener filtration approach. We show that the Wiener filter-based methodology can be used to obtain a solution to the problem of amplitude correction (or “sharpening”) of the electron microscopy map that makes it visually comparable to maps determined by X-ray crystallography, and thus amenable to comparable interpretation. Finally, we present a semi-heuristic Wiener filter-based solution to the problem of image restoration given sets of heterogeneous solutions. We conclude the chapter with a discussion of image restoration protocols implemented in commonly used single particle software packages. PMID:20888957
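The Wiener filtration idea discussed in the record can be illustrated in one dimension; the kernel, signal, and SNR value below are hypothetical, and this sketch deliberately omits the defocus-dependent CTF specifics of the electron microscopy setting:

```python
import numpy as np

def wiener_restore(observed, h, snr):
    """Apply the Wiener filter W = conj(H) / (|H|^2 + 1/SNR) in the
    frequency domain to undo a known circular blur with kernel h."""
    H = np.fft.fft(h, n=len(observed))
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(np.fft.fft(observed) * W))

# Demo: blur a box signal with a symmetric 3-tap low-pass kernel, then restore.
n = 64
x = np.zeros(n)
x[20:30] = 1.0
h = np.zeros(n)
h[0], h[1], h[-1] = 0.5, 0.25, 0.25        # circular smoothing kernel
observed = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))
restored = wiener_restore(observed, h, snr=100.0)
```

The SNR term keeps the filter bounded where the transfer function is small, which is exactly the property that makes Wiener-style amplitude correction stable where a naive inverse filter would amplify noise without limit.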

  5. Overcoming an obstacle in expanding a UMLS semantic type extent.

    PubMed

    Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James

    2012-02-01

    This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Overcoming an Obstacle in Expanding a UMLS Semantic Type Extent

    PubMed Central

    Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James

    2011-01-01

    This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. PMID:21925287

  7. A Machine Learning Approach to the Detection of Pilot's Reaction to Unexpected Events Based on EEG Signals

    PubMed Central

    Cyran, Krzysztof A.

    2018-01-01

    This work considers the problem of utilizing electroencephalographic signals for use in systems designed for monitoring and enhancing the performance of aircraft pilots. Systems with such capabilities are generally referred to as cognitive cockpits. This article provides a description of the potential that is carried by such systems, especially in terms of increasing flight safety. Additionally, a neuropsychological background of the problem is presented. Conducted research was focused mainly on the problem of discrimination between states of brain activity related to idle but focused anticipation of visual cue and reaction to it. Especially, a problem of selecting a proper classification algorithm for such problems is being examined. For that purpose an experiment involving 10 subjects was planned and conducted. Experimental electroencephalographic data was acquired using an Emotiv EPOC+ headset. Proposed methodology involved use of a popular method in biomedical signal processing, the Common Spatial Pattern, extraction of bandpower features, and an extensive test of different classification algorithms, such as Linear Discriminant Analysis, k-nearest neighbors, and Support Vector Machines with linear and radial basis function kernels, Random Forests, and Artificial Neural Networks. PMID:29849544
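The Common Spatial Pattern plus log-variance (bandpower-like) pipeline named in the record can be sketched on synthetic two-channel data; the nearest-centroid classifier below stands in for the LDA/k-NN/SVM comparison of the study, and all signal parameters are invented for illustration:

```python
import numpy as np

def csp_filters(trials_a, trials_b):
    """Common Spatial Patterns: spatial filters that maximize the variance
    ratio between two classes. trials_*: lists of (channels, samples) arrays."""
    cov = lambda trials: np.mean(
        [t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
    Ca, Cb = cov(trials_a), cov(trials_b)
    # Whiten the composite covariance, then diagonalize class A in that space.
    evals, evecs = np.linalg.eigh(Ca + Cb)
    P = np.diag(evals ** -0.5) @ evecs.T
    s_evals, B = np.linalg.eigh(P @ Ca @ P.T)
    order = np.argsort(s_evals)[::-1]       # most class-A-discriminative first
    return B[:, order].T @ P                # rows are spatial filters

def log_var_features(W, trial):
    """Normalized log-variance (bandpower-like) features of filtered signals."""
    var = (W @ trial).var(axis=1)
    return np.log(var / var.sum())

# Synthetic 2-channel demo: class A is strong on channel 0, class B on channel 1.
rng = np.random.default_rng(0)
make = lambda s0, s1, n: [np.vstack([s0 * rng.standard_normal(200),
                                     s1 * rng.standard_normal(200)])
                          for _ in range(n)]
train_a, train_b = make(3.0, 1.0, 20), make(1.0, 3.0, 20)
W = csp_filters(train_a, train_b)
ca = np.mean([log_var_features(W, t) for t in train_a], axis=0)
cb = np.mean([log_var_features(W, t) for t in train_b], axis=0)

def classify(trial):
    """Nearest-centroid decision in CSP log-variance feature space."""
    f = log_var_features(W, trial)
    return "A" if np.linalg.norm(f - ca) < np.linalg.norm(f - cb) else "B"

test_a, test_b = make(3.0, 1.0, 10), make(1.0, 3.0, 10)
acc = (sum(classify(t) == "A" for t in test_a)
       + sum(classify(t) == "B" for t in test_b)) / 20
```

In a real EEG setting the trials would first be band-pass filtered into the rhythm of interest, and the classifier choice is precisely the question the study investigates.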

  8. Drug exposure in pregnant women.

    PubMed

    Czeizel, A E

    2004-01-01

    The objectives of this paper are to describe the Hungarian case-control surveillance system of congenital abnormalities (HCCSCA), to summarize the principles of this activity and our main experiences. Among the main principles, the importance of the time factor (the first trimester concept is outdated), the differentiation of isolated and multiple manifestations of the seemingly same congenital abnormalities, noxa specificity, the separation of drugs and pregnancy supplements within medicinal products (or medicines) are stressed. After some methodological problems (recall bias, chance effect), the main experiences regarding the risk and benefit of medicines are summarized. The conclusion is that the results of our studies based on the data set of the HCCSCA showed that at present the exaggerated teratogenic risk of drugs is much more harmful for the fetus than the real teratogenic effect of some drugs themselves. Medical doctors and other experts therefore need more education to know the principles and findings of modern human teratology because it may help us to have a better balance between the risk and benefit of drug use during pregnancy.

  9. E-therapy for mental health problems: a systematic review.

    PubMed

    Postel, Marloes G; de Haan, Hein A; De Jong, Cor A J

    2008-09-01

    The widespread availability of the Internet offers opportunities for improving access to therapy for people with mental health problems. There is a seemingly infinite supply of Internet-based interventions available on the World Wide Web. The aim of the present study is to systematically assess the methodological quality of randomized controlled trials (RCTs) concerning e-therapy for mental health problems. Two reviewers independently assessed the methodological quality of the RCTs, based on a list of criteria for the methodological quality assessment as recommended by the Cochrane Back Review Group. The search yielded 14 papers that reported RCTs concerning e-therapy for mental-health problems. The methodological quality of studies included in this review was generally low. It is concluded that e-therapy may turn out to be an appropriate therapeutic entity, but the evidence needs to be more convincing. Recommendations are made concerning the method of reporting RCTs and the need to add some content items to an e-therapy study.

  10. Evaluation of Heavy Metals in Solid Waste Disposal Sites in Campinas City, Brazil Using Synchrotron Radiation Total Reflection X-Ray Fluorescence

    NASA Astrophysics Data System (ADS)

    de Faria, Bruna Fernanda; Moreira, Silvana

    2011-12-01

    The problem of solid waste in most countries is on the rise as a result of rapid population growth, urbanization, industrial development and changes in consumption habits. Amongst the various forms of waste disposals, landfills are today the most viable for the Brazilian reality, both technically and economically. Proper landfill construction practices allow minimizing the effects of the two main sources of pollution from solid waste: landfill gas and slurry. However, minimizing is not synonymous with eliminating; consequently, the landfill alone cannot resolve all the problems with solid waste disposal. The main goal of this work is to evaluate the content of trace elements in samples of groundwater, surface water and slurry arising from local solid waste disposals in the city of Campinas, SP, Brazil. Samples were collected at the Delta, Santa Barbara and Pirelli landfills. At the Delta and Santa Barbara sites, values above the maximum permitted level established by CETESB for Cr, Mn, Fe, Ni and Pb were observed in samples of groundwater, while at the Pirelli site, elements with concentrations above the permitted levels were Mn, Fe, Ba and Pb. At Delta, values above levels permitted by the CONAMA 357 legislation were still observed in surface water samples for Cr, Mn, Fe and Cu, whereas in slurry samples, values above the permitted levels were observed for Cr, Mn, Fe, Ni, Cu, Zn and Pb. Slurry samples were prepared in accordance with two extraction methodologies, EPA 3050B and EPA 200.8. Concentrations of Cr, Ni, Cu and Pb were higher than the limit established by CONAMA 357 for most samples collected at different periods (dry and rainy) and also for the two extraction methodologies employed.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faria, Bruna Fernanda de; Moreira, Silvana

The problem of solid waste in most countries is on the rise as a result of rapid population growth, urbanization, industrial development and changes in consumption habits. Amongst the various forms of waste disposals, landfills are today the most viable for the Brazilian reality, both technically and economically. Proper landfill construction practices allow minimizing the effects of the two main sources of pollution from solid waste: landfill gas and slurry. However, minimizing is not synonymous with eliminating; consequently, the landfill alone cannot resolve all the problems with solid waste disposal. The main goal of this work is to evaluate the content of trace elements in samples of groundwater, surface water and slurry arising from local solid waste disposals in the city of Campinas, SP, Brazil. Samples were collected at the Delta, Santa Barbara and Pirelli landfills. At the Delta and Santa Barbara sites, values above the maximum permitted level established by CETESB for Cr, Mn, Fe, Ni and Pb were observed in samples of groundwater, while at the Pirelli site, elements with concentrations above the permitted levels were Mn, Fe, Ba and Pb. At Delta, values above levels permitted by the CONAMA 357 legislation were still observed in surface water samples for Cr, Mn, Fe and Cu, whereas in slurry samples, values above the permitted levels were observed for Cr, Mn, Fe, Ni, Cu, Zn and Pb. Slurry samples were prepared in accordance with two extraction methodologies, EPA 3050B and EPA 200.8. Concentrations of Cr, Ni, Cu and Pb were higher than the limit established by CONAMA 357 for most samples collected at different periods (dry and rainy) and also for the two extraction methodologies employed.

  12. CFD methodology and validation for turbomachinery flows

    NASA Astrophysics Data System (ADS)

    Hirsch, Ch.

    1994-05-01

    The essential problem today, in the application of 3D Navier-Stokes simulations to the design and analysis of turbomachinery components, is the validation of the numerical approximation and of the physical models, in particular the turbulence modelling. Although most of the complex 3D flow phenomena occurring in turbomachinery bladings can be captured with relatively coarse meshes, many detailed flow features are dependent on mesh size, on the turbulence and transition models. A brief review of the present state of the art of CFD methodology is given with emphasis on quality and accuracy of numerical approximations related to viscous flow computations. Considerations related to the mesh influence on solution accuracy are stressed. The basic problems of turbulence and transition modelling are discussed next, with a short summary of the main turbulence models and their applications to representative turbomachinery flows. Validations of present turbulence models indicate that none of the available turbulence models is able to predict all the detailed flow behavior in complex flow interactions. In order to identify the phenomena that can be captured on coarser meshes a detailed understanding of the complex 3D flow in compressor and turbines is necessary. Examples of global validations for different flow configurations, representative of compressor and turbine aerodynamics are presented, including secondary and tip clearance flows.

  13. Fitting methods to paradigms: are ergonomics methods fit for systems thinking?

    PubMed

Salmon, Paul M; Walker, Guy H; Read, Gemma J M; Goode, Natassia; Stanton, Neville A

    2017-02-01

    The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.

  14. Soft systems methodology and the ecosystem approach: a system study of the Cooum River and environs in Chennai, India.

    PubMed

    Bunch, Martin J

    2003-02-01

This paper discusses the integration of soft systems methodology (SSM) within an ecosystem approach in research to support rehabilitation and management of the Cooum River and environs in Chennai, India. The Cooum is an extremely polluted urban stream. Its management is complicated by high rates of population growth, poverty, uncontrolled urban development, jurisdictional conflicts, institutional culture, flat topography, tidal action, blockage of the river mouth, and monsoon flooding. The situation is characterized by basic uncertainty about main processes and activities, and the nature of relationships among actors and elements in the system. SSM is an approach for dealing with messy or ill-structured problematic situations involving human activity. In this work SSM contributed techniques (such as "rich picture" and "CATWOE" tools) to description of the Cooum situation as a socioecological system and informed the approach itself at a theoretical level. Application of three general phases in SSM is discussed in the context of the Cooum River research: (1) problem definition and exploration of the problem situation, (2) development of conceptual models of relevant systems, and (3) the use of these to generate insight and stimulate debate about desirable and feasible change. Its use here gives weight to the statement by others that SSM would be a particularly appropriate methodology to operate the ecosystem approach. As well as informing efforts at management of the Cooum system, this work led the way to explore an adaptive ecosystem approach more broadly to management of the urban environment for human health in Chennai.

  15. Increasing accuracy in the assessment of motion sickness: A construct methodology

    NASA Technical Reports Server (NTRS)

    Stout, Cynthia S.; Cowings, Patricia S.

    1993-01-01

    The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods are inadequate when used in the framework of a construct methodology. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that, when applied, will probably resolve some of these problems are described in detail.

  16. Problem Solving in Biology: A Methodology

    ERIC Educational Resources Information Center

    Wisehart, Gary; Mandell, Mark

    2008-01-01

    A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…

  17. SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY

    DTIC Science & Technology

    Section I refers to the possibility of using the theory and methodology of Project Outcomes to problems of strategic information. It is felt that...purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of

  18. [Methodological aspects of the assessment of phytotoxic properties of ice-melter reagents].

    PubMed

    Sbitnev, A V; Vodianova, M A; Kriatov, I A; Donerian, L G; Evseeva, I S; Ushakova, O V; Ushakov, D I; Matveeva, I S; Rodionova, O M

    One of the main criteria determining whether a particular type of ice-melter reagent (IMR) may be used is its degree of safety for the environment and human health, which is reflected in the establishment of safe doses and concentrations. In this regard, a current area of research is the improvement of the ecological and epidemiological principles of risk assessment for modern types of anti-icing agents. Currently available monitoring data on soil and snow studies conducted in various cities of Russia show an ongoing accumulation of the main IMR components, sodium and chloride ions, in areas adjacent to the roadway. The article addresses the problem of existing methodological approaches to assessing phytotoxic impact in laboratory investigations of anti-icing agents. A comparative analysis was performed of the results of preliminary pilot studies on the phytotoxic properties of IMR using different substrates for seed germination: soil and filter paper. The data obtained show differences in the degree of phytotoxic action of the same ice-melter reagents depending on the methodological setup of the laboratory experiment. The results demonstrate the inadequacy of the existing rapid-analysis method as applied to ice-melter materials (IMM).

  19. Child maltreatment prevention: a systematic review of reviews.

    PubMed

    Mikton, Christopher; Butchart, Alexander

    2009-05-01

    To synthesize recent evidence from systematic and comprehensive reviews on the effectiveness of universal and selective child maltreatment prevention interventions, evaluate the methodological quality of the reviews and outcome evaluation studies they are based on, and map the geographical distribution of the evidence. A systematic review of reviews was conducted. The quality of the systematic reviews was evaluated with a tool for the assessment of multiple systematic reviews (AMSTAR), and the quality of the outcome evaluations was assessed using indicators of internal validity and of the construct validity of outcome measures. The review focused on seven main types of interventions: home visiting, parent education, child sex abuse prevention, abusive head trauma prevention, multi-component interventions, media-based interventions, and support and mutual aid groups. Four of the seven - home-visiting, parent education, abusive head trauma prevention and multi-component interventions - show promise in preventing actual child maltreatment. Three of them - home visiting, parent education and child sexual abuse prevention - appear effective in reducing risk factors for child maltreatment, although these conclusions are tentative due to the methodological shortcomings of the reviews and outcome evaluation studies they draw on. An analysis of the geographical distribution of the evidence shows that outcome evaluations of child maltreatment prevention interventions are exceedingly rare in low- and middle-income countries and make up only 0.6% of the total evidence base. Evidence for the effectiveness of four of the seven main types of interventions for preventing child maltreatment is promising, although it is weakened by methodological problems and paucity of outcome evaluations from low- and middle-income countries.

  20. Topology synthesis and size optimization of morphing wing structures

    NASA Astrophysics Data System (ADS)

    Inoyama, Daisaku

    This research demonstrates a novel topology and size optimization methodology for synthesis of distributed actuation systems with specific applications to morphing air vehicle structures. The main emphasis is placed on the topology and size optimization problem formulations and the development of computational modeling concepts. The analysis model is developed to meet several important criteria: It must allow a rigid-body displacement, as well as a variation in planform area, with minimum strain on structural members while retaining acceptable numerical stability for finite element analysis. Topology optimization is performed on a semi-ground structure with design variables that control the system configuration. In effect, the optimization process assigns morphing members as "soft" elements, non-morphing load-bearing members as "stiff" elements, and non-existent members as "voids." The optimization process also determines the optimum actuator placement, where each actuator is represented computationally by equal and opposite nodal forces with soft axial stiffness. In addition, the configuration of attachments that connect the morphing structure to a non-morphing structure is determined simultaneously. Several different optimization problem formulations are investigated to understand their potential benefits in solution quality, as well as meaningfulness of the formulations. Extensions and enhancements to the initial concept and problem formulations are made to accommodate multiple-configuration definitions. In addition, the principal issues on the external-load dependency and the reversibility of a design, as well as the appropriate selection of a reference configuration, are addressed in the research. The methodology to control actuator distributions and concentrations is also discussed.
Finally, the strategy to transfer the topology solution to the sizing optimization is developed and cross-sectional areas of existent structural members are optimized under applied aerodynamic loads. That is, the optimization process is implemented in sequential order: The actuation system layout is first determined through multi-disciplinary topology optimization process, and then the thickness or cross-sectional area of each existent member is optimized under given constraints and boundary conditions. Sample problems are solved to demonstrate the potential capabilities of the presented methodology. The research demonstrates an innovative structural design procedure from a computational perspective and opens new insights into the potential design requirements and characteristics of morphing structures.

  1. Stability of ecological industry chain: an entropy model approach.

    PubMed

    Wang, Qingsong; Qiu, Shishou; Yuan, Xueliang; Zuo, Jian; Cao, Dayong; Hong, Jinglan; Zhang, Jian; Dong, Yong; Zheng, Ying

    2016-07-01

    A novel methodology is proposed in this study to examine the stability of an ecological industry chain network based on entropy theory. The methodology is developed according to the associated dissipative structure characteristics, i.e., complexity, openness, and nonlinearity. As defined in the methodology, the network organization is the object of analysis, while the main focus is the identification of core enterprises and core industry chains. It is proposed that the chain network should be established around the core enterprise, while supplementation of the core industry chain helps to improve system stability, which is verified quantitatively. A relational entropy model can be used to identify the core enterprise and the core eco-industry chain: it determines the core of the network organization and the core eco-industry chain through the link form and direction of node enterprises. Similarly, the conductive mechanism of different node enterprises can be examined quantitatively despite the absence of key data. A structural entropy model can be employed to solve the problem of the degree of order of the network organization. Results showed that the stability of the entire system could be enhanced by the supplemented chain around the core enterprise in the eco-industry chain network organization. As a result, the sustainability of the entire system could be further improved.

  2. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    PubMed

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detection and correction for publication bias in meta-analysis focuses mainly on funnel-plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem, and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology, and to compare it with the non-parametric Trim and Fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
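
    The truncated-distribution idea can be illustrated with a minimal sketch (not the authors' estimator; the cutoff, parameters, and sample sizes below are invented for illustration). Assuming effects below a known cutoff go unpublished, the naive mean of published effects is biased upward, while maximum likelihood under a normal truncated from below recovers the underlying mean:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

# Simulated publication bias: true effects ~ N(0.3, 0.2), but only
# effects above a known cutoff c are published (truncation from below).
mu_true, sigma_true, c = 0.3, 0.2, 0.2
effects = rng.normal(mu_true, sigma_true, 20_000)
published = effects[effects > c]            # the biased sample we observe

naive_mean = published.mean()               # inflated by the truncation

# Maximum likelihood under a normal truncated from below at c.
def neg_log_lik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)               # keep sigma positive
    a = (c - mu) / sigma                    # standardized lower bound
    return -stats.truncnorm.logpdf(published, a, np.inf,
                                   loc=mu, scale=sigma).sum()

res = optimize.minimize(neg_log_lik,
                        x0=[published.mean(), np.log(published.std())])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])

# Estimated proportion of studies lost to publication bias.
p_missing = stats.norm.cdf(c, mu_hat, sigma_hat)
```

    The same likelihood machinery extends to the paper's p-value-based truncation by changing the cutoff rule; only the "small effect size" case is sketched here.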

  3. [Problem-based learning in cardiopulmonary resuscitation: basic life support].

    PubMed

    Sardo, Pedro Miguel Garcez; Dal Sasso, Grace Terezinha Marcon

    2008-12-01

    A descriptive and exploratory study aimed to develop an educational practice of Problem-Based Learning in CPR/BLS with 24 students in the third stage of the Nursing Undergraduate Course at a University in the Southern region of Brazil. The study used the PBL methodology, focused on problem situations of cardiopulmonary arrest, and was approved by CONEP. The methodological strategies for data collection, such as participant observation and questionnaires to evaluate the learning, the educational practices and their methodology, allowed for grouping the results into: students' expectations; group activities; individual activities; practical activities; evaluation of the meetings and their methodology. The study showed that PBL allows the educator to evaluate the academic learning process in several dimensions, functioning as a motivating factor for both the educator and the student, because it allows theoretical-practical integration in an integrated learning process.

  4. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    PubMed

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  5. Methodological Problems on the Way to Integrative Human Neuroscience.

    PubMed

    Kotchoubey, Boris; Tretter, Felix; Braun, Hans A; Buchheim, Thomas; Draguhn, Andreas; Fuchs, Thomas; Hasler, Felix; Hastedt, Heiner; Hinterberger, Thilo; Northoff, Georg; Rentschler, Ingo; Schleim, Stephan; Sellmaier, Stephan; Tebartz Van Elst, Ludger; Tschacher, Wolfgang

    2016-01-01

    Neuroscience is a multidisciplinary effort to understand the structures and functions of the brain and brain-mind relations. This effort results in an increasing amount of data, generated by sophisticated technologies. However, these data enhance our descriptive knowledge, rather than improve our understanding of brain functions. This is caused by methodological gaps both within and between subdisciplines constituting neuroscience, and the atomistic approach that limits the study of macro- and mesoscopic issues. Whole-brain measurement technologies do not resolve these issues, but rather aggravate them by the complexity problem. The present article is devoted to methodological and epistemic problems that obstruct the development of human neuroscience. We neither discuss ontological questions (e.g., the nature of the mind) nor review data, except when it is necessary to demonstrate a methodological issue. As regards intradisciplinary methodological problems, we concentrate on those within neurobiology (e.g., the gap between electrical and chemical approaches to neurophysiological processes) and psychology (missing theoretical concepts). As regards interdisciplinary problems, we suggest that core disciplines of neuroscience can be integrated using systemic concepts that also entail human-environment relations. We emphasize the necessity of a meta-discussion that should entail a closer cooperation with philosophy as a discipline of systematic reflection. The atomistic reduction should be complemented by the explicit consideration of the embodiedness of the brain and the embeddedness of humans. The discussion is aimed at the development of an explicit methodology of integrative human neuroscience, which will not only link different fields and levels, but also help in understanding clinical phenomena.

  7. Case study of a problem-based learning course of physics in a telecommunications engineering degree

    NASA Astrophysics Data System (ADS)

    Macho-Stadler, Erica; Elejalde-García, María Jesús

    2013-08-01

    Active learning methods can be appropriate in engineering, as their methodology promotes meta-cognition, independent learning and problem-solving skills. Problem-based learning is the educational process by which problem-solving activities and instructor's guidance facilitate learning. Its key characteristic involves posing a 'concrete problem' to initiate the learning process, generally implemented by small groups of students. Many universities have developed and used active methodologies successfully in the teaching-learning process. During the past few years, the University of the Basque Country has promoted the use of active methodologies through several teacher training programmes. In this paper, we describe and analyse the results of the educational experience using the problem-based learning (PBL) method in a physics course for undergraduates enrolled in the technical telecommunications engineering degree programme. From an instructors' perspective, PBL strengths include better student attitude in class and increased instructor-student and student-student interactions. The students emphasised developing teamwork and communication skills in a good learning atmosphere as positive aspects.

  8. Layer Stripping Solutions of Inverse Seismic Problems.

    DTIC Science & Technology

    1985-03-21

    problems--more so than has generally been recognized. The subject of this thesis is the theoretical development of the layer-stripping methodology, and...medium varies sharply at each interface, which would be expected to cause difficulties for the algorithm, since it was designed for a smoothly varying... methodology was applied in a novel way. The inverse problem considered in this chapter was that of reconstructing a layered medium from measurement of its

  9. A GIS semiautomatic tool for classifying and mapping wetland soils

    NASA Astrophysics Data System (ADS)

    Moreno-Ramón, Héctor; Marqués-Mateu, Angel; Ibáñez-Asensio, Sara

    2016-04-01

    Wetlands are one of the most productive and biodiverse ecosystems in the world. Water is the main resource and controls the relationships between the agents and factors that determine the quality of the wetland. However, vegetation, wildlife and soils are also essential factors for understanding these environments. Soils may be the least studied resource because of their sampling problems, and as a result wetland soils have sometimes been classified only broadly. The traditional methodology states that homogeneous soil units should be based on the five soil-forming factors. A problem appears when the variation of one soil-forming factor is too small to differentiate a change in soil units, or when another factor, such as a fluctuating water table, is not taken into account. This is the case of the Albufera of Valencia, a coastal wetland located on the eastern coast of the Iberian Peninsula (Spain). The saline water table fluctuates throughout the year and generates differences in soils. To solve this problem, the objectives of this study were to establish a reliable methodology that avoids these problems and to develop a GIS tool for defining homogeneous soil units in wetlands. This step is essential for the soil scientist, who has to decide the number of soil profiles in a study. The research was conducted with data from 133 soil pits from a previous study of the wetland, in which soil parameters of 401 samples (organic carbon, salinity, carbonates, n-value, etc.) were analysed. In a first stage, GIS layers were generated according to depth, using the Bayesian Maximum Entropy method. Subsequently, a program based on decision-tree algorithms was designed in a GIS environment. The goal of this tool was to create a single layer for each soil variable according to the different diagnostic criteria of Soil Taxonomy (properties, horizons and diagnostic epipedons).
    In the end, the program generated a set of layers with the geographical information corresponding to each diagnostic criterion. Finally, the superposition of these layers generated the homogeneous soil units in which the soil scientist should locate the soil profiles. Historically, the Albufera of Valencia had been classified as a single homogeneous soil unit, but application of the methodology and the GIS tool demonstrated that there are six homogeneous units. In that regard, the outcome reveals that only six profiles would have been necessary, against the 19 profiles opened when the original study was carried out. In conclusion, the methodology and the GIS tool can be employed in areas where changes in the soil-forming factors cannot be distinguished. Combined with rapid measurement methods, this methodology could economize the process of defining homogeneous units.
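
    The layer-superposition step can be sketched in a few lines. The property rasters and threshold values below are hypothetical stand-ins for the study's Bayesian Maximum Entropy layers and Soil Taxonomy criteria:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical interpolated rasters for two diagnostic soil properties
# (stand-ins for the study's BME-interpolated GIS layers).
salinity = rng.uniform(0, 16, size=(50, 50))    # e.g. dS/m
organic_c = rng.uniform(0, 12, size=(50, 50))   # e.g. %

# Illustrative diagnostic thresholds (not actual Soil Taxonomy limits).
saline = salinity >= 4.0
organic = organic_c >= 8.0

# Superpose the binary criterion layers into one unit-code raster:
# each distinct code is a candidate homogeneous soil unit, and one
# representative soil profile would be opened per unit.
units = saline.astype(int) * 2 + organic.astype(int)
n_units = len(np.unique(units))
```

    With more criterion layers the unit code simply gains more bits; the number of distinct codes bounds the number of profiles the soil scientist needs to open.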

  10. Deciphering the complex: methodological overview of statistical models to derive OMICS-based biomarkers.

    PubMed

    Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H

    2013-08-01

    Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.
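
    As one concrete instance of the multiple-testing correction strategies mentioned in this overview, the Benjamini-Hochberg false discovery rate procedure (a standard choice, though the review covers several strategies) can be sketched in a few lines:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg FDR control: boolean mask of rejected tests."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)                       # indices of sorted p-values
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        # Reject all hypotheses up to the largest i with p_(i) <= alpha*i/m.
        k = int(np.max(np.nonzero(below)[0]))
        reject[order[: k + 1]] = True
    return reject
```

    For example, `benjamini_hochberg([0.01, 0.02, 0.03, 0.5])` rejects the first three tests at alpha = 0.05; in an OMICS setting the input would be the vector of per-marker univariate p-values.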

  11. Semantic Segmentation of Forest Stands of Pure Species as a Global Optimization Problem

    NASA Astrophysics Data System (ADS)

    Dechesne, C.; Mallet, C.; Le Bris, A.; Gouet-Brunet, V.

    2017-05-01

    Forest stand delineation is a fundamental task for forest management purposes that is still mainly performed manually, through visual inspection of geospatial (very) high spatial resolution images. Stand detection has barely been addressed in the literature, which has mainly focused, in forested environments, on individual tree extraction and tree species classification. From a methodological point of view, stand detection can be considered a semantic segmentation problem. It offers two advantages. First, one can retrieve the dominant tree species per segment. Second, one can benefit from existing low-level tree species label maps from the literature as a basis for high-level object extraction. Thus, the semantic segmentation issue becomes a regularization issue in a weakly structured environment and can be formulated in an energy-based framework. This paper aims at investigating which regularization strategies from the literature are best adapted to delineating and classifying forest stands of pure species. Both airborne lidar point clouds and multispectral very high spatial resolution images are integrated for that purpose. Local methods (such as filtering and probabilistic relaxation) are not suited to this problem, since the increase in classification accuracy is below 5%. Global methods, based on an energy model, tend to be more efficient, with an accuracy gain of up to 15%. The segmentation results using such models have an accuracy ranging from 96% to 99%.
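
    The energy-based regularization the abstract favours can be illustrated with a toy Potts model minimized by iterated conditional modes. The grid size, class probabilities, and smoothness weight below are invented for illustration; the paper's actual energy models and optimizers may differ:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy per-pixel class probabilities for 3 species on a small grid,
# standing in for a low-level tree species label map.
H, W, K = 40, 40, 3
unary = rng.random((H, W, K))
unary /= unary.sum(axis=2, keepdims=True)

data_cost = -np.log(unary)       # data term of the energy
lam = 1.5                        # Potts smoothness weight

labels0 = unary.argmax(axis=2)   # local, unregularized labelling
labels = labels0.copy()

def icm_pass(labels):
    """One sweep of iterated conditional modes; returns #pixels changed."""
    changed = 0
    for i in range(H):
        for j in range(W):
            best_k, best_e = labels[i, j], np.inf
            for k in range(K):
                e = data_cost[i, j, k]
                # Potts penalty for disagreeing 4-connected neighbours.
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W and labels[ni, nj] != k:
                        e += lam
                if e < best_e:
                    best_k, best_e = k, e
            if best_k != labels[i, j]:
                labels[i, j] = best_k
                changed += 1
    return changed

while icm_pass(labels):          # sweep until a local energy minimum
    pass
```

    After regularization the label map has far fewer neighbour disagreements than the per-pixel argmax, which is exactly the stand-delineation effect the global methods exploit; graph cuts or other global optimizers would minimize the same energy more exactly.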

  12. The role of food-security solutions in the protection of natural resources and environment of developing countries.

    PubMed

    Lashgarara, Farhad; Mirdamadi, Seyyed Mehdi; Hosseini, Seyyed Jamal Farajollah; Chizari, Mohammad

    2008-10-01

    The majority of the countries of the world, especially developing countries, face environmental problems. Limited basic resources (water and soil) and population growth have been the cause of the environmental problems these countries are confronted with. Developing countries have numerous problems, including destruction of forests and of plant and animal species, and pollution of the environment. Damage to natural resources and the environment can influence the food-security situation. One of the main millennium development goals (MDGs) is protection of the environment and people's health. This cannot be obtained without ensured food security. Food security has been defined as a situation in which all people, at all times, have physical and economic access to the sufficient, safe, and nutritious food needed to maintain a healthy and active life. At the same time, with ensured food security, we can hope to protect natural resources and the environment. The methodology used is descriptive-analytical, and the main purpose is to determine the importance and role of food-security solutions in the reduction of environmental hazards and the improvement of natural resources and the environmental situation in developing countries. Therefore, some of the most important food-security solutions that can play an important role in this regard are discussed, including conventional research-based technology, biotechnology, information and communication technologies (ICTs), alternative energy sources, and food irradiation.

  13. Researching Street Children: Methodological and Ethical Issues.

    ERIC Educational Resources Information Center

    Hutz, Claudio S.; And Others

    This paper describes the ethical and methodological problems associated with studying prosocial moral reasoning of street children and children of low and high SES living with their families, and problems associated with studying sexual attitudes and behavior of street children and their knowledge of sexually transmitted diseases, especially AIDS.…

  14. Problem-Based Learning: Lessons for Administrators, Educators and Learners

    ERIC Educational Resources Information Center

    Yeo, Roland

    2005-01-01

    Purpose: The paper aims to explore the challenges of problem-based learning (PBL) as an unconventional teaching methodology experienced by a higher learning institute in Singapore. Design/methodology/approach: The exploratory study was conducted using focus group discussions and semi-structured interviews. Four groups of people were invited to…

  15. The Speaker Respoken: Material Rhetoric as Feminist Methodology.

    ERIC Educational Resources Information Center

    Collins, Vicki Tolar

    1999-01-01

    Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…

  16. An Analysis of Delay and Travel Times at Sao Paulo International Airport (AISP/GRU): Planning Based on Simulation Model

    NASA Technical Reports Server (NTRS)

    Santana, Erico Soriano Martins; Mueller, Carlos

    2003-01-01

    The occurrence of flight delays in Brazil, mostly arising on the ground (airfield), is responsible for serious disruptions at the airport level and also triggers problems throughout the airport system, affecting the airspace as well. The present study develops an analysis of delay and travel times on the Sao Paulo International Airport/Guarulhos (AISP/GRU) airfield based on a simulation model. Different airport physical and operational scenarios were analyzed by means of simulation, using SIMMOD Plus 4.0, a computational tool developed to represent aircraft operation in the airspace and on the airside of airports. The study was mainly focused on aircraft ground operations: on the airport runway, taxi-lanes and aprons. The visualization of the operations under increasing demand facilitated the analyses. The results generated in this work confirm the viability of the methodology; they also indicate solutions capable of solving the delay problem through travel-time analysis, thus diminishing the costs for users, mainly the airport authority. The work also indicates alternatives for airport operations, assisting the decision-making process and the appropriate timing of the proposed changes in the existing infrastructure.

  17. A methodology for the assessment of manned flight simulator fidelity

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; Malsbury, Terry N.

    1989-01-01

    A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.

  18. Determinants of Interregional Competition of Subjects of the Russian Federation

    NASA Astrophysics Data System (ADS)

    Gaisin, R. I.; Latypov, R. A.; Gaisin, I. T.; Kubyshkina, E. N.; Hayaleeva, A. D.

    2018-05-01

    The article considers problems in analyzing the competitiveness of subjects of the Russian Federation at the level of the country's national market. To study the indicators and dynamics of competitiveness of individual territorial subjects of the Russian Federation, the methodology and tools of the theory of the country's interregional markets, developed by one of the authors of this article, are used. On the basis of M. Porter's well-known theory of competitiveness, the main directions for increasing competition in the interregional market of Russia are proposed. Keywords: competitiveness, competitiveness determinants, interregional competition, interregional markets of the country

  19. Computational Everyday Life Human Behavior Model as Serviceable Knowledge

    NASA Astrophysics Data System (ADS)

    Motomura, Yoichi; Nishida, Yoshifumi

    The project `Open life matrix' is not only a research activity but also real-world problem solving conducted as action research. The concept is realized through large-scale data collection, construction of probabilistic causal structure models, and provision of information services using those models. One concrete outcome of the project is a childhood injury prevention activity carried out by a new team consisting of a hospital, government agencies, and researchers from many fields. The main result of the project is a general methodology for applying probabilistic causal structure models as serviceable knowledge for action research. This paper summarizes the project and discusses future directions that emphasize action research driven by artificial intelligence technology.

  20. User Interaction Modeling and Profile Extraction in Interactive Systems: A Groupware Application Case Study †

    PubMed Central

    Tîrnăucă, Cristina; Duque, Rafael; Montaña, José L.

    2017-01-01

    A relevant goal in human–computer interaction is to produce applications that are easy to use and well-adjusted to their users’ needs. To address this problem, it is important to know how users interact with the system. This work constitutes a methodological contribution capable of identifying the context of use in which users interact with a groupware application (synchronous or asynchronous) and provides, using machine learning techniques, generative models of how users behave. Additionally, these models are transformed into a text that describes in natural language the main characteristics of the users’ interaction with the system. PMID:28726762
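
The abstract does not specify which machine learning technique is used, so as a hedged sketch only, one simple generative model of user behavior is a first-order Markov chain over logged actions, with its strongest transitions rendered as natural-language sentences:

```python
from collections import Counter, defaultdict

# Assumed, minimal sketch (not necessarily the paper's method): learn a
# first-order Markov model of user actions from an interaction log, then
# describe its most likely transitions in plain language.

def learn_transitions(log):
    """Estimate P(next action | current action) from a sequence of actions."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(log, log[1:]):
        counts[prev][nxt] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

def describe(model):
    """Render the strongest transition of each action as a sentence."""
    lines = []
    for action, nexts in model.items():
        best, p = max(nexts.items(), key=lambda kv: kv[1])
        lines.append(f"After '{action}', users most often '{best}' ({p:.0%}).")
    return lines

log = ["open_doc", "edit", "edit", "save", "edit", "save", "close"]
model = learn_transitions(log)
print(describe(model))
```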

  1. A technological approach to studying motor planning ability in children at high risk for ASD.

    PubMed

    Taffoni, F; Focaroli, V; Keller, F; Iverson, J M

    2014-01-01

    In this work we propose a new method to study the development of motor planning abilities in children and, in particular, in children at high risk for ASD. Although several altered motor signs have been found in children with ASD, no specific markers enabling early risk assessment have yet been identified. In this work, we discuss the problems posed by objective and quantitative behavioral analysis in a non-structured environment. After an initial description of the main constraints imposed by the ecological approach, a technological and methodological solution to these issues is presented. Preliminary results on 12 children are reported and briefly discussed.

  2. Getting to Darwin: Obstacles to Accepting Evolution by Natural Selection

    NASA Astrophysics Data System (ADS)

    Thagard, Paul; Findlay, Scott

    2010-06-01

    Darwin’s theory of evolution by natural selection is central to modern biology, but is resisted by many people. This paper discusses the major psychological obstacles to accepting Darwin’s theory. Cognitive obstacles to adopting evolution by natural selection include conceptual difficulties, methodological issues, and coherence problems that derive from the intuitiveness of alternative theories. The main emotional obstacles to accepting evolution are its apparent conflict with valued beliefs about God, souls, and morality. We draw on the philosophy of science and on a psychological theory of cognitive and emotional belief revision to make suggestions about what can be done to improve acceptance of Darwinian ideas.

  3. The contribution of a gender perspective to the understanding of migrants' health

    PubMed Central

    Llácer, Alicia; Zunzunegui, María Victoria; del Amo, Julia; Mazarrasa, Lucía; Bolůmar, Francisco

    2007-01-01

    In 2005 women represented approximately half of all 190 million international migrants worldwide. This paper addresses the need to integrate a gender perspective into epidemiological studies on migration and health, outlines conceptual gaps and discusses some methodological problems. We mainly consider the international voluntary migrant. Women may emigrate as wives or as workers in a labour market in which they face double segregation, both as migrants and as women. We highlight migrant women's heightened vulnerability to situations of violence, as well as important gaps in our knowledge of the possible differential health effects of factors such as poverty, unemployment, social networks and support, discrimination, health behaviours and use of services. We provide an overview of the problems of characterising migrant populations in the health information systems, and of possible biases in the health effects caused by failure to take the triple dimension of gender, social class and ethnicity into account. PMID:18000117

  4. LCP method for a planar passive dynamic walker based on an event-driven scheme

    NASA Astrophysics Data System (ADS)

    Zheng, Xu-Dong; Wang, Qi

    2018-06-01

    The main purpose of this paper is to present a linear complementarity problem (LCP) method for a planar passive dynamic walker with round feet based on an event-driven scheme. The passive dynamic walker is treated as a planar multi-rigid-body system. The dynamic equations of the passive dynamic walker are obtained by using Lagrange's equations of the second kind. The normal forces and frictional forces acting on the feet of the passive walker are described based on a modified Hertz contact model and Coulomb's law of dry friction. The state transition problem of stick-slip between feet and floor is formulated as an LCP, which is solved with an event-driven scheme. Finally, to validate the methodology, four gaits of the walker are simulated: the stance leg neither slips nor bounces; the stance leg slips without bouncing; the stance leg bounces without slipping; the walker stands after walking several steps.
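
An LCP asks for z >= 0 with w = Mz + q >= 0 and complementarity z·w = 0. As a hedged illustration of the formulation (not the authors' solver), a projected Gauss-Seidel iteration, which converges for well-conditioned matrices such as symmetric positive-definite M, can be sketched as:

```python
# Generic LCP: find z >= 0 such that w = M z + q >= 0 and z . w = 0.
# Projected Gauss-Seidel sketch (illustrative only; not the paper's scheme).

def solve_lcp_pgs(M, q, iters=200):
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Solve row i for z[i], then project onto the constraint z[i] >= 0.
            s = q[i] + sum(M[i][j] * z[j] for j in range(n) if j != i)
            z[i] = max(0.0, -s / M[i][i])
    return z

# Tiny example: M = diag(2, 2), q = (-2, 1) has solution z = (1, 0),
# giving w = M z + q = (0, 1), which satisfies complementarity.
print(solve_lcp_pgs([[2.0, 0.0], [0.0, 2.0]], [-2.0, 1.0]))  # [1.0, 0.0]
```

In the walker, the unknowns z would be contact impulses or friction multipliers and the complementarity conditions encode the stick-slip and contact/separation alternatives at each event.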

  5. Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fujimoto, Kazufumi, E-mail: m_fuji@kvj.biglobe.ne.jp; Nagai, Hideo, E-mail: nagai@sigmath.es.osaka-u.ac.jp; Runggaldier, Wolfgang J., E-mail: runggal@math.unipd.it

    2013-02-15

    We consider the problem of maximization of expected terminal power utility (risk sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite state Markov process. The main novelty is due to the fact that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process whose intensity is driven by the unobserved Markovian factor process as well. This leads to more realistic modeling for many practical situations, like markets with liquidity restrictions; on the other hand, it considerably complicates the problem to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).

  7. The immigration experience among elderly Korean immigrants.

    PubMed

    Lee, Y-M

    2007-06-01

    The purpose of this preliminary, qualitative study was to describe elderly Korean immigrants' perceptions of the stressors they experienced through immigration and the acculturation process. The methodology used was naturalistic inquiry, a descriptive approach used to elicit the elderly immigrants' own perceptions of their immigration and acculturation experiences. Six elderly Korean immigrants were interviewed using a semi-structured, open-ended interview guide. The main stressors identified by the subjects as a result of adjusting to life in the United States were language barriers, isolation and loneliness, fear of dependence upon their children, fear of being a burden, financial problems, transportation problems, discrimination, and fear of death. These Korean elders also perceived changes in the traditional family values of respect for elders and support for the aged. The results of this research help to provide an understanding of the immigration and acculturation experiences of elderly Korean immigrants.

  8. A methodology to enhance electromagnetic compatibility in joint military operations

    NASA Astrophysics Data System (ADS)

    Buckellew, William R.

    The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
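
The "progressive refinement" step that eliminates noninterfering equipment pairs can be illustrated with its usual first screen: discard transmitter/receiver pairs whose frequency bands cannot overlap. This is a hedged sketch with hypothetical equipment names and bands, not the methodology's actual models:

```python
# First culling pass of an EMI screening (illustrative assumption): keep
# only transmitter/receiver pairs whose frequency bands can overlap, so
# that detailed engineering analysis is spent only where it matters.

def bands_overlap(tx_band, rx_band):
    """True if two (low, high) frequency bands intersect."""
    (tx_lo, tx_hi), (rx_lo, rx_hi) = tx_band, rx_band
    return tx_lo <= rx_hi and rx_lo <= tx_hi

# Hypothetical inventory, bands in MHz.
transmitters = {"radar_A": (2700.0, 2900.0), "link_B": (225.0, 400.0)}
receivers = {"gps_rx": (1563.0, 1587.0), "atc_rx": (2850.0, 2950.0)}

suspect_pairs = [(t, r) for t, tb in transmitters.items()
                 for r, rb in receivers.items() if bands_overlap(tb, rb)]
print(suspect_pairs)  # only the radar_A / atc_rx pair survives the cull
```

Later refinement stages would then weigh operational significance, couple gains and margins, and finally confirm the surviving predictions by test.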

  9. Teaching energy using an integrated science approach

    NASA Astrophysics Data System (ADS)

    Poggi, Valeria; Miceli, Cristina; Testa, Italo

    2017-01-01

    Despite its relevance to all scientific domains, the debate surrounding the teaching of energy is still open. The main point remains the problems students have in understanding some aspects of the energy concept and in applying their knowledge to the comprehension of natural phenomena. In this paper, we present a research-based interdisciplinary approach to the teaching of energy in which the first and second laws of thermodynamics were used to interpret physical, chemical and biological processes. The contents of the three disciplines (physics, chemistry, biology) were reconstructed focusing on six basic aspects of energy (forms, transfer, transformation, conservation, degradation, and entropy) and using common teaching methodologies. The module was assessed with 39 secondary school students (aged 15-16) using a 30-question research instrument and a treatment/control group methodology. Analysis of students’ learning outcomes suggests a better understanding of the energy concept, supporting the effectiveness of an interdisciplinary approach in the teaching of energy in physics and science in general. Implications for the teaching of energy are briefly discussed.

  10. Torsional Ultrasound Sensor Optimization for Soft Tissue Characterization

    PubMed Central

    Melchor, Juan; Muñoz, Rafael; Rus, Guillermo

    2017-01-01

    Torsion mechanical waves have the capability to characterize the shear stiffness moduli of soft tissue. Under this hypothesis, a computational methodology is proposed to design and optimize a piezoelectric-based transmitter and receiver to generate and measure the response of torsional ultrasonic waves. The procedure is divided into two steps: (i) a finite element method (FEM) model is developed to obtain the transmitted and received waveforms as well as the resonance frequency of a preliminary geometry, validated against a semi-analytical simplified model, and (ii) a probabilistic optimality criterion for the design, based on an inverse problem estimating the robust probability of detection (RPOD), is applied to maximize the detection of pathology defined in terms of changes in shear stiffness. This study collects different design options in two separate models, for transmission and contact, respectively. The main contribution of this work is a framework establishing the forward, inverse and optimization procedures needed to choose an appropriate set of transducer parameters. This methodological framework may be generalizable to other applications. PMID:28617353

  11. Methodology of ecooriented assessment of constructive schemes of cast in-situ RC framework in civil engineering

    NASA Astrophysics Data System (ADS)

    Avilova, I. P.; Krutilova, M. O.

    2018-01-01

    Economic growth is the main determinant of the trend toward increased greenhouse gas (GHG) emissions. The reduction of emissions and the stabilization of GHG levels in the atmosphere have therefore become an urgent task for avoiding the worst predicted consequences of climate change. GHG emissions in the construction industry constitute a significant share of industrial GHG emissions and are expected to increase steadily. The problem can be addressed with the help of both economic and organizational restrictions, based on enhanced algorithms for calculating and penalizing environmental harm in the building industry. This study aims to quantify the GHG emissions caused by different constructive schemes of cast in-situ RC frameworks in concrete casting. The results show that the proposed methodology allows a comparative analysis of alternative residential housing projects that takes into account the environmental damage caused by the construction process. The study was carried out in the framework of the Program of flagship university development on the base of Belgorod State Technological University named after V.G. Shoukhov.
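
The comparison the study performs reduces, at its core, to multiplying material quantities by emission factors for each constructive scheme. A minimal sketch of that accounting; the factors and quantities below are made-up placeholders, not values from the study:

```python
# Embodied-emissions comparison of constructive schemes (illustrative):
# sum of material quantity times emission factor. All numbers are
# hypothetical placeholders, not data from the study.

EMISSION_FACTOR = {"concrete_m3": 250.0, "steel_kg": 1.9}  # kg CO2e per unit

def embodied_co2e(bill_of_quantities):
    """Total embodied emissions (kg CO2e) for a scheme's bill of quantities."""
    return sum(qty * EMISSION_FACTOR[mat] for mat, qty in bill_of_quantities.items())

scheme_a = {"concrete_m3": 120.0, "steel_kg": 9000.0}   # e.g. flat slab
scheme_b = {"concrete_m3": 100.0, "steel_kg": 12000.0}  # e.g. beam-and-slab

print(embodied_co2e(scheme_a), embodied_co2e(scheme_b))
```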

  12. An index-based approach for the sustainability assessment of irrigation practice based on the water-energy-food nexus framework

    NASA Astrophysics Data System (ADS)

    de Vito, Rossella; Portoghese, Ivan; Pagano, Alessandro; Fratino, Umberto; Vurro, Michele

    2017-12-01

    Increasing pressure affects water resources, especially in the agricultural sector, with cascading impacts on energy consumption. This is particularly relevant in the Mediterranean area, showing significant water scarcity problems, further exacerbated by the crucial economic role of agricultural production. Assessing the sustainability of water resource use is thus essential to preserving ecosystems and maintaining high levels of agricultural productivity. This paper proposes an integrated methodology based on the Water-Energy-Food Nexus to evaluate the multi-dimensional implications of irrigation practices. Three different indices are introduced, based on an analysis of the most influential factors. The methodology is then implemented in a catchment located in Puglia (Italy) and a comparative analysis of the three indices is presented. The results mainly highlight that economic land productivity is a key driver of irrigated agriculture, and that groundwater is highly affordable compared to surface water, thus being often dangerously perceived as freely available.

  13. Singing for respiratory health: theory, evidence and challenges.

    PubMed

    Gick, Mary L; Nicol, Jennifer J

    2016-09-01

    The premise that singing is a health-promoting activity for people with the respiratory conditions of chronic obstructive pulmonary disease (COPD) and asthma is a growing area of interest being investigated by researchers from various disciplines. The preliminary evidence, a theoretical framework and identification of methodological challenges are discussed in this perspective article, with an eye to recommendations for further research to advance knowledge. After a brief summary of the main research findings on singing in healthy people to provide background context, research on singing in people with COPD and asthma is reviewed. Studies include published research and as yet unpublished work by the authors. Methodological challenges arising from the reviewed studies are identified, such as attrition from singing or control groups driven by weak or strong beliefs, respectively, about singing's effectiveness. Potential solutions for these problems are considered, with further recommendations made for other singing research. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Geometric stiffening in multibody dynamics formulations

    NASA Technical Reports Server (NTRS)

    Sharf, Inna

    1993-01-01

    In this paper we discuss the issue of geometric stiffening as it arises in the context of multibody dynamics. This topic has been treated in a number of previous publications in this journal and appears to be a debated subject. The controversy revolves primarily around the 'correct' methodology for incorporating the stiffening effect into dynamics formulations. The main goal of this work is to present the different approaches that have been developed for this problem through an in-depth review of several publications dealing with this subject. This is done with the goal of contributing to a precise understanding of the existing methodologies for modelling the stiffening effects in multibody systems. Thus, in presenting the material we attempt to illuminate the key characteristics of the various methods as well as show how they relate to each other. In addition, we offer a number of novel insights and clarifying interpretations of these schemes. The paper is completed with a general classification and comparison of the different approaches.

  15. SeCom - Serious Community 2.0 prevent flooding

    NASA Astrophysics Data System (ADS)

    Komma, Juergen; Breuer, Roman; Sewilam, Hani; Concia, Francesca; Aliprandi, Bruno; Siegmund, Sabine; Goossens, Jannis

    2013-04-01

    There is a significant need to raise the awareness and build the capacity of water professionals in different water sectors across Europe. There is also a need for qualified graduates to implement the EU Flood Risk Directive (FRD). The main aim of this work is to build the capacity of both groups in flood risk management by identifying synergies, sharing knowledge, and strengthening partnerships between universities and different stakeholders (mainly water professionals). The specific objectives are: a) development of a dynamic and active tool that allows all target groups/users to assess their knowledge of flood risk management; b) development of an innovative, active, problem-based learning methodology for flood risk education and training; c) development of flood-related Vocational Education & Training (VET) modules for water professionals (involving the students to gain practical experience). This will include some modules for undergraduate students on flood risk management and protection.

  16. Methodological Issues and Practical Problems in Conducting Research on Abused Children.

    ERIC Educational Resources Information Center

    Kinard, E. Milling

    In order to inform policy and programs, research on child abuse must be not only methodologically rigorous, but also practically feasible. However, practical problems make child abuse research difficult to conduct. Definitions of abuse must be explicit and different types of abuse must be assessed separately. Study samples should be as…

  17. Misleading University Rankings: Cause and Cure for Discrepancies between Nominal and Attained Weights

    ERIC Educational Resources Information Center

    Soh, Kaycheng

    2013-01-01

    Recent research into university ranking methodologies uncovered several methodological problems among the systems currently in vogue. One of these is the discrepancy between the nominal and attained weights. The problem is the summation of unstandardized indicators for the total scores used in ranking. It is demonstrated that weight discrepancy…
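
The discrepancy the abstract describes is easy to show numerically: when unstandardized indicators are summed, an indicator's attained weight scales with its dispersion, not just its nominal weight. A hedged illustration with made-up institution scores:

```python
import statistics

# Illustration with fabricated scores: summing unstandardized indicators
# lets the indicator with the larger spread dominate the total, regardless
# of the nominal weights.

citations = [10.0, 90.0, 50.0, 70.0, 30.0]   # wide spread across institutions
reputation = [3.0, 5.0, 4.0, 4.5, 3.5]       # narrow spread
w_cit, w_rep = 0.5, 0.5                      # nominal weights: equal

# An indicator's attained weight in the unstandardized sum is proportional
# to nominal weight times the indicator's standard deviation.
s_cit = statistics.pstdev(citations)
s_rep = statistics.pstdev(reputation)
attained_cit = w_cit * s_cit / (w_cit * s_cit + w_rep * s_rep)
print(round(attained_cit, 2))  # far above the nominal 0.5
```

Standardizing each indicator (e.g. to z-scores) before weighting removes the discrepancy, which is the cure the title alludes to.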

  18. A Methodological Critique of "Interventions for Boys with Conduct Problems"

    ERIC Educational Resources Information Center

    Kent, Ronald; And Others

    1976-01-01

    Kent criticizes Patterson's study on treating the behavior problems of boys on several methodological grounds, concluding that more rigorous research is required in this field. Patterson answers Kent's criticisms, arguing that they are not well grounded. Patterson offers further evidence to support the efficacy of his treatment procedures.…

  19. Research Methodology in Second Language Studies: Trends, Concerns, and New Directions

    ERIC Educational Resources Information Center

    King, Kendall A.; Mackey, Alison

    2016-01-01

    The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…

  20. Eight-dimensional methodology for innovative thinking about the case and ethics of the Mount Graham, Large Binocular Telescope project.

    PubMed

    Berne, Rosalyn W; Raviv, Daniel

    2004-04-01

    This paper introduces the Eight Dimensional Methodology for Innovative Thinking (the Eight Dimensional Methodology), for innovative problem solving, as a unified approach to case analysis that builds on comprehensive problem solving knowledge from industry, business, marketing, math, science, engineering, technology, arts, and daily life. It is designed to stimulate innovation by quickly generating unique "out of the box" unexpected and high quality solutions. It gives new insights and thinking strategies to solve everyday problems faced in the workplace, by helping decision makers to see otherwise obscure alternatives and solutions. Daniel Raviv, the engineer who developed the Eight Dimensional Methodology, and paper co-author, technology ethicist Rosalyn Berne, suggest that this tool can be especially useful in identifying solutions and alternatives for particular problems of engineering, and for the ethical challenges which arise with them. First, the Eight Dimensional Methodology helps to elucidate how what may appear to be a basic engineering problem also has ethical dimensions. In addition, it offers to the engineer a methodology for penetrating and seeing new dimensions of those problems. To demonstrate the effectiveness of the Eight Dimensional Methodology as an analytical tool for thinking about ethical challenges to engineering, the paper presents the case of the construction of the Large Binocular Telescope (LBT) on Mount Graham in Arizona. Analysis of the case offers to decision makers the use of the Eight Dimensional Methodology in considering alternative solutions for how they can proceed in their goals of exploring space. It then follows that same process through the second stage of exploring the ethics of each of those different solutions. The LBT project pools resources from an international partnership of universities and research institutes for the construction and maintenance of a highly sophisticated, powerful new telescope. It will soon mark the erection of the world's largest and most powerful optical telescope, designed to see fine detail otherwise visible only from space. It also represents a controversial engineering project that is being undertaken on land considered to be sacred by the local, native Apache people. As presented, the case features the University of Virginia, and its challenges in consideration of whether and how to join the LBT project consortium.

  1. IMSF: Infinite Methodology Set Framework

    NASA Astrophysics Data System (ADS)

    Ota, Martin; Jelínek, Ivan

    Software development in an enterprise environment is usually an integration task - few software applications now work autonomously. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular remedy being outsourcing and 'body shopping', which indirectly causes team and team-member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method, which has a negative impact on supportability. Such environments then often face problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of completed projects, and each methodology is thus more or less tied to a set of task types. When the task type is not suitable, problems arise that usually result in an undocumented ad hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines an ICT business process for the adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  2. Virtual manufacturing in reality

    NASA Astrophysics Data System (ADS)

    Papstel, Jyri; Saks, Alo

    2000-10-01

    SMEs play an important role in the manufacturing industry, but from time to time they lack the resources to complete a particular order in time. A number of systems have been introduced to produce digital information in support of product and process development activities. The main problem is the lack of opportunity for direct data transfer between design system modules when a temporary extension of design capacity (virtuality) is needed or when integrated concurrent product development principles are to be implemented. Planning experience in the field is also poorly exploited. The concept of virtual manufacturing is a supporting idea for solving this problem. At the same time, a number of practical problems must be solved, such as information conformity, data transfer, and the acceptance of unified technological concepts. This paper describes proposed ways to solve the practical problems of virtual manufacturing. The general objective is to introduce a knowledge-based CAPP system as the missing module for virtual manufacturing in the selected product domain. A surface-centered planning concept based on STEP-based modeling principles and a knowledge-based process planning methodology are used to attain these objectives. The expected result is a planning module supplied with design data through direct access, together with a supporting advisory environment. A mould-producing SME will serve as the test basis.

  3. [Role of an educational-and-methodological complex in the optimization of teaching at the stage of additional professional education of physicians in the specialty "anesthesiology and reanimatology"].

    PubMed

    Buniatian, A A; Sizova, Zh M; Vyzhigina, M A; Shikh, E V

    2010-01-01

    An educational-and-methodological complex (EMC) in the specialty "Anesthesiology and Reanimatology", which promotes the manageability, flexibility, and dynamism of the educational process, is of great importance in solving the problem of systematizing knowledge and ensuring its best learning by physicians at the stage of additional professional education (APE). The EMC is a set of educational-and-methodological materials required to organize and conduct an educational process for the advanced training of anesthesiologists and resuscitation specialists at the stage of APE. The EMC includes a syllabus for training in the area "Anesthesiology and Reanimatology" according to the appropriate training pattern (certification cycles, topical advanced training cycles); a work program for training in the specialty "Anesthesiology and Reanimatology"; work curricula for training in allied specialties (surgery, traumatology and orthopedics, obstetrics and gynecology, and pediatrics); work programs on basic disciplines (pharmacology, normal and pathological physiology, normal anatomy, chemistry, and biology); work programs on the area "Public health care and health care service"; guidelines for the teacher; educational-and-methodological materials for the student; and quiz programs. The core element of the EMC in the specialty "Anesthesiology and Reanimatology" is the work program. Thus, the educational-and-methodological and teaching materials included in the EMC in the specialty "Anesthesiology and Reanimatology" should provide a logically successive exposition of the teaching material and use currently available methods and educational facilities, which facilitates the optimization of training of anesthesiologists and resuscitation specialists at the stage of APE.

  4. The Beliefs of Teachers and Daycare Staff regarding Children of Divorce: A Q Methodological Study

    ERIC Educational Resources Information Center

    Overland, Klara; Thorsen, Arlene Arstad; Storksen, Ingunn

    2012-01-01

    This Q methodological study explores beliefs of daycare staff and teachers regarding young children's reactions related to divorce. The Q factor analysis resulted in two viewpoints. Participants on the viewpoint "Child problems" believe that children show various emotional and behavioral problems related to divorce, while those on the "Structure…

  5. The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.

    ERIC Educational Resources Information Center

    Filinov, Nikolay B.; Ruchkina, Svetlana

    2002-01-01

    The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…

  6. Integration of PBL Methodologies into Online Learning Courses and Programs

    ERIC Educational Resources Information Center

    van Oostveen, Roland; Childs, Elizabeth; Flynn, Kathleen; Clarkson, Jessica

    2014-01-01

    Problem-based learning (PBL) challenges traditional views of teaching and learning as the learner determines, to a large extent with support from a skilled facilitator, what topics will be explored, to what depth and which processes will be used. This paper presents the implementation of problem-based learning methodologies in an online Bachelor's…

  7. Methodology and measures for preventing unacceptable flow-accelerated corrosion thinning of pipelines and equipment of NPP power generating units

    NASA Astrophysics Data System (ADS)

    Tomarov, G. V.; Shipkov, A. A.; Lovchev, V. N.; Gutsev, D. F.

    2016-10-01

    Problems of metal flow-accelerated corrosion (FAC) in the pipelines and equipment of the condensate-feeding and wet-steam paths of NPP power-generating units (PGUs) are examined. The goals, objectives, and main principles of the methodology for implementing an integrated program of AO Concern Rosenergoatom for the prevention of unacceptable FAC thinning and for increasing the operational flow-accelerated corrosion resistance of NPP equipment and pipelines (EaP) are formulated (hereinafter, the Program). The role and potential of Russian software packages for evaluating and predicting the FAC rate are shown in solving practical problems of the timely detection of unacceptable FAC thinning in elements of the pipelines and equipment of the secondary circuit of NPP PGUs. Information is given on the structure, properties, and functions of software systems for supporting plant personnel in monitoring and planning the in-service inspection of FAC-thinned elements of pipelines and equipment of the secondary circuit of NPP PGUs, which have been created and implemented at some Russian NPPs equipped with VVER-1000, VVER-440, and BN-600 reactors. It is noted that one of the most important practical results of software packages for supporting NPP personnel on flow-accelerated corrosion is the identification of elements at risk of intense local FAC thinning. Examples are given of successful practice at some Russian NPPs in using software systems to support personnel in the early detection of secondary-circuit pipeline elements with FAC thinning close to an unacceptable level. Intermediate results of the work on the Program are presented, and the new tasks set in 2012 as a part of the updated Program are outlined. The prospects of the developed methods and tools within the Program measures at the design and construction stages of NPP PGUs are discussed. The main directions of work on solving the problems of flow-accelerated corrosion of pipelines and equipment in Russian NPP PGUs are defined.

  8. Global Optimal Trajectory in Chaos and NP-Hardness

    NASA Astrophysics Data System (ADS)

    Latorre, Vittorio; Gao, David Yang

    This paper presents an unconventional theory and method for solving general nonlinear dynamical systems. Instead of direct iterative methods, the discretized nonlinear system is first formulated as a global optimization problem via the least squares method. A newly developed canonical duality theory shows that this nonconvex minimization problem can be solved deterministically in polynomial time if a global optimality condition is satisfied. The so-called pseudo-chaos produced by linear iterative methods is mainly due to intrinsic numerical error accumulation. Otherwise, the global optimization problem could be NP-hard and the nonlinear system can be truly chaotic. A conjecture is proposed that reveals the connection between chaos in nonlinear dynamics and NP-hardness in computer science. The methodology and the conjecture are verified by applications to the well-known logistic equation, a forced memristive circuit, and the Lorenz system. Computational results show that the canonical duality theory can be used to identify chaotic systems and to obtain realistic global optimal solutions in nonlinear dynamical systems. The method and results presented in this paper should bring new insights into nonlinear dynamical systems and NP-hardness in computational complexity theory.
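
    The chaos-versus-numerical-error distinction above is easy to see on the logistic equation the paper uses as a test case. The following sketch uses plain direct iteration, i.e. exactly the approach the paper argues against, with invented parameter values, to show how a tiny perturbation of the initial condition is amplified until the two trajectories decorrelate:

```python
# Direct iteration of the logistic map x_{n+1} = r * x_n * (1 - x_n) in the
# chaotic regime (r = 4): a 1e-10 perturbation of the starting point grows
# exponentially until the two trajectories are effectively uncorrelated.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # perturbed initial condition
divergence = [abs(x - y) for x, y in zip(a, b)]
print(divergence[1], max(divergence))  # early error tiny, late error O(1)
```

The perturbation roughly doubles each step (the Lyapunov exponent of the r = 4 map is ln 2), so after a few dozen iterations the numerical trajectories carry no reliable information, which is the motivation for the paper's global-optimization reformulation.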

  9. Convergence of the standard RLS method and UDUT factorisation of covariance matrix for solving the algebraic Riccati equation of the DLQR via heuristic approximate dynamic programming

    NASA Astrophysics Data System (ADS)

    Moraes Rêgo, Patrícia Helena; Viana da Fonseca Neto, João; Ferreira, Ernesto M.

    2015-08-01

    The main focus of this article is to present a proposal to solve, via UDUT factorisation, the convergence and numerical stability problems related to the ill-conditioning of the covariance matrix in the recursive least squares (RLS) approach for online approximation of the algebraic Riccati equation (ARE) solution associated with the discrete linear quadratic regulator (DLQR) problem, formulated in the actor-critic reinforcement learning and approximate dynamic programming context. The parameterisations of the Bellman equation, utility function, and dynamic system, as well as the algebra of the Kronecker product, assemble a framework for the solution of the DLQR problem. The condition number and the positivity parameter of the covariance matrix are associated with statistical metrics for evaluating the approximation performance of the ARE solution via RLS-based estimators. The performance of RLS approximators is also evaluated in terms of consistence and polarisation when associated with reinforcement learning methods. The methodology contemplates realisations of online designs for DLQR controllers that are evaluated in a multivariable dynamic system model.
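
    For readers unfamiliar with the ARE at the center of this abstract, the sketch below solves a toy scalar DLQR instance by plain Riccati value iteration. This is a hedged illustration with invented system values (a, b, q, r); it does not reproduce the article's RLS or UDUT estimators:

```python
# Value iteration on the scalar discrete algebraic Riccati equation
#   P = A'PA - A'PB (R + B'PB)^{-1} B'PA + Q
# for a toy DLQR instance (system values are invented for illustration).
def dare_scalar(a, b, q, r, iters=200):
    p = q
    for _ in range(iters):
        p = a * p * a - (a * p * b) ** 2 / (r + b * p * b) + q
    return p

a, b, q, r = 0.9, 1.0, 1.0, 1.0
p = dare_scalar(a, b, q, r)
k = (b * p * a) / (r + b * p * b)  # optimal feedback gain K = (R+B'PB)^{-1} B'PA
residual = abs(p - (a * p * a - (a * p * b) ** 2 / (r + b * p * b) + q))
print(p, k, residual)
```

The iteration converges geometrically for this stabilizable scalar system; the RLS machinery in the article replaces this offline recursion with an online estimator, which is where the covariance ill-conditioning arises.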

  10. Introducing soft systems methodology plus (SSM+): why we need it and what it can contribute.

    PubMed

    Braithwaite, Jeffrey; Hindle, Don; Iedema, Rick; Westbrook, Johanna I

    2002-01-01

    There are many complicated and seemingly intractable problems in the health care sector. Past ways to address them have involved political responses, economic restructuring, biomedical and scientific studies, and managerialist or business-oriented tools. Few methods have enabled us to develop a systematic response to problems. Our version of soft systems methodology, SSM+, seems to improve problem solving processes by providing an iterative, staged framework that emphasises collaborative learning and systems redesign involving both technical and cultural fixes.

  11. Helping the decision maker effectively promote various experts’ views into various optimal solutions to China’s institutional problem of health care provider selection through the organization of a pilot health care provider research system

    PubMed Central

    2013-01-01

    Background The main aim of China’s Health Care System Reform was to help the decision maker find the optimal solution to China’s institutional problem of health care provider selection. A pilot health care provider research system was recently organized within China’s health care system that could efficiently collect, from various experts, the data for determining the optimal solution to this problem. The purpose of this study was therefore to apply the optimal implementation methodology to help the decision maker effectively promote various experts’ views into various optimal solutions to this problem with the support of this pilot system. Methods After the general framework of China’s institutional problem of health care provider selection was established, this study collaborated with the National Bureau of Statistics of China to commission a large-scale 2009 to 2010 national expert survey (n = 3,914) through the organization of a pilot health care provider research system for the first time in China, and the analytic network process (ANP) implementation methodology was adopted to analyze the dataset from this survey. Results The market-oriented health care provider approach was the optimal solution from the doctors’ point of view; the traditional government regulation-oriented approach was the optimal solution from the points of view of the pharmacists, the hospital administrators, and health officials in health administration departments; and the public-private partnership (PPP) approach was the optimal solution from the points of view of the nurses, officials in medical insurance agencies, and health care researchers. 
Conclusions The data collected through a pilot health care provider research system in the 2009 to 2010 national expert survey could help the decision maker effectively promote various experts’ views into various optimal solutions to China’s institutional problem of health care provider selection. PMID:23557082

  12. Child maltreatment prevention: a systematic review of reviews

    PubMed Central

    Butchart, Alexander

    2009-01-01

    Abstract Objective To synthesize recent evidence from systematic and comprehensive reviews on the effectiveness of universal and selective child maltreatment prevention interventions, evaluate the methodological quality of the reviews and outcome evaluation studies they are based on, and map the geographical distribution of the evidence. Methods A systematic review of reviews was conducted. The quality of the systematic reviews was evaluated with a tool for the assessment of multiple systematic reviews (AMSTAR), and the quality of the outcome evaluations was assessed using indicators of internal validity and of the construct validity of outcome measures. Findings The review focused on seven main types of interventions: home visiting, parent education, child sex abuse prevention, abusive head trauma prevention, multi-component interventions, media-based interventions, and support and mutual aid groups. Four of the seven – home-visiting, parent education, abusive head trauma prevention and multi-component interventions – show promise in preventing actual child maltreatment. Three of them – home visiting, parent education and child sexual abuse prevention – appear effective in reducing risk factors for child maltreatment, although these conclusions are tentative due to the methodological shortcomings of the reviews and outcome evaluation studies they draw on. An analysis of the geographical distribution of the evidence shows that outcome evaluations of child maltreatment prevention interventions are exceedingly rare in low- and middle-income countries and make up only 0.6% of the total evidence base. Conclusion Evidence for the effectiveness of four of the seven main types of interventions for preventing child maltreatment is promising, although it is weakened by methodological problems and paucity of outcome evaluations from low- and middle-income countries. PMID:19551253

  13. Evaluation of the measurement properties of self-reported health-related work-functioning instruments among workers with common mental disorders.

    PubMed

    Abma, Femke I; van der Klink, Jac J L; Terwee, Caroline B; Amick, Benjamin C; Bültmann, Ute

    2012-01-01

    During the past decade, common mental disorders (CMD) have emerged as a major public and occupational health problem in many countries. Several instruments have been developed to measure the influence of health on functioning at work. To select appropriate instruments for use in occupational health practice and research, their measurement properties (e.g., reliability, validity, responsiveness) must be evaluated. The objective of this study is to critically appraise and compare the measurement properties of self-reported health-related work-functioning instruments among workers with CMD. A systematic review was performed searching three electronic databases. Papers were included that: (i) mainly focused on the development and/or evaluation of the measurement properties of a self-reported health-related work-functioning instrument; (ii) were conducted in a CMD population; and (iii) were full-text original papers. Quality appraisal was performed using the consensus-based standards for the selection of health status measurement instruments (COSMIN) checklist. Five papers evaluating the measurement properties of five self-reported health-related work-functioning instruments in CMD populations were included. There is little evidence available for the measurement properties of the identified instruments in this population, mainly due to the low methodological quality of the included studies. The available evidence on measurement properties is based on studies of poor-to-fair methodological quality. Information on a number of measurement properties, such as measurement error, content validity, and cross-cultural validity, is still lacking. Therefore, no evidence-based decisions and recommendations can be made for the use of health-related work-functioning instruments. Studies of high methodological quality are needed to properly assess the existing instruments' measurement properties.

  14. A New Vegetation Segmentation Approach for Cropped Fields Based on Threshold Detection from Hue Histograms

    PubMed Central

    Hassanein, Mohamed; El-Sheimy, Naser

    2018-01-01

    Over the last decade, the use of unmanned aerial vehicle (UAV) technology has evolved significantly in different applications, as it provides a special platform capable of combining the benefits of terrestrial and aerial remote sensing. Therefore, such technology has been established as an important source of data collection for different precision agriculture (PA) applications such as crop health monitoring and weed management. Generally, these PA applications depend on performing a vegetation segmentation process as an initial step, which aims to detect the vegetation objects in collected agriculture field images. The main result of the vegetation segmentation process is a binary image, where vegetation is presented in white and the remaining objects in black. Such a process can easily be performed using different vegetation indexes derived from multispectral imagery. Recently, to expand the use of UAV imagery systems for PA applications, it has become important to reduce the cost of such systems by using low-cost RGB cameras. Thus, developing vegetation segmentation techniques for RGB images is a challenging problem. This paper introduces a new vegetation segmentation methodology for low-cost UAV RGB images that depends on the hue color channel. The proposed methodology follows the assumption that the colors in any agriculture field image can be divided into vegetation and non-vegetation colors. Therefore, four main steps are developed to detect five different threshold values using the hue histogram of the RGB image; these thresholds are capable of discriminating the dominant color, either vegetation or non-vegetation, within the agriculture field image. The results of implementing the proposed methodology showed its ability to generate accurate and stable vegetation segmentation performance, with a mean accuracy of 87.29% and a standard deviation of 12.5%. PMID:29670055
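
    The hue-based classification idea can be illustrated with a minimal sketch. Note that this uses a single fixed green hue band rather than the paper's four-step, five-threshold detection procedure, and the band limits and sample pixels are invented:

```python
# Hue-based vegetation masking sketch: build a hue histogram from RGB
# pixels, then classify a pixel as vegetation if its hue falls in a
# (hypothetical) green band. Hue from colorsys is in [0, 1].
import colorsys

def hue_histogram(pixels, bins=36):
    hist = [0] * bins
    for r, g, b in pixels:
        h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        hist[min(int(h * bins), bins - 1)] += 1
    return hist

def vegetation_mask(pixels, lo=0.17, hi=0.45):
    # hue in [0.17, 0.45] roughly spans yellow-green to cyan-green
    mask = []
    for r, g, b in pixels:
        h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        mask.append(lo <= h <= hi)
    return mask

pixels = [(60, 140, 40), (120, 90, 60), (30, 110, 30), (200, 180, 150)]
print(vegetation_mask(pixels))  # [True, False, True, False]
```

The paper's contribution is in deriving the band limits adaptively from the hue histogram of each image; the fixed limits above stand in for that step.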

  15. Trends and Issues in ELT Methods and Methodology

    ERIC Educational Resources Information Center

    Waters, Alan

    2012-01-01

    Trends and issues in ELT methods and methodology can be identified at two main levels. One is in terms of the theoretical pronouncements of the "professional discourse", as manifested by major publications, conference presentations, and so on. This article therefore begins by briefly summarizing some of the main developments of this kind from 1995…

  16. Evidence on public policy: methodological issues, political issues and examples.

    PubMed

    Attanasio, Orazio P

    2014-03-01

    In this paper I discuss how evidence on public policy is generated, and in particular the issue of the evaluation of public policies. In economics, the issue of attribution and the identification of causal links has recently received considerable attention. Important methodological issues have been tackled and new techniques have been proposed and used. Randomized controlled trials have become a sort of gold standard. However, they are not exempt from problems and have important limitations: in some cases they cannot be constructed and, more generally, problems of external validity and transferability of results can be important. The paper then moves on to discuss the political economy of policy evaluation: for policy evaluations to have an impact on the conduct of actual policy, it is important that the demand for evaluation comes directly from the policy-making process and is generated endogenously within it. In this sense it is important that the institutional design of policy making is such that policy-making institutions are incentivized to use rigorous evaluation in the process of designing policies and allocating resources to alternative options. Economists are currently involved in the design and evaluation of many policies, including policies on health, nutrition, and education. The role they can play in these fields is not completely obvious. The paper argues that their main contribution is in the modelling of how individuals react to incentives (including those provided by public policies).

  17. The neuropharmacology of relapse to food seeking: methodology, main findings, and comparison with relapse to drug seeking.

    PubMed

    Nair, Sunila G; Adams-Deutsch, Tristan; Epstein, David H; Shaham, Yavin

    2009-09-01

    Relapse to old, unhealthy eating habits is a major problem in human dietary treatments. The mechanisms underlying this relapse are unknown. Surprisingly, until recently this clinical problem has not been systematically studied in animal models. Here, we review results from recent studies in which a reinstatement model (commonly used to study relapse to abused drugs) was employed to characterize the effect of pharmacological agents on relapse to food seeking induced by either food priming (non-contingent exposure to small amounts of food), cues previously associated with food, or injections of the pharmacological stressor yohimbine. We also address methodological issues related to the use of the reinstatement model to study relapse to food seeking, similarities and differences in mechanisms underlying reinstatement of food seeking versus drug seeking, and the degree to which the reinstatement procedure provides a suitable model for studying relapse in humans. We conclude by discussing implications for medication development and future research. We offer three tentative conclusions: (1) the neuronal mechanisms of food-priming- and cue-induced reinstatement are likely different from those of reinstatement induced by the pharmacological stressor yohimbine; (2) the neuronal mechanisms of reinstatement of food seeking are possibly different from those of ongoing food-reinforced operant responding; and (3) the neuronal mechanisms underlying reinstatement of food seeking overlap to some degree with those of reinstatement of drug seeking.

  18. A method for the definition of a self-awareness behavior dimension with clinical subjects: a latent trait analysis.

    PubMed

    Mannarini, Stefania

    2009-11-01

    The main scope of the present study was to devise a method to define a dimension characteristic of self-awareness behaviors in clinical subjects. To do so, I adopted a latent trait methodological approach. I studied the way patients expressed their treatment requests through their behaviors, both during their admission to a medical center in Northern Italy and after a period of treatment that involved an integrated (psychoanalytical and pharmacological) approach. The subjects were 48 females suffering from affective disorders, often combined with personality disorders. Five self-awareness indicators were identified, based both on interviews conducted with the patients and on the literature on the subject. The data gathered were analyzed by means of the many-facet Rasch model (Linacre, 1989). The results confirmed the existence of a self-awareness dimension characterized by the five indicators. Moreover, there was evidence that an improvement in self-awareness occurred from pretreatment to posttreatment both for the patients with affective disorders and personality problems and for those with affective disorders without personality problems. The estimation of bias/interactions showed the existence of specific behavioral differences between the two groups of patients. This study demonstrates the appropriateness of the methodological tool adopted, opening new expectations with regard to the integration of the two approaches (psychoanalytical and pharmacological) in the treatment of psychiatric subjects.

  19. An efficient and accurate solution methodology for bilevel multi-objective programming problems using a hybrid evolutionary-local-search algorithm.

    PubMed

    Deb, Kalyanmoy; Sinha, Ankur

    2010-01-01

    Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
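
    The nested structure the abstract describes can be made concrete with a toy instance. The brute-force sketch below is precisely the "computationally expensive nested procedure" the authors contrast with their evolutionary approach; the objectives and grids are invented for illustration:

```python
# Nested brute-force solution of a toy bilevel problem: for each upper-level
# choice x, the follower picks y minimizing its own (lower-level) objective,
# then the leader picks the x whose induced pair (x, y*) minimizes the
# upper-level objective. Both objectives here are invented examples.
def follower_best(x, ys):
    return min(ys, key=lambda y: (y - x) ** 2)          # lower-level objective

def solve_bilevel(xs, ys):
    best = None
    for x in xs:
        y = follower_best(x, ys)                         # inner optimization
        f_upper = (x - 2) ** 2 + (y - 1) ** 2            # upper-level objective
        if best is None or f_upper < best[0]:
            best = (f_upper, x, y)
    return best

grid = [i / 10 for i in range(0, 31)]
f, x, y = solve_bilevel(grid, grid)
print(x, y)  # the leader's optimum given the follower's rational response
```

Every upper-level evaluation requires a full lower-level solve, so the cost multiplies; with multiple conflicting objectives at each level (the paper's setting), even this enumeration stops being well defined, which motivates the hybrid evolutionary-local-search design.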

  20. Modeling and Accuracy Assessment for 3D-VIRTUAL Reconstruction in Cultural Heritage Using Low-Cost Photogrammetry: Surveying of the "santa MARÍA Azogue" Church's Front

    NASA Astrophysics Data System (ADS)

    Robleda Prieto, G.; Pérez Ramos, A.

    2015-02-01

    It can be difficult to represent an architectural idea, a solution, a detail, or a newly created element "on paper", depending on the complexity of what is to be conveyed through its graphical representation, and it may be even harder to represent the existing reality (a building, a detail, ...), at least with an acceptable degree of definition and accuracy. As a solution to this problem, this paper presents a methodology for collecting measurement data by combining different methods and techniques to obtain the characteristic geometry of architectural elements, especially those that are highly decorated and/or geometrically complex, and for assessing the accuracy of the results obtained, at a sufficient level of accuracy and at moderate cost. In addition, a 3D recovered model can be obtained that provides strong support for producing orthoimages, beyond the point clouds obtained through more expensive methods such as laser scanning. This methodology was used in the case study of the 3D virtual reconstruction of the façade of a medieval church, chosen because of the geometrical complexity of many of its elements, such as the main doorway with archivolts and many details, as well as the rose window located above it, which is inaccessible due to its height.

  1. Rapid classification of landsat TM imagery for phase 1 stratification using the automated NDVI threshold supervised classification (ANTSC) methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2002-01-01

    FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information…
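
    The NDVI screening idea reduces to a simple band-ratio computation. The sketch below is illustrative only; the threshold value and sample reflectances are invented and are not the Station's calibrated ones:

```python
# NDVI = (NIR - Red) / (NIR + Red): values near 1 indicate dense vegetation,
# values near 0 or below indicate bare soil, water, or built-up cover.
# Pixels whose NDVI falls below a (hypothetical) threshold are flagged as
# potential problem observations for analyst review.
def ndvi(nir, red):
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def flag_problem_pixels(pairs, threshold=0.2):
    return [ndvi(nir, red) < threshold for nir, red in pairs]

samples = [(0.55, 0.08), (0.30, 0.28), (0.45, 0.40)]  # (NIR, Red) reflectances
print([round(ndvi(n, r), 3) for n, r in samples])
print(flag_problem_pixels(samples))
```

The automated methodology in the record pairs this per-pixel index with plot-level land-use information; the threshold here simply stands in for that supervised step.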

  2. Learning outcomes of "The Oncology Patient" study among nursing students: A comparison of teaching strategies.

    PubMed

    Roca, Judith; Reguant, Mercedes; Canet, Olga

    2016-11-01

    Teaching strategies are essential in order to facilitate meaningful learning and the development of high-level thinking skills in students. To compare three teaching methodologies (problem-based learning, case-based teaching and traditional methods) in terms of the learning outcomes achieved by nursing students. This quasi-experimental research was carried out in the Nursing Degree programme in a group of 74 students who explored the subject of The Oncology Patient through the aforementioned strategies. A performance test was applied based on Bloom's Revised Taxonomy. A significant correlation was found between the intragroup theoretical and theoretical-practical dimensions. Likewise, intergroup differences were related to each teaching methodology. Hence, significant differences were estimated between the traditional methodology (x̄ = 9.13), case-based teaching (x̄ = 12.96) and problem-based learning (x̄ = 14.84). Problem-based learning was shown to be the most successful learning method, followed by case-based teaching and the traditional methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. [The urgent problems of the improvement of the environment management system based on the analysis of health risk assessment].

    PubMed

    Avaliani, S L; Novikov, S M; Shashina, T A; Dodina, N S; Kislitsin, V A; Mishina, A L

    2014-01-01

    The lack of an adequate legislative and regulatory framework for ensuring the minimization of health risks in the field of environmental protection is the obstacle to the application of the risk analysis methodology as a leading tool for administrative activity in Russia. The "Principles of the state policy in the sphere of ensuring chemical and biological safety of the Russian Federation for the period up to 2025 and beyond", approved by the President of the Russian Federation on 1 November 2013, No. Pr-2573, are aimed at legal support for the health risk analysis methodology. In this article, the main stages of the operational control of environmental quality that lead to the reduction of the health risk to an acceptable level are proposed. The further improvement of the health risk analysis methodology in Russia should contribute to the implementation of the state policy in the sphere of chemical and biological safety through the introduction of complex measures for the neutralization of chemical and biological threats to human health and the environment, as well as the evaluation of the economic effectiveness of these measures. The primary step should be the legislative securing of a quantitative value for the term "acceptable risk".

  4. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computations is relevant. As is well known, an arbitrary quantum computation in the circuit model can be performed using only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum nature lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while, on the other hand, quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability, and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for the research and testing of development methods for data-intensive parallel software, and the considered methodology of analysis can be successfully used for the improvement of algorithms in quantum information science.
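
    The single-qubit-gate update the abstract analyzes can be sketched in a few lines of a pure-Python state-vector simulator (an illustration, not the AlgoWiki implementation): applying a gate to qubit k touches amplitude pairs whose indices differ only in bit k, which is the source of both the parallelism and the memory-locality issues discussed.

```python
# Apply a 2x2 single-qubit gate to qubit k of an n-qubit state vector.
# Each amplitude pair (i, i | 2^k) is updated independently, so the loop
# is trivially parallel; but for large k the pair elements sit 2^k apart
# in memory, which is the locality problem the abstract points to.
import math

def apply_single_qubit_gate(state, gate, k, n):
    out = list(state)
    stride = 1 << k
    for i in range(1 << n):
        if not (i & stride):           # i has bit k = 0; partner has bit k = 1
            a0, a1 = state[i], state[i | stride]
            out[i] = gate[0][0] * a0 + gate[0][1] * a1
            out[i | stride] = gate[1][0] * a0 + gate[1][1] * a1
    return out

H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
state = [1.0, 0.0, 0.0, 0.0]           # |00> of a 2-qubit register
state = apply_single_qubit_gate(state, H, k=0, n=2)
print(state)  # equal superposition on qubit 0
```

A two-qubit gate works the same way over groups of four amplitudes; entangling gates are what force the strided, non-local access pattern.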

  5. Geomagnetic main field modeling using magnetohydrodynamic constraints

    NASA Technical Reports Server (NTRS)

    Estes, R. H.

    1985-01-01

    The influence of physical constraints that may be approximately satisfied by the Earth's liquid core on models of the geomagnetic main field and its secular variation is investigated. A previous report describes the methodology used to incorporate nonlinear equations of constraint into the main field model. The application of that methodology to the GSFC 12/83 field model, to test the frozen-flux hypothesis and the usefulness of incorporating magnetohydrodynamic constraints for obtaining improved geomagnetic field models, is described.

  6. RBT-GA: a novel metaheuristic for solving the Multiple Sequence Alignment problem.

    PubMed

    Taheri, Javid; Zomaya, Albert Y

    2009-07-07

    Multiple Sequence Alignment (MSA) has always been an active area of research in bioinformatics. MSA is mainly focused on discovering biologically meaningful relationships among different sequences or proteins in order to investigate their underlying main characteristics/functions. This information is also used to generate phylogenetic trees. This paper presents a novel approach, namely RBT-GA, to solve the MSA problem using a hybrid solution methodology combining the Rubber Band Technique (RBT) and the Genetic Algorithm (GA) metaheuristic. RBT is inspired by the behavior of an elastic rubber band (RB) on a plate with several poles, which is analogous to locations in the input sequences that could potentially be biologically related. A GA attempts to mimic the evolutionary processes of life in order to locate optimal solutions in an often very complex landscape. RBT-GA is a population-based optimization algorithm designed to find the optimal alignment for a set of input protein sequences. In this novel technique, each alignment answer is modeled as a chromosome consisting of several poles in the RBT framework. These poles resemble locations in the input sequences that are most likely to be correlated and/or biologically related. A GA-based optimization process improves these chromosomes gradually, yielding a set of mostly optimal answers for the MSA problem. RBT-GA is tested with one of the well-known benchmark suites (BAliBASE 2.0) in this area. The obtained results show the superiority of the proposed technique even in the case of formidable sequences.
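
    As background for what an MSA optimizer scores, the sketch below computes the standard sum-of-pairs objective for a candidate alignment; RBT-GA's actual pole-based encoding and fitness function are not reproduced here, and the scoring weights are invented:

```python
# Sum-of-pairs scoring of a multiple alignment: every column contributes
# the summed pairwise scores of its symbols (match, mismatch, or gap);
# gap-gap pairs conventionally score 0. Weights here are illustrative.
def sum_of_pairs(alignment, match=1, mismatch=-1, gap=-2):
    score = 0
    n = len(alignment)
    for col in zip(*alignment):        # iterate over alignment columns
        for i in range(n):
            for j in range(i + 1, n):
                a, b = col[i], col[j]
                if a == '-' or b == '-':
                    score += gap if a != b else 0   # gap-gap pairs score 0
                elif a == b:
                    score += match
                else:
                    score += mismatch
    return score

aln = ["AC-GT", "ACAGT", "AC-GA"]
print(sum_of_pairs(aln))
```

A metaheuristic such as RBT-GA searches the space of gap placements to maximize an objective of this kind; benchmark suites like BAliBASE then compare the result against curated reference alignments.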

  7. Kaiser and Felicity effects and their application for evaluation of concrete by acoustic emission method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nesvijski, E.; Nesvijski, T.

    1996-12-31

    Concrete, as one of the main construction materials used for building industrial and civil structures, highways, bridges, etc., requires periodical evaluation of its properties by different nondestructive methods. Application of acoustic emission (AE) for these purposes occupies a modest place among other nondestructive methods, but AE methods have proved to be very effective for testing concrete and reinforced concrete elements and structures under load. This work is devoted to a problem that is important from a methodological point of view, connected with two opposite effects, the Kaiser and Felicity effects, and their application for evaluation of concrete by the AE method.

  8. [The cavernosometry].

    PubMed

    Mantovani, F; Mastromarino, G; Fenice, O; Canclini, L; Patelli, E; Colombo, F; Vecchio, D; Austoni, E

    1994-09-01

    Recent clinical and experimental research innovations in andrology make possible the following classification of impotence: "failure to initiate", "failure to store", and "failure to fill". The last aspect, including veno-occlusive dysfunction, is continuously re-evaluated by andrologic studies. The main diagnostic procedure for this complex problem, which is in constant evolution, is cavernosometry. Recently, and with full success, we have been utilizing the direct radioisotopic penogram under video sexual stimulation, at present in a preselection role but in the future probably as a substitute for the more invasive traditional cavernosometry. In spite of this methodologic progress, the findings of cavernosometry remain under continuous discussion, since the intracavernous district is in tumultuous evolution in the anatomo-physiological literature and, in many respects, requires further histochemical, pharmacodynamic and neurophysiological investigation.

  9. [The prevalence of suicide attempts among children and adolescents].

    PubMed

    Woźniak, Ewelina; Talarowska, Monika; Orzechowska, Agata; Florkowski, Antoni; Gałecki, Piotr

    2013-03-01

    Suicide is an act with a fatal outcome. People who think about suicide perceive death as a way to avoid problems. Suicide attempts by children and young people are likely to arise from identified single or co-occurring mental disorders. The aim of this study was to illustrate the problem of suicide, as attempts by children and youth to take their own lives are increasingly frequent. Its main objective is to determine the prevalence and determinants of suicide attempts made by young people. The study group consisted of patients of the Babinski Hospital in Lodz. The study included 18 patients: 9 boys and 9 girls. The research methodology is based on the case histories of the young patients. In order to verify the prevalence of suicide attempts, thoughts and/or tendencies among children and adolescents, a survey of the authors' own design was used as the research tool. The survey consists of 21 questions covering basic information on the social, physical and mental state of the patients. This subjective verification of the prevalence of suicidal ideas, tendencies and/or attempts among children and adolescents largely reflects the information gathered by various authors. Children coming from reconstructed and, to a large extent, incomplete families exhibit suicidal behavior. The main risk factors indicating an attempt on one's own life are mental disorders: depression and behavioral disorders. The family situation of young people, including conflicts between the father and the mother and physical or mental violence, has a significant effect on the risk of an attempt on one's own life. Superficial self-mutilation is the main method of attempting suicide.

  10. Selected analytical challenges in the determination of pharmaceuticals in drinking/marine waters and soil/sediment samples.

    PubMed

    Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr

    2016-03-20

    Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, a lot of papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. A PDE-based methodology for modeling, parameter estimation and feedback control in structural and structural acoustic systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun

    1994-01-01

    A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.

  12. Problems related to the integration of fault tolerant aircraft electronic systems

    NASA Technical Reports Server (NTRS)

    Bannister, J. A.; Adlakha, V.; Triyedi, K.; Alspaugh, T. A., Jr.

    1982-01-01

    Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described. Specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of integrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.

  13. Meaning and Problems of Planning

    ERIC Educational Resources Information Center

    Brieve, Fred J.; Johnston, A. P.

    1973-01-01

    Examines the educational planning process. Discusses what planning is, how methodological planning can work in education, misunderstandings about planning, and difficulties in applying the planning methodology. (DN)

  14. Cost-Utility Analysis: Current Methodological Issues and Future Perspectives

    PubMed Central

    Nuijten, Mark J. C.; Dubois, Dominique J.

    2011-01-01

    The use of cost–effectiveness as the final criterion in the reimbursement process for listing of new pharmaceuticals can be questioned from a scientific and policy point of view. There is a lack of consensus on the main methodological issues, and consequently we may question the appropriateness of the use of cost–effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost–effectiveness threshold (Cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of some other important decision criteria, such as social values and equity. PMID:21713127

  15. School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.

    ERIC Educational Resources Information Center

    Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others

    1998-01-01

    Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…

  16. Daycare Staff Emotions and Coping Related to Children of Divorce: A Q Methodological Study

    ERIC Educational Resources Information Center

    Øverland, Klara; Størksen, Ingunn; Bru, Edvin; Thorsen, Arlene Arstad

    2014-01-01

    This Q methodological study explores emotional experiences and coping of daycare staff when working with children of divorce and their families. Two main coping strategies among daycare staff were identified: 1) Confident copers, and 2) Non-confident copers. Interviews exemplify the two main experiences. Both groups may struggle with coping in…

  17. The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Huiling; Fan, Delin; Zhang, Yizhuo

    This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) and cascade control based on a Smith predictor. A case study of the particleboard glue supplying and dosing system defines the problem and the solution using the proposed methodology. Status differences existing in the glue dosing process of particleboard production usually cause inaccurate gluing volumes. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improved method maps the inaccuracy problem to a TRIZ technical contradiction, and the prior-action principle suggests a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or recurring problems, and also testifies to the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.

  18. Design space pruning heuristics and global optimization method for conceptual design of low-thrust asteroid tour missions

    NASA Astrophysics Data System (ADS)

    Alemany, Kristina

    Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. 
The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
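    The bound-and-prune idea behind the second phase can be sketched on a toy combinatorial tour problem. The cost model below (targets on a line, with a hypothetical "delta-v" per leg) and all values are invented for illustration; the actual methodology combines branch-and-bound with a genetic algorithm and a low-thrust trajectory optimizer.

```python
def branch_and_bound(leg_cost, n_targets, seq_len):
    """Depth-first enumeration of visit sequences; prune any partial
    sequence whose accumulated cost already matches or exceeds the
    best complete sequence found so far."""
    best = {"seq": None, "cost": float("inf")}

    def extend(seq, acc):
        if acc >= best["cost"]:        # bound: cost only grows with depth
            return
        if len(seq) == seq_len:
            best["seq"], best["cost"] = seq, acc
            return
        for nxt in range(n_targets):
            if nxt not in seq:
                prev = seq[-1] if seq else None
                extend(seq + (nxt,), acc + leg_cost(prev, nxt))

    extend((), 0.0)
    return best["seq"], best["cost"]

# Toy model: targets on a line; 'delta-v' per leg is the distance
# travelled, starting from position 0.0 (all values hypothetical).
pos = [0.0, 2.0, 5.0, 9.0, 1.0]
dv = lambda a, b: abs((pos[a] if a is not None else 0.0) - pos[b])
seq, total = branch_and_bound(dv, n_targets=5, seq_len=3)
```

    Because leg costs are non-negative, the accumulated cost of a partial sequence is a valid lower bound on any of its completions, so the pruning never discards the optimum.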

  19. Use of Invariant Manifolds for Transfers Between Three-Body Systems

    NASA Technical Reports Server (NTRS)

    Beckman, Mark; Howell, Kathleen

    2003-01-01

    The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits does not exist. This paper presents the initial approaches to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing 7-dimensional invariant manifold data are presented. Some particular solutions are presented for the transfer problem, though the emphasis is on developing methodology for solving the general problem.

  20. Representations of Invariant Manifolds for Applications in Three-Body Systems

    NASA Technical Reports Server (NTRS)

    Howell, K.; Beckman, M.; Patterson, C.; Folta, D.

    2004-01-01

    The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits is currently being studied. This paper presents an initial approach to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing the invariant manifold data are presented. Some particular solutions are presented for two types of transfer problems, though the emphasis is on developing the methodology for solving the general problem.

  1. [Methodological problems in the scientific research on HIV /AIDS in Bolivia].

    PubMed

    Hita, Susana Ramírez

    2013-05-01

    This paper discusses the methodological problems in the scientific research on HIV/AIDS in Bolivia, both in the areas of epidemiology and social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, The Pan-American Health Organization, International Cooperation, Non-Governmental Organizations and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research that focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and that looked at how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the epidemiological and social science studies which serve as the basis for the implementation of these health programs.

  2. Methodological problems in the neuropsychological assessment of effects of exposure to welding fumes and manganese.

    PubMed

    Lees-Haley, Paul R; Greiffenstein, M Frank; Larrabee, Glenn J; Manning, Edward L

    2004-08-01

    Recently, Kaiser (2003) raised concerns over the increase in brain damage claims reportedly due to exposure to welding fumes. In the present article, we discuss methodological problems in conducting neuropsychological research on the effects of welding exposure, using a recent paper by Bowler et al. (2003) as an example to illustrate problems common in the neurotoxicity literature. Our analysis highlights difficulties in conducting such quasi-experimental investigations, including subject selection bias, litigation effects on symptom report and neuropsychological test performance, response bias, and scientifically inadequate causal reasoning.

  3. Birth-death prior on phylogeny and speed dating

    PubMed Central

    2008-01-01

    Background In recent years there has been a trend of leaving the strict molecular clock in order to infer dating of speciations and other evolutionary events. Explicit modeling of substitution rates and divergence times makes formulation of informative prior distributions for branch lengths possible. Models with birth-death priors on tree branching and auto-correlated or iid substitution rates among lineages have been proposed, enabling simultaneous inference of substitution rates and divergence times. This problem has, however, mainly been analysed in the Markov chain Monte Carlo (MCMC) framework, an approach requiring computation times of hours or days when applied to large phylogenies. Results We demonstrate that a hill-climbing maximum a posteriori (MAP) adaptation of the MCMC scheme results in considerable gain in computational efficiency. We demonstrate also that a novel dynamic programming (DP) algorithm for branch length factorization, useful both in the hill-climbing and in the MCMC setting, further reduces computation time. For the problem of inferring rates and times parameters on a fixed tree, we perform simulations, comparisons between hill-climbing and MCMC on a plant rbcL gene dataset, and dating analysis on an animal mtDNA dataset, showing that our methodology enables efficient, highly accurate analysis of very large trees. Datasets requiring a computation time of several days with MCMC can with our MAP algorithm be accurately analysed in less than a minute. From the results of our example analyses, we conclude that our methodology generally avoids getting trapped early in local optima. For the cases where this nevertheless can be a problem, for instance when we in addition to the parameters also infer the tree topology, we show that the problem can be evaded by using a simulated-annealing like (SAL) method in which we favour tree swaps early in the inference while biasing our focus towards rate and time parameter changes later on. 
Conclusion Our contribution leaves the field open for fast and accurate dating analysis of nucleotide sequence data. Modeling branch substitution rates and divergence times separately allows us to include birth-death priors on the times without the assumption of a molecular clock. The methodology is easily adapted to take data from fossil records into account and it can be used together with a broad range of rate and substitution models. PMID:18318893
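    The computational gain of the hill-climbing MAP adaptation comes from greedily maximizing the (unnormalized) posterior instead of sampling it. Below is a minimal one-parameter sketch, with a toy Gaussian-shaped log-posterior standing in for the paper's actual rate/time posterior; the function, starting point, and step schedule are all illustrative assumptions.

```python
def hill_climb_map(log_post, x0, step=0.1, tol=1e-9, max_iter=100000):
    """Greedy MAP search: move to whichever neighbour increases the
    log-posterior; halve the step when neither neighbour improves."""
    x = x0
    for _ in range(max_iter):
        left, right = log_post(x - step), log_post(x + step)
        if max(left, right) > log_post(x):
            x = x - step if left > right else x + step
        else:
            step /= 2.0
            if step < tol:
                break
    return x

# Toy unnormalized log-posterior with its mode at rate = 1.5
log_post = lambda r: -((r - 1.5) ** 2)
r_map = hill_climb_map(log_post, x0=0.0)
```

    On a unimodal posterior this converges directly to the mode; the simulated-annealing-like variant described above is what guards against the local optima a pure greedy search can get trapped in.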

  4. Birth-death prior on phylogeny and speed dating.

    PubMed

    Akerborg, Orjan; Sennblad, Bengt; Lagergren, Jens

    2008-03-04

    In recent years there has been a trend of leaving the strict molecular clock in order to infer dating of speciations and other evolutionary events. Explicit modeling of substitution rates and divergence times makes formulation of informative prior distributions for branch lengths possible. Models with birth-death priors on tree branching and auto-correlated or iid substitution rates among lineages have been proposed, enabling simultaneous inference of substitution rates and divergence times. This problem has, however, mainly been analysed in the Markov chain Monte Carlo (MCMC) framework, an approach requiring computation times of hours or days when applied to large phylogenies. We demonstrate that a hill-climbing maximum a posteriori (MAP) adaptation of the MCMC scheme results in considerable gain in computational efficiency. We demonstrate also that a novel dynamic programming (DP) algorithm for branch length factorization, useful both in the hill-climbing and in the MCMC setting, further reduces computation time. For the problem of inferring rates and times parameters on a fixed tree, we perform simulations, comparisons between hill-climbing and MCMC on a plant rbcL gene dataset, and dating analysis on an animal mtDNA dataset, showing that our methodology enables efficient, highly accurate analysis of very large trees. Datasets requiring a computation time of several days with MCMC can with our MAP algorithm be accurately analysed in less than a minute. From the results of our example analyses, we conclude that our methodology generally avoids getting trapped early in local optima. For the cases where this nevertheless can be a problem, for instance when we in addition to the parameters also infer the tree topology, we show that the problem can be evaded by using a simulated-annealing like (SAL) method in which we favour tree swaps early in the inference while biasing our focus towards rate and time parameter changes later on. 
Our contribution leaves the field open for fast and accurate dating analysis of nucleotide sequence data. Modeling branch substitution rates and divergence times separately allows us to include birth-death priors on the times without the assumption of a molecular clock. The methodology is easily adapted to take data from fossil records into account and it can be used together with a broad range of rate and substitution models.

  5. Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases

    DTIC Science & Technology

    1992-09-29

    STRUCTURE METHODOLOGY Design problem solving is a complex activity involving a number of subtasks, and a number of alternative methods potentially available... Conference on Artificial Intelligence. London: The British Computer Society, pp. 621-633. Friedland, P. (1979). Knowledge-based experimental design... Computing Milieux: Management of Computing and Information Systems--system management. General Terms: Design, Methodology. Additional Key Words and Phrases

  6. Rapid Classification of Landsat TM Imagery for Phase 1 Stratification Using the Automated NDVI Threshold Supervised Classification (ANTSC) Methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2005-01-01

    FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
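    For reference, the NDVI itself is a simple band ratio, (NIR - Red) / (NIR + Red). The sketch below computes it per pixel and flags low-NDVI pixels against a threshold; the threshold value, the reflectance numbers, and the screening rule are illustrative assumptions, not the ANTSC methodology itself.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def flag_low_ndvi(pixels, threshold=0.3):
    """Return indices of pixels whose NDVI falls below the threshold,
    mimicking the idea of screening out questionable plots."""
    return [i for i, (nir, red) in enumerate(pixels)
            if ndvi(nir, red) < threshold]

# Toy (NIR, Red) reflectance pairs per pixel -- values are invented.
pixels = [(0.50, 0.10), (0.22, 0.20), (0.40, 0.05)]
low_veg = flag_low_ndvi(pixels)
```

    Healthy vegetation reflects strongly in the near-infrared band and absorbs red light, so high NDVI indicates vegetated cover and low NDVI suggests bare soil, water, or a questionable land-cover label.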

  7. Science and Television Commercials: Adding Relevance to the Research Methodology Course.

    ERIC Educational Resources Information Center

    Solomon, Paul R.

    1979-01-01

    Contends that research methodology courses can be relevant to issues outside of psychology and describes a method which relates the course to consumer problems. Students use experimental methodology to test claims made in television commercials advertising deodorant, bathroom tissues, and soft drinks. (KC)

  8. Investigation of Nonlinear Pressurization and Model Restart in MSC/NASTRAN for Modeling Thin Film Inflatable Structures

    NASA Technical Reports Server (NTRS)

    Smalley, Kurt B.; Tinker, Michael L.; Fischer, Richard T.

    2001-01-01

    This paper provides an introduction and a set of guidelines for the use of a methodology for NASTRAN eigenvalue modeling of thin film inflatable structures. It is hoped that this paper will spare the reader the problems and headaches the authors were confronted with during their investigation by presenting not only an introduction and verification of the methodology, but also a discussion of the problems that this methodology can entail. Our goal in this investigation was to verify the basic methodology through the creation and correlation of a simple model. An overview of thin film structures, their history, and their applications is given. Previous modeling work is then briefly discussed. An introduction is then given for the method of modeling. The specific mechanics of the method are then discussed in parallel with a basic discussion of NASTRAN's implementation of these mechanics. The problems encountered with the method are then given, along with suggestions for their workarounds. The methodology is verified through the correlation between an analytical model and modal test results of a thin film strut. Recommendations are given for the needed advancement of our understanding of this method and of our ability to accurately model thin film structures. Finally, conclusions are drawn regarding the usefulness of the methodology.

  9. The PMHT: solutions for some of its problems

    NASA Astrophysics Data System (ADS)

    Wieneke, Monika; Koch, Wolfgang

    2007-09-01

    Tracking multiple targets in a cluttered environment is a challenging task. Probabilistic Multiple Hypothesis Tracking (PMHT) is an efficient approach for dealing with it. Essentially, PMHT is based on the method of Expectation-Maximization for handling association conflicts. Linearity in the number of targets and measurements is the main motivation for a further development and extension of this methodology. Unfortunately, compared with the Probabilistic Data Association Filter (PDAF), PMHT has not yet shown its superiority in terms of track-loss statistics. Furthermore, the problem of track extraction and deletion is apparently not yet satisfactorily solved within this framework. Four properties of PMHT are responsible for its problems in track maintenance: Non-Adaptivity, Hospitality, Narcissism and Local Maxima. 1, 2 In this work we present a solution for each of them and derive an improved PMHT by integrating the solutions into the PMHT formalism. The new PMHT is evaluated by Monte-Carlo simulations. A sequential Likelihood-Ratio (LR) test for track extraction has been developed and already integrated into the framework of traditional Bayesian Multiple Hypothesis Tracking. 3 As a multi-scan approach, the PMHT methodology also has the potential for track extraction. In this paper an analogous integration of a sequential LR test into the PMHT framework is proposed. We present an LR formula for track extraction and deletion using the PMHT update formulae. Since PMHT provides all the required ingredients for a sequential LR calculation, the LR is a by-product of the PMHT iteration process. The resulting update formula for the sequential LR test therefore enables the development of Track-Before-Detect algorithms for PMHT. The approach is illustrated by a simple example.
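    The sequential LR test described here follows the general pattern of Wald's sequential probability-ratio test: accumulate log-likelihood ratios scan by scan and compare against two thresholds. Below is a minimal sketch on binary detection flags; the detection probabilities and error rates are assumed values, not taken from the paper, and the real PMHT test uses the PMHT update formulae rather than Bernoulli likelihoods.

```python
import math

def sprt(detections, p1=0.9, p0=0.2, alpha=0.01, beta=0.01):
    """Wald-style sequential probability-ratio test on per-scan
    detection flags.  H1: target present (hit prob p1); H0: clutter
    only (hit prob p0).  Crossing the upper threshold confirms
    (extracts) the track; crossing the lower one deletes it."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for k, hit in enumerate(detections, 1):
        llr += math.log(p1 / p0) if hit else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "extract", k
        if llr <= lower:
            return "delete", k
    return "continue", len(detections)

decision, n_scans = sprt([1, 1, 1, 1, 1])
```

    The appeal of the sequential form is that the decision is reached as soon as the evidence suffices, rather than after a fixed number of scans.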

  10. Measurement of health system performance at district level: A study protocol

    PubMed Central

    Sharma, Atul; Prinja, Shankar; Aggarwal, Arun Kumar

    2018-01-01

    Background Limited efforts have been observed in low and middle income countries to undertake health system performance assessment at district level. The absence of a comprehensive data collection tool and the lack of a standardised single summary measure defining overall performance are some of the main problems. The present study has been undertaken to develop a summary composite health system performance index at district level. Methods A broad range of indicators covering all six domains of the building block framework were finalized by an expert panel. The domains were classified into twenty sub-domains, with 70 input and process indicators to measure performance. Seven sub-domains for assessing health system outputs and outcomes were identified, with a total of 28 indicators. Districts in Haryana state in north India were selected for the study. Primary and secondary data will be collected from 378 health facilities and from district and state health directorate headquarters. Indicators will be normalized and aggregated to generate a composite performance index at district level. Domain-specific scores will present the quality of individual building block domains in the public health system. Robustness of the results will be checked using sensitivity analysis. Expected impact for public health: The study presents a methodology for comprehensive assessment of all health system domains on the basis of input, process, output and outcome indicators, which has never been reported from India. Generation of this index will help identify policy and implementation areas of concern and point towards potential solutions. Results may also help understand relationships between individual building blocks and their sub-components. Significance for public health Measuring the performance of a health system is important to understand progress and challenges, and to create systems that are efficient, equitable and patient-focused.
However, very few assessments of such nature have been observed in low and middle income countries, especially at district level, mainly because of methodological challenges. This study presents a methodology for comprehensive assessment of all domains of health system and generation of a composite Health System Performance Index on the basis of input, process, output and outcome indicators. It will help identify policy and implementation problems worthy of attention and point towards potential solutions to health system bottlenecks resulting in poor performance. The results may also help better understand the relationships between individual building blocks and their sub-components and the overall performance of the health system. PMID:29441330
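    The "normalize, then aggregate" step for a composite index can be sketched as follows. The indicators, their directions, and the equal-weight average below are illustrative assumptions; the study's actual indicator set and weighting may differ.

```python
def composite_index(districts, higher_is_better):
    """Min-max normalise each indicator across districts (flipping
    'lower is better' indicators), then average into one score."""
    names = list(districts)
    norm = {d: [] for d in names}
    for i, ind in enumerate(higher_is_better):
        vals = [districts[d][i] for d in names]
        lo, hi = min(vals), max(vals)
        for d in names:
            s = 0.5 if hi == lo else (districts[d][i] - lo) / (hi - lo)
            norm[d].append(s if higher_is_better[ind] else 1.0 - s)
    return {d: sum(v) / len(v) for d, v in norm.items()}

# Hypothetical district data: [immunisation coverage %, infant mortality
# rate]; coverage is 'higher is better', mortality 'lower is better'.
data = {"A": [90.0, 30.0], "B": [60.0, 45.0], "C": [75.0, 20.0]}
scores = composite_index(data, {"immunisation": True, "imr": False})
```

    Min-max normalisation puts every indicator on a common 0-1 scale before aggregation, so no single indicator dominates the composite simply because of its units.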

  11. Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work with the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u), and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation (PDE) system of the type encountered in physics problems (e.g. fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as 'discretizations' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

  12. Exploiting periodicity to extract the atrial activity in atrial arrhythmias

    NASA Astrophysics Data System (ADS)

    Llinares, Raul; Igual, Jorge

    2011-12-01

    Atrial fibrillation disorders are among the main arrhythmias of the elderly. The atrial and ventricular activities are decoupled during an atrial fibrillation episode, and very rapid and irregular waves replace the usual atrial P-wave of a normal sinus rhythm electrocardiogram (ECG). The estimation of these wavelets is essential for clinical analysis. We propose a new approach to this problem focused on the quasiperiodicity of these wavelets. Atrial activity is characterized by a main atrial rhythm in the interval 3-12 Hz. This enables us to pose the problem either as the separation of the original sources from the instantaneous linear combination of them recorded in the ECG, or as the extraction of only the atrial component by exploiting the quasiperiodic character of the atrial signal. The methodology requires a prior estimate of this main atrial period. We present two algorithms that separate and extract the atrial rhythm starting from a prior estimation of the main atrial frequency. The first is an algebraic method based on the maximization of a cost function that measures periodicity. The second is an adaptive algorithm that exploits the decorrelation of the atrial and other signals by diagonalizing the correlation matrices at multiple lags of the period of atrial activity. The algorithms were applied successfully to synthetic and real data. In simulated ECGs, the average correlation indices obtained were 0.811 and 0.847, respectively. In real ECGs, the accuracy of the results was validated using spectral and temporal parameters: the average peak frequencies and spectral concentrations obtained were 5.550 and 5.554 Hz and 56.3 and 54.4%, respectively, and the kurtosis values were 0.266 and 0.695. For validation purposes, we compared the proposed algorithms with established methods, obtaining better results for both simulated and real registers.
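    A minimal sketch of the correlation-matrix idea on synthetic data (not the authors' implementation): an AMUSE-style joint diagonalization of the zero-lag covariance and the covariance at the assumed atrial period recovers the periodic source from a random mixture:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, T = 200, 40          # sampling rate (Hz) and assumed atrial period (samples)
n = 4000
t = np.arange(n) / fs
# Synthetic mixture: one T-periodic "atrial" source plus two noise sources.
atrial = np.sign(np.sin(2 * np.pi * (fs / T) * t))
sources = np.vstack([atrial, rng.standard_normal((2, n))])
A = rng.standard_normal((3, 3))          # unknown mixing matrix
X = A @ sources
X -= X.mean(axis=1, keepdims=True)

# Jointly diagonalize the zero-lag covariance and the lag-T covariance: the
# generalized eigenvector with the largest eigenvalue (i.e. the largest
# lag-T autocorrelation) extracts the most T-periodic component.
C0 = X @ X.T / n
CT = X[:, :-T] @ X[:, T:].T / (n - T)
CT = (CT + CT.T) / 2                     # symmetrize finite-sample estimate
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(C0, CT))
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
estimate = w @ X

corr = abs(np.corrcoef(estimate, atrial)[0, 1])
print(f"correlation with the true atrial source: {corr:.3f}")
```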

  13. Applying axiomatic design to a medication distribution system

    NASA Astrophysics Data System (ADS)

    Raguini, Pepito B.

    As the need to minimize medication errors drives many medical facilities to seek robust solutions to the most common errors affecting patient safety, these hospitals would be wise to put a concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method that is an ideal fit, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet customer needs in product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs. Thus, AD offers a strategy for the effective integration of people, design methods, design tools and design data. We therefore apply the AD methodology to medical applications with the main objective of allowing nurses the opportunity to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology will be implemented through the use of focused stores, where medications can be readily stored and conveniently located near patients, as well as through a mobile apparatus commonly used by hospitals that can also store medications: the medication cart. Moreover, a robust methodology called the focused store methodology will be introduced and developed for both uncapacitated and capacitated case studies, setting up an appropriate AD framework and design problem for a medication distribution case study.

  14. Financial Support for the Humanities: A Special Methodological Report.

    ERIC Educational Resources Information Center

    Gomberg, Irene L.; Atelsek, Frank J.

    Findings and methodological problems of a survey on financial support for humanities in higher education are discussed. Usable data were gathered from 351 of 671 Higher Education Panel member institutions. Two weighting methodologies were employed. The conventional method assumed that nonrespondents were similar to respondents, whereas a…

  15. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered a substitute for FORTRAN to lower life-cycle costs and ease program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between the object states and real-world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  16. Technology transfer methodology

    NASA Technical Reports Server (NTRS)

    Labotz, Rich

    1991-01-01

    Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.

  17. Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology

    NASA Astrophysics Data System (ADS)

    Morgan, T. W.; Thurgood, R. L.

    1984-05-01

    This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.

  18. A modular inverse elastostatics approach to resolve the pressure-induced stress state for in vivo imaging based cardiovascular modeling.

    PubMed

    Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno

    2018-05-28

    Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load, given that the imaged tissue is in a pre-stressed and pre-strained state. Neglecting this prestressed state in solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress), which in turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, each having inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single-shot method to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations, and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, hinder broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and a porcine biventricular myocardium model. The computational power and efficiency of the methodology were shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
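    For contrast with the paper's direct approach, the iterative family of inverse methods can be illustrated on a one-dimensional toy problem (the constitutive law and all numbers here are invented): repeatedly subtract the computed load-induced displacement from the imaged geometry until a consistent unloaded configuration is found.

```python
# Forward model: hypothetical nonlinear spring, loaded length = X0 + p/(k*X0).
def loaded_length(X0, p=1.0, k=5.0):
    return X0 + p / (k * X0)

x_imaged = 1.25          # geometry measured in vivo, i.e. under load
X = x_imaged             # initial guess: unloaded equals loaded
for _ in range(50):
    # Subtract the load-induced displacement computed from the current guess.
    X = x_imaged - (loaded_length(X) - X)
print(f"unloaded length {X:.4f}, reloaded length {loaded_length(X):.4f}")
```

    Re-applying the load to the recovered unloaded length reproduces the imaged geometry; a real 3D version iterates a full FEA solve instead of this scalar forward model.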

  19. Short-term forecasting of turbidity in trunk main networks.

    PubMed

    Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward

    2017-11-01

    Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water quality network model, which is expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect whether discolouration material is mobilised, estimate whether sufficient turbidity will be generated to exceed a preselected threshold, and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system enabling a multitude of cost-beneficial proactive management strategies to be implemented as an alternative to expensive trunk mains cleaning programs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. SUDOQU: a new dose model to derive criteria for surface contamination of non-food (consumer) goods, containers and conveyances.

    PubMed

    van Dillen, Teun

    2015-04-01

    The Fukushima nuclear accident (Japan, 11 March 2011) revealed the need for well-founded criteria for surface contamination and associated screening levels related to the import of non-food (consumer) goods, containers and conveyances. The only available European-harmonised criteria are those laid down in the IAEA transport regulations, but these criteria date back to the early 1960s and apply only to the safe transport of radioactive materials. The main problem is that a generic dose-assessment model for consumer products is missing. Therefore, RIVM (National Institute for Public Health and the Environment) developed a new methodology entitled SUDOQU (SUrface DOse QUantification) to calculate the annual effective dose for both consumers and non-radiological workers, addressing issues of removability of surface contamination. The methodology can be used to derive criteria and screening levels for surface contamination and could serve as a useful tool for policy-makers and radiation-protection specialists. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. NASA FDL: Accelerating Artificial Intelligence Applications in the Space Sciences.

    NASA Astrophysics Data System (ADS)

    Parr, J.; Navas-Moreno, M.; Dahlstrom, E. L.; Jennings, S. B.

    2017-12-01

    NASA has a long history of using Artificial Intelligence (AI) for exploration purposes; however, due to the recent explosion of the Machine Learning (ML) field within AI, there are great opportunities for NASA to find expanded benefit. For over two years now, the NASA Frontier Development Lab (FDL) has been at the nexus of bright academic researchers, private-sector expertise in AI/ML and NASA scientific problem solving. The FDL hypothesis of improving science results was predicated on three main ideas: faster results could be achieved through sprint methodologies; better results could be achieved through interdisciplinarity; and public-private partnerships could lower costs. We present select results obtained during two summer sessions in 2016 and 2017, where the research focused on topics in planetary defense, space resources and space weather, and utilized variational autoencoders, Bayesian optimization, and deep learning techniques such as deep, recurrent and residual neural networks. The FDL results demonstrate the power of bridging research disciplines and the potential that AI/ML has for supporting research goals, improving on current methodologies, enabling new discovery, and doing so in accelerated timeframes.

  2. A method for estimating Dekkera/Brettanomyces populations in wines.

    PubMed

    Benito, S; Palomero, F; Morata, A; Calderón, F; Suárez-Lepe, J A

    2009-05-01

    The formation of ethylphenols in wines, a consequence of Dekkera/Brettanomyces metabolism, can affect their quality. The main aims of this work were to further our knowledge of Dekkera/Brettanomyces with respect to ethylphenol production, and to develop a methodology for detecting this spoilage yeast and for estimating its population size in wines using differential-selective media and high performance liquid chromatography (HPLC). This work examines the reduction of p-coumaric acid and the formation of 4-vinylphenol and 4-ethylphenol (recorded by HPLC-DAD) in a prepared medium because of the activities of different yeast species and populations. A regression model was constructed for estimating the population of Dekkera/Brettanomyces at the beginning of fermentation via the conversion of hydroxycinnamic acids into ethylphenols. The proposed methodology allows the populations of Dekkera/Brettanomyces at the beginning of fermentation to be estimated in problem wines. Moreover, it avoids false positives because of yeasts resistant to the effects of the selective elements of the medium. This may help prevent the appearance of organoleptic anomalies in wines at the winery level.
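    The calibration idea, estimating the initial population from the ethylphenols formed, can be sketched with a simple log-log regression; all measurement values below are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical calibration data (not the paper's measurements):
# log10 initial Dekkera/Brettanomyces population (CFU/mL) vs. the
# 4-ethylphenol formed (mg/L), as would be recorded by HPLC-DAD.
log_pop = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
ethylphenol = np.array([0.05, 0.18, 0.60, 1.90, 6.10])

# Fit log10(4-EP) as a linear function of log10(population).
slope, intercept = np.polyfit(log_pop, np.log10(ethylphenol), 1)

def estimate_population(ep_mg_l):
    """Invert the calibration: measured 4-EP -> estimated log10 CFU/mL."""
    return (np.log10(ep_mg_l) - intercept) / slope

print(f"estimated log10 population: {estimate_population(0.60):.2f}")
```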

  3. Anthropology and Epidemiology: learning epistemological lessons through a collaborative venture

    PubMed Central

    Béhague, Dominique Pareja; Gonçalves, Helen; Victora, Cesar Gomes

    2009-01-01

    Collaboration between anthropology and epidemiology has a long and tumultuous history. Based on empirical examples, this paper describes a number of epistemological lessons we have learned through our experience of cross-disciplinary collaboration. Although critical of both mainstream epidemiology and medical anthropology, our analysis focuses on the implications of addressing each discipline’s main epistemological differences, while addressing the goal of adopting a broader social approach to health improvement. We believe it is important to push the boundaries of research collaborations from the more standard forms of “multidisciplinarity,” to the adoption of theoretically imbued “interdisciplinarity.” The more we challenge epistemological limitations and modify ways of knowing, the more we will be able to provide in-depth explanations for the emergence of disease-patterns and thus, to problem-solve. In our experience, both institutional support and the adoption of a relativistic attitude are necessary conditions for sustained theoretical interdisciplinarity. Until researchers acknowledge that methodology is merely a human-designed tool to interpret reality, unnecessary methodological hyper-specialization will continue to alienate one field of knowledge from the other. PMID:18833344

  4. Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Michalik, Kazimierz

    2016-10-01

    Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demands. Among other solutions, the parallelization of multiscale computations is promising. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models employing the MatCalc thermodynamic simulator. The main issues investigated in this work are: (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in computation quality enforced by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of `delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
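    The speed-up evaluation mentioned above rests on Amdahl's law, which bounds the gain when only part of the runtime parallelizes; a minimal sketch with illustrative numbers:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Amdahl's law: overall speed-up when only a fraction of the runtime
    (here, the fine-scale sub-model computations) runs on n_workers."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_workers)

# If 90% of runtime is concurrently computed fine-scale models (illustrative),
# 16 workers give at most a 6.4x overall speed-up.
print(amdahl_speedup(0.9, 16))   # -> 6.4
```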

  5. Debate on vaccines and autoimmunity: Do not attack the author, yet discuss it methodologically.

    PubMed

    Bragazzi, Nicola Luigi; Watad, Abdulla; Amital, Howard; Shoenfeld, Yehuda

    2017-10-09

    Since Jenner, vaccines and vaccinations have stirred a hot, highly polarized debate, leading to contrasting positions and feelings, ranging from acritical enthusiasm to blind denial. On the one hand, we find anti-vaccination movements which divulge and disseminate misleading information, myths, prejudices, and even frauds, with the main aim of denying that vaccination practices represent a major public health measure, being effective in controlling infectious diseases and safeguarding the wellbeing of entire communities. Recently, the authors of many vaccine safety investigations are being personally criticized rather than the actual science being methodologically assessed and critiqued. Unfortunately, this could result in making vaccine safety science a "hazardous occupation". Critiques should focus on the science and not on the authors and on the scientists that publish reasonably high-quality science suggesting a problem with a given vaccine. These scientists require adequate professional protection so there are not disincentives to publish and to carry out researches in the field. The issues for vaccine safety are not dissimilar to other areas such as medical errors and drug safety. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. A Joint Time-Frequency and Matrix Decomposition Feature Extraction Methodology for Pathological Voice Classification

    NASA Astrophysics Data System (ADS)

    Ghoraani, Behnaz; Krishnan, Sridhar

    2009-12-01

    The number of people affected by speech problems is increasing as the modern world places increasing demands on the human voice via mobile telephones, voice recognition software, and interpersonal verbal communications. In this paper, we propose a novel methodology for automatic pattern classification of pathological voices. The main contribution of this paper is the extraction of meaningful and unique features using an adaptive time-frequency distribution (TFD) and nonnegative matrix factorization (NMF). We construct the adaptive TFD as an effective signal analysis domain to dynamically track the nonstationarity in the speech, and utilize NMF as a matrix decomposition (MD) technique to quantify the constructed TFD. The proposed method extracts meaningful and unique features from the joint TFD of the speech, and automatically identifies and measures the abnormality of the signal. Depending on the abnormality measure of each signal, we classify the signal as normal or pathological. The proposed method is applied to the Massachusetts Eye and Ear Infirmary (MEEI) voice disorders database, which consists of 161 pathological and 51 normal speakers, and an overall classification accuracy of 98.6% was achieved.
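    The NMF quantification step can be sketched on a stand-in matrix; here basic Lee-Seung multiplicative updates factor a hypothetical magnitude TFD into spectral bases and temporal activations (a generic sketch, not the authors' adaptive-TFD pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-in for the adaptive TFD of a voice signal: a nonnegative
# frequency x time magnitude matrix (64 bins x 100 frames, random here).
V = np.abs(rng.standard_normal((64, 100))) + 1e-9

def nmf(V, rank=2, iters=200):
    """Basic NMF via Lee-Seung multiplicative updates (Frobenius cost)."""
    F, T = V.shape
    W = rng.random((F, rank)) + 1e-9   # spectral base vectors
    H = rng.random((rank, T)) + 1e-9   # temporal activations (the features)
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

W, H = nmf(V)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"rank-2 relative reconstruction error: {err:.3f}")
```

    The rows of H (and summary statistics of W) would then serve as the features fed to the abnormality measure.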

  7. Using genetically modified tomato crop plants with purple leaves for absolute weed/crop classification.

    PubMed

    Lati, Ran N; Filin, Sagi; Aly, Radi; Lande, Tal; Levin, Ilan; Eizenberg, Hanan

    2014-07-01

    Weed/crop classification is considered the main problem in developing precise weed-management methodologies, because both crops and weeds share similar hues. Great effort has been invested in the development of classification models, most based on expensive sensors and complicated algorithms. However, satisfactory results are not consistently obtained due to imaging conditions in the field. We report on an innovative approach that combines advances in genetic engineering and robust image-processing methods to detect weeds and distinguish them from crop plants by manipulating the crop's leaf color. We demonstrate this on genetically modified tomato (germplasm AN-113) which expresses a purple leaf color. An autonomous weed/crop classification is performed using an invariant-hue transformation that is applied to images acquired by a standard consumer camera (visible wavelength) and handles variations in illumination intensities. The integration of these methodologies is simple and effective, and classification results were accurate and stable under a wide range of imaging conditions. Using this approach, we simplify the most complicated stage in image-based weed/crop classification models. © 2013 Society of Chemical Industry.
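    The hue-based separation can be sketched as follows; the RGB values and the purple hue band are invented for illustration, and a plain HSV conversion stands in for the paper's invariant-hue transformation:

```python
import colorsys

# Hypothetical RGB pixels (0-1): green weed foliage vs. purple tomato leaves,
# each at two illumination intensities. Hue is largely invariant to intensity.
pixels = {
    "weed_bright":   (0.20, 0.60, 0.15),
    "weed_shadow":   (0.10, 0.30, 0.08),
    "tomato_bright": (0.45, 0.15, 0.55),
    "tomato_shadow": (0.22, 0.07, 0.28),
}

def classify(rgb, purple_band=(0.70, 0.90)):
    """Label a pixel as crop if its hue falls in the (assumed) purple band."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return "crop" if purple_band[0] <= h <= purple_band[1] else "weed"

for name, rgb in pixels.items():
    print(name, classify(rgb))
```

    Note that the bright and shadowed variants of each plant receive the same label, which is the point of working in a hue (intensity-invariant) space.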

  8. On the validity of language: speaking, knowing and understanding in medical geography.

    PubMed

    Scarpaci, J L

    1993-09-01

    This essay examines methodological problems concerning the conceptualization and operationalization of phenomena central to medical geography. Its main argument is that qualitative research can be strengthened if the differences between instrumental and apparent validity are better understood than current research in medical geography suggests. Its premise is that our definitions of key terms and concepts must be reinforced throughout the research design if our knowledge and understanding are to be enhanced. In doing so, the paper aims to move the methodological debate beyond the simple dichotomies of quantitative vs qualitative approaches and logical positivism vs phenomenology. Instead, the argument is couched in a postmodernist, hermeneutic sense which questions the validity of one discourse of investigation over another. The paper begins by discussing methods used in conceptualizing and operationalizing variables in quantitative and qualitative research design. Examples derive from concepts central to a geography of health-care behavior and well-being. The latter half of the essay shows the uses and misuses of validity studies in selected health services research and in the current debate on national health insurance.

  9. Expert systems for superalloy studies

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kaukler, William F.

    1990-01-01

    There are many areas in science and engineering which require knowledge of an extremely complex foundation of experimental results in order to design methodologies for developing new materials or products. Superalloys fit well into this discussion in the sense that they are complex combinations of elements which exhibit certain characteristics. Obviously the use of superalloys in high-performance, high-temperature systems such as the Space Shuttle Main Engine is of interest to NASA. The superalloy manufacturing process is complex, and the implementation of an expert system within the design process requires some thought as to how and where it should be implemented. A major motivation is to develop a methodology to assist metallurgists in the design of superalloy materials using current expert systems technology. Hydrogen embrittlement is disastrous to rocket engines, and the heuristics can be very complex. Attacking this problem as one module in the overall design process represents a significant step forward. To describe the objectives of the first-phase implementation, the expert system was designated the Hydrogen Environment Embrittlement Expert System (HEEES).

  10. The effect of different standard illumination conditions on color balance failure in offset printed images on glossy coated paper expressed by color difference

    NASA Astrophysics Data System (ADS)

    Spiridonov, I.; Shopova, M.; Boeva, R.; Nikolov, M.

    2012-05-01

    One of the biggest problems in color reproduction processes is the color shift occurring when images are viewed under different illuminants. Process ink colors and their combinations that match under one light source will often appear different under another light source. This problem is referred to as color balance failure or color inconstancy. The main goals of the present study are to investigate and determine the color balance failure (color inconstancy) of offset-printed images, expressed by color difference and color gamut changes, for three of the illuminants most commonly used in practice: CIE D50, CIE F2 and CIE A. The results obtained are important from both a scientific and a practical point of view. For the first time, a methodology is suggested and implemented for the examination and estimation of color shifts by studying a large number of color and gamut changes in various ink combinations under different illuminants.
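    Color shifts of this kind are typically quantified as a CIELAB color difference; a minimal sketch using the CIE76 ΔE*ab formula on hypothetical measurements of one printed patch under two illuminants:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two CIELAB colours."""
    return math.dist(lab1, lab2)

# Hypothetical (L*, a*, b*) measurements of the same patch under CIE D50 and
# CIE A; the shift between them expresses the patch's colour inconstancy.
patch_d50 = (52.0, 38.5, -21.0)
patch_a   = (51.2, 44.1, -12.3)
print(round(delta_e_ab(patch_d50, patch_a), 2))
```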

  11. Phytofilter - an environmentally friendly solution for purification of surface runoff from urbanized territories

    NASA Astrophysics Data System (ADS)

    Ruchkinova, O.; Shchuckin, I.

    2017-06-01

    It is shown that phytofilters are an environmentally friendly solution to the problem of purifying surface runoff from urbanized territories, and that they meet present-day requirements for land-drainage purification systems. Their main limitation is use under cold-temperature conditions. A technology and installation were developed that provide year-round purification and storage of surface runoff. The optimal composition of the filtering load was established experimentally: peat, zeolite and sand in defined volume percentages, providing the required hydraulic characteristics. The sorption and ion-selective capacity of the complex filtering load of ordered composition was determined under dynamic conditions, and the dependence of the outlet concentrations of oil products and heavy metals on temperature during filtration through this load was estimated. The purification effectiveness of the phytofiltration installation was determined, and the influence of embryophytes on the regeneration and capacity of the filtering load was established; swamp iris, mace reed and reed grass are recommended. A phytofilter calculation methodology was developed, and the economic effect of the phytofiltration technology was calculated in comparison with traditional block-modular installations.

  12. Varieties of second modernity: the cosmopolitan turn in social and political theory and research.

    PubMed

    Beck, Ulrich; Grande, Edgar

    2010-09-01

    The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally, we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization.

  13. A Bayesian approach to truncated data sets: An application to Malmquist bias in Supernova Cosmology

    NASA Astrophysics Data System (ADS)

    March, Marisa Cristina

    2018-01-01

    A problem commonly encountered in statistical analysis of data is that of truncated data sets. A truncated data set is one in which a number of data points are completely missing from a sample, in contrast to a censored sample, in which partial information is missing from some data points. In astrophysics this problem is commonly seen in a magnitude-limited survey, which is incomplete at fainter magnitudes: certain faint objects are simply not observed. The effect of this `missing data' is manifested as Malmquist bias and can result in biased parameter inference if it is not accounted for. In frequentist methodologies the Malmquist bias is often corrected by analysing many simulations and computing the appropriate correction factors; one problem with this approach is that the corrections are model-dependent. In this poster we derive a Bayesian methodology for accounting for truncated data sets in problems of parameter inference and model selection. We first show the methodology for a simple Gaussian linear model, and then go on to show the method for accounting for a truncated data set in the case of cosmological parameter inference with a magnitude-limited type Ia supernova survey.
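    The core of the truncated-data treatment is renormalizing each point's likelihood by its detection probability; a sketch on synthetic magnitudes (a flat prior on the mean is assumed, so the grid optimum below is the posterior mode, and all numbers are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu_true, sigma, m_lim = 20.0, 0.5, 20.3   # hypothetical magnitudes and limit
full = rng.normal(mu_true, sigma, 20000)
observed = full[full < m_lim]             # the survey misses fainter objects

naive = observed.mean()                   # biased bright by the truncation

def neg_log_post(mu):
    # Truncated-Gaussian likelihood: renormalize each point's density by
    # P(detected | mu) = Phi((m_lim - mu) / sigma); flat prior on mu.
    ll = stats.norm.logpdf(observed, mu, sigma) - stats.norm.logcdf(m_lim, mu, sigma)
    return -ll.sum()

grid = np.linspace(19.5, 20.5, 1001)
mu_hat = grid[np.argmin([neg_log_post(m) for m in grid])]
print(f"naive mean {naive:.3f}, truncation-corrected estimate {mu_hat:.3f}")
```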

  14. Selection of Representative Models for Decision Analysis Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. First, a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
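As a loose illustration of the representativeness idea (not the authors' optimizer, which also matches cross-plots and attribute-level probabilities), one can pick the scenarios whose outputs sit at chosen quantiles of the full set, so that a small subset preserves the shape of the risk curve without optimistic or pessimistic bias. All names and numbers here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
npv = rng.lognormal(mean=3.0, sigma=0.5, size=1000)  # synthetic scenario outputs

def pick_representatives(values, probs=(0.1, 0.5, 0.9)):
    """Indices of the scenarios closest to the requested quantiles,
    a crude stand-in for matching the full risk curve."""
    targets = np.quantile(values, probs)
    return [int(np.argmin(np.abs(values - t))) for t in targets]

idx = pick_representatives(npv)
subset = npv[idx]              # P10 / P50 / P90 representative scenarios
```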

  15. Globalisation, transnational policies and adult education

    NASA Astrophysics Data System (ADS)

    Milana, Marcella

    2012-12-01

    Globalisation, transnational policies and adult education - This paper examines policy documents produced by the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the European Union (EU) in the field of adult education and learning. Both these entities address adult education as an explicit object of policy. This paper investigates how globalisation processes are constructed as policy problems when these transnational political agents propose adult education as a response. The author's main argument is that while UNESCO presents the provision of adult education as a means for governments worldwide to overcome disadvantages experienced by their own citizenry, the EU institutionalises learning experiences as a means for governments to sustain regional economic growth and political expansion. After reviewing the literature on globalisation to elucidate the theories that inform current understanding of contemporary economic, political, cultural and ecological changes as political problems, she presents the conceptual and methodological framework of her analysis. The author then examines the active role played by UNESCO and the EU in promoting adult education as a policy objective at transnational level, and unpacks the specific problem "representations" that are substantiated by these organisations. She argues that UNESCO and EU processes assign specific values and meanings to globalisation, and that these reflect a limited understanding of the complexity of globalisation. Finally, she considers two of the effects produced by these problem representations.

  16. Applications of fuzzy theories to multi-objective system optimization

    NASA Technical Reports Server (NTRS)

    Rao, S. S.; Dhingra, A. K.

    1991-01-01

    Most of the computer aided design techniques developed so far deal with the optimization of a single objective function over the feasible design space. However, there often exist several engineering design problems which require a simultaneous consideration of several objective functions. This work presents several techniques of multiobjective optimization. In addition, a new formulation, based on fuzzy theories, is also introduced for the solution of multiobjective system optimization problems. The fuzzy formulation is useful in dealing with systems which are described imprecisely using fuzzy terms such as 'sufficiently large', 'very strong', or 'satisfactory'. The proposed theory translates the imprecise linguistic statements and multiple objectives into equivalent crisp mathematical statements using fuzzy logic. The effectiveness of all the methodologies and theories presented is illustrated by formulating and solving two different engineering design problems. The first involves flight trajectory optimization and main rotor design for helicopters. The second concerns the integrated kinematic-dynamic synthesis of planar mechanisms. The use and effectiveness of nonlinear membership functions in the fuzzy formulation are also demonstrated. The numerical results indicate that the fuzzy formulation could yield results which are qualitatively different from those provided by the crisp formulation. It is felt that the fuzzy formulation will handle real-life design problems on a more rational basis.
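The translation of fuzzy goals into a crisp problem can be sketched with the standard max-min (Bellman-Zadeh) formulation, a toy stand-in for the paper's helicopter and mechanism examples: linear membership functions grade how well each objective is "sufficiently small", and the design that maximizes the smallest membership is selected. The objectives and membership bounds below are hypothetical.

```python
import numpy as np

# Two competing objectives of one design variable x in [0, 1] (toy example):
f1 = lambda x: (x - 0.2) ** 2      # should be "sufficiently small"
f2 = lambda x: (x - 0.8) ** 2      # should also be "sufficiently small"

def membership(f, f_best, f_worst):
    # Linear membership: 1 at the aspiration level, 0 at the worst acceptable value.
    return lambda x: np.clip((f_worst - f(x)) / (f_worst - f_best), 0.0, 1.0)

mu1 = membership(f1, 0.0, 0.36)
mu2 = membership(f2, 0.0, 0.36)

# Max-min formulation: maximize the smallest membership over a dense grid.
xs = np.linspace(0.0, 1.0, 2001)
overall = np.minimum(mu1(xs), mu2(xs))
x_star = xs[np.argmax(overall)]    # balanced compromise design
```

By symmetry, the compromise design lands midway between the two individual optima, with both memberships equal.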

  17. A Physics-Based Engineering Methodology for Calculating Soft Error Rates of Bulk CMOS and SiGe Heterojunction Bipolar Transistor Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Fulkerson, David E.

    2010-02-01

    This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new methodology handles the problem with simple SPICE simulations. It accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross-section vs. frequency behavior and other subtle effects are also accurately predicted.

  18. Schedule Risks Due to Delays in Advanced Technology Development

    NASA Technical Reports Server (NTRS)

    Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan

    2008-01-01

    This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.

  19. Epidemiology of childhood conduct problems in Brazil: systematic review and meta-analysis.

    PubMed

    Murray, Joseph; Anselmi, Luciana; Gallo, Erika Alejandra Giraldo; Fleitlich-Bilyk, Bacy; Bordin, Isabel A

    2013-10-01

    This study aimed to review evidence on the prevalence of and risk factors for conduct problems in Brazil. We searched electronic databases and contacted Brazilian researchers up to 05/2012. Studies were included in the review if they reported the prevalence of or risk factors for conduct problems, conduct disorder, or oppositional defiant disorder for 100+ Brazilian children aged ≤18 years, systematically sampled in schools or the community. Prevalence rates and sex differences were meta-analysed. Risk factor studies were reviewed one by one. The average prevalence of conduct problems in screening questionnaires was 20.8%, and the average prevalence of conduct disorder/oppositional defiant disorder was 4.1%. There was systematic variation in the results of screening studies according to methodology: recruitment location, informants, instruments, impairment criterion for case definition, and response rates. Risk factors previously identified in high-income countries were mainly replicated in Brazil, including comorbid mental health problems, educational failure, low religiosity, harsh physical punishment and abuse, parental mental health problems, single parent family, and low socioeconomic status. However, boys did not always have higher risk for conduct problems than girls. Studies using screening questionnaires suggest that Brazilian children have higher rates of conduct problems than children in other countries, but diagnostic studies do not show this difference. Risk factors in Brazil were similar to those in high-income countries, apart from child sex. Future research should investigate developmental patterns of antisocial behaviour, employ a variety of research designs to identify causal risk mechanisms, and examine a broader range of risk factors.

  20. Development of Semantic Description for Multiscale Models of Thermo-Mechanical Treatment of Metal Alloys

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Regulski, Krzysztof

    2016-08-01

    We present a process of semantic meta-model development for data management in an adaptable multiscale modeling framework. The main problems in ontology design are discussed, and a solution achieved as a result of the research is presented. The main concepts concerning the application and data management background for multiscale modeling were derived from the AM3 approach—object-oriented Agile multiscale modeling methodology. The ontological description of multiscale models enables validation of semantic correctness of data interchange between submodels. We also present a possibility of using the ontological model as a supervisor in conjunction with a multiscale model controller and a knowledge base system. Multiscale modeling formal ontology (MMFO), designed for describing multiscale models' data and structures, is presented. A need for applying meta-ontology in the MMFO development process is discussed. Examples of MMFO application in describing thermo-mechanical treatment of metal alloys are discussed. Present and future applications of MMFO are described.

  1. Fault-tolerant optimised tracking control for unknown discrete-time linear systems using a combined reinforcement learning and residual compensation methodology

    NASA Astrophysics Data System (ADS)

    Han, Ke-Zhen; Feng, Jian; Cui, Xiaohong

    2017-10-01

    This paper considers the fault-tolerant optimised tracking control (FTOTC) problem for unknown discrete-time linear systems. A research scheme is proposed on the basis of data-based parity space identification, reinforcement learning and residual compensation techniques. The main characteristic of this scheme lies in the parity-space-identification-based simultaneous tracking control and residual compensation. The technical approach consists of four main components: a subspace-aided method to design an observer-based residual generator; a reinforcement Q-learning approach to solve the optimised tracking control policy; robust H∞ theory to achieve noise attenuation; and fault estimation triggered by the residual generator to perform fault compensation. To clarify the design and implementation procedures, an integrated algorithm is further constructed to link these four functional units. Detailed analysis and proofs are subsequently given to establish the guaranteed FTOTC performance of the proposed scheme. Finally, a case simulation is provided to verify its effectiveness.
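The full scheme (parity-space residuals, H∞ attenuation, fault compensation) is beyond a snippet, but the reinforcement Q-learning ingredient can be sketched for a scalar linear-quadratic regulator: least-squares policy iteration fits a quadratic Q-function from (state, input, cost, next-state) data and recovers the optimal gain without using the model in the learning step. The plant and weights below are hypothetical, and a simulator merely plays the role of the unknown system.

```python
import numpy as np

# Scalar plant x+ = a*x + b*u with stage cost q*x^2 + r*u^2 (toy stand-in).
a, b, q, r = 0.9, 0.5, 1.0, 0.1

# Model-based reference: fixed point of the discrete Riccati recursion.
p = 1.0
for _ in range(500):
    p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
k_star = a * b * p / (r + b * b * p)

# Model-free side: fit Q(x, u) = th . [x^2, 2xu, u^2] by least squares
# on the Bellman identity Q(x, u) - Q(x', -k*x') = cost, then improve greedily.
rng = np.random.default_rng(0)
k = 0.0                                    # initial gain; |a| < 1 so it stabilizes
for _ in range(12):
    rows, cost = [], []
    for _ in range(60):
        x, u = rng.uniform(-1.0, 1.0, 2)   # exploratory state-input sample
        xn = a * x + b * u                 # simulator stands in for the plant
        un = -k * xn                       # current policy is followed afterwards
        rows.append([x * x - xn * xn, 2 * (x * u - xn * un), u * u - un * un])
        cost.append(q * x * x + r * u * u)
    th, *_ = np.linalg.lstsq(np.array(rows), np.array(cost), rcond=None)
    k = th[1] / th[2]                      # greedy gain: u = -(Huu^-1 Hux) x
```

The learned gain matches the Riccati solution, which is the essential property Q-learning-based optimal control relies on.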

  2. Mobilization strategy to overcome global crisis of water consumption

    NASA Astrophysics Data System (ADS)

    Suzdaleva, Antonina; Goryunova, Svetlana; Marchuk, Aleksey; Borovkov, Valery

    2017-10-01

    Today, the global water consumption crisis is one of the main threats that can disrupt the socio-economic and environmental conditions of life of the majority of the world’s population. The water consumption mobilization strategy is based on the idea of increasing the available water resources. The main direction for the implementation of this strategy is the construction of anti-rivers: systems for inter-basin (interregional) redistribution of water resources. Anti-rivers are intended for the controlled redistribution of water resources from regions with a catastrophic excess to regions with a critical shortage. The creation of anti-rivers, taking into account the requirements of environmental safety, will form large-scale managed natural-engineering systems and implement the principle of sustainable development adopted by the United Nations. The aim of the article is to substantiate a new methodological approach to the problem, the implementation of which can prevent the large-scale humanitarian and environmental disasters expected in the coming years.

  3. Methodological Problems of Soviet Pedagogy

    ERIC Educational Resources Information Center

    Noah, Harold J., Ed.; Beach, Beatrice S., Ed.

    1974-01-01

    Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)

  4. Inverse problems in quantum chemistry

    NASA Astrophysics Data System (ADS)

    Karwowski, Jacek

    Inverse problems constitute a branch of applied mathematics with well-developed methodology and formalism. A broad family of tasks met in theoretical physics, in civil and mechanical engineering, as well as in various branches of medical and biological sciences has been formulated as specific implementations of the general theory of inverse problems. In this article, it is pointed out that a number of approaches met in quantum chemistry can (and should) be classified as inverse problems. Consequently, the methodology used in these approaches may be enriched by applying ideas and theorems developed within the general field of inverse problems. Several examples, including the RKR method for the construction of potential energy curves, determining parameter values in semiempirical methods, and finding external potentials for which the pertinent Schrödinger equation is exactly solvable, are discussed in detail.

  5. Economic evaluation of health promotion for older people-methodological problems and challenges.

    PubMed

    Huter, Kai; Kocot, Ewa; Kissimova-Skarbek, Katarzyna; Dubas-Jakóbczyk, Katarzyna; Rothgang, Heinz

    2016-09-05

    The support of health promotion activities for older people gains societal relevance in terms of enhancing the health and well-being of older people with a view to the efficient use of financial resources in the healthcare sector. Health economic evaluations have become an important instrument to support decision-making processes in many countries. Sound evidence on the cost-effectiveness of health promotion activities would encourage support for the implementation of health promotion activities for older people. This debate article discusses to what extent economic evaluation techniques are appropriate to support decision makers in the allocation of resources regarding health promotion activities for older people. We address the problem that the economic evaluation of these interventions is hampered by methodological obstacles that limit comparability, e.g. with economic evaluations of curative measures. Our central objective is to describe and discuss the specific problems and challenges entailed in the economic evaluation of health promotion activities especially for older people with regard to their usefulness for informing decision making processes. Beyond general problems concerning the economic evaluation of health promotion, our discussion focusses on problems that pertain to the analysis of cost and outcomes of health promotion interventions for older people. With regard to costs, these are general problems of economic evaluations, namely the actual implementation of a societal perspective, the appropriate measurement and valuation of informal caregiver time, the measurement and valuation of productivity costs and costs incurred in added years of life. 
The main problems concerning the identification and measurement of outcomes are related to the identification of outcome parameters that, firstly, adequately reflect the broad effects of health promotion interventions, especially social benefits that gain importance for older people, and secondly, ensure a comparability of effects across different age groups. In particular, the limitations of the widely used QALY for older people are discussed and recently developed alternatives are presented. The key conclusion of the article is that a comparison of the effects of different health promotion initiatives between different age groups by means of economic evaluation is not recommendable. Taking into account the complex outcomes of health promotion interventions it has to be accepted that the outcomes of these interventions will often not be comparable with clinical interventions and have to be assessed differently.

  6. [Problem based learning: achievement of educational goals in the information and comprehension sub-categories of Bloom cognitive domain].

    PubMed

    Montecinos, P; Rodewald, A M

    1994-06-01

    The aim of this work was to assess and compare the achievements of medical students subjected to a problem-based learning methodology. The information and comprehension categories of Bloom were tested in 17 medical students on four different occasions during the physiopathology course, using a multiple choice knowledge test. There was a significant improvement in the number of correct answers towards the end of the course. It is concluded that these medical students obtained adequate learning achievements in the information subcategory of Bloom using the problem-based learning methodology during the physiopathology course.

  7. [About History of Scientific Clinical Schools in Russia: Certain Disputable Issues of Methodology of Studying Problem].

    PubMed

    Borodulin, V I; Gliantsev, S P

    2017-07-01

    The article considers certain key methodological aspects of the problem of scientific clinical schools in national medicine: the notion of a school, its profile, the issues of teachers, teachings and followers, subsidiary schools, and the ethical component of a scientific school. The article is polemical, hence it offers no definite answers to the questions raised; the reader is invited to ponder the answers independently, weighing the examples adduced pro and contra. The conclusion is drawn that scientific schools in other areas of medicine need to be studied and the problem elaborated further.

  8. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  9. Innovative Mixed-Methods Research: Moving beyond Design Technicalities to Epistemological and Methodological Realizations

    ERIC Educational Resources Information Center

    Riazi, A. Mehdi

    2016-01-01

    Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…

  10. Active Methodologies in a Queueing Systems Course for Telecommunication Engineering Studies

    ERIC Educational Resources Information Center

    Garcia, J.; Hernandez, A.

    2010-01-01

    This paper presents the results of a one-year experiment in incorporating active methodologies in a Queueing Systems course as part of the Telecommunication Engineering degree at the University of Zaragoza, Spain, during the period of adaptation to the European Higher Education Area. A problem-based learning methodology has been introduced, and…

  11. The impact of digital media on health: children's perspectives.

    PubMed

    Smahel, David; Wright, Michelle F; Cernikova, Martina

    2015-02-01

    Previous research has mainly focused on the effects of excessive digital media use or overuse on the health of children, primarily utilizing quantitative designs. More research should be conducted on general populations of children, rather than focusing exclusively on excessive technology users. This qualitative study describes technology's impact on physical and mental health from children's perspectives. Focus groups and interviews were conducted with children between the ages of 9 and 16 in 9 European countries (N = 368). During focus groups and interviews, researchers asked what children perceive as being potentially negative or problematic while using the internet and technology. In this study, children reported several physical and mental health problems without indicating internet addiction or overuse. Physical health symptoms included eye problems, headaches, not eating, and tiredness. For mental health symptoms, children reported cognitive salience of online events, aggression, and sleeping problems. Sometimes they reported these problems within 30 min of technology usage. This suggests that even shorter periods of use can cause self-reported health problems for some children. Qualitative methodology helps to understand children's perspectives concerning the impact of digital media on health. We recommend future studies focused on average technology users and low technology users to determine whether average levels of technology usage relate to health problems of children. Parents and teachers should also be informed about the possible physical and mental health issues associated with children's average usage of technology.

  12. IIR filtering based adaptive active vibration control methodology with online secondary path modeling using PZT actuators

    NASA Astrophysics Data System (ADS)

    Boz, Utku; Basdogan, Ipek

    2015-12-01

    Structural vibrations are a major cause of noise problems, discomfort and mechanical failures in aerospace, automotive and marine systems, which are mainly composed of plate-like structures. In order to reduce structural vibrations on these structures, active vibration control (AVC) is an effective approach. Adaptive filtering methodologies are preferred in AVC due to their ability to adjust themselves for varying dynamics of the structure during the operation. The filtered-X LMS (FXLMS) algorithm is a simple adaptive filtering algorithm widely implemented in active control applications. Proper implementation of FXLMS requires availability of a reference signal to mimic the disturbance and a model of the dynamics between the control actuator and the error sensor, namely the secondary path. However, the controller output could interfere with the reference signal, and the secondary path dynamics may change during the operation. The interference problem can be resolved by using an infinite impulse response (IIR) filter, which feeds back one or more previous control signals to the controller output, and the changing secondary path dynamics can be updated using an online modeling technique. In this paper, an IIR filtering based filtered-U LMS (FULMS) controller is combined with an online secondary path modeling algorithm to suppress the vibrations of a plate-like structure. The results are validated through numerical and experimental studies. The results show that the FULMS with online secondary path modeling approach rejects vibration more effectively, with a higher convergence rate, than its FXLMS counterpart.
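As a baseline to the FULMS approach described, a minimal single-channel filtered-X LMS sketch is shown below, with a known (offline-identified) secondary-path model; the paths, disturbance tone, and step size are hypothetical, and the paper's IIR filtering and online secondary-path modeling are omitted.

```python
import numpy as np

fs, n = 2000, 4000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 200 * t)      # tonal reference signal

p = np.array([0.0, 0.9, 0.3])        # primary path FIR (hypothetical)
s = np.array([0.0, 0.5])             # secondary path FIR (hypothetical)
s_hat = s.copy()                     # offline secondary-path model (assumed exact)

L, mu = 8, 0.05                      # controller length and step size
w = np.zeros(L)                      # adaptive FIR controller weights

xbuf = np.zeros(L)                   # reference history seen by the controller
dbuf = np.zeros(len(p))              # reference history through the primary path
ybuf = np.zeros(len(s))              # controller-output history
fxbuf = np.zeros(L)                  # filtered-reference history
xsbuf = np.zeros(len(s_hat))         # reference history for s_hat filtering
e_hist = np.zeros(n)

for k in range(n):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
    dbuf = np.roll(dbuf, 1); dbuf[0] = x[k]
    d = p @ dbuf                     # disturbance at the error sensor
    y = w @ xbuf                     # control (anti-vibration) output
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = d + s @ ybuf                 # residual measured by the error sensor
    xsbuf = np.roll(xsbuf, 1); xsbuf[0] = x[k]
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = s_hat @ xsbuf
    w -= mu * e * fxbuf              # filtered-X LMS weight update
    e_hist[k] = e
```

Filtering the reference through the secondary-path model before the update is what keeps the gradient estimate aligned despite the actuator-to-sensor dynamics; for the tonal disturbance the residual decays towards zero.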

  13. Coordinative Voltage Control Strategy with Multiple Resources for Distribution Systems of High PV Penetration: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Xiangqi; Zhang, Yingchen

    This paper presents an optimal voltage control methodology with coordination among different voltage-regulating resources, including controllable loads, distributed energy resources such as energy storage and photovoltaics (PV), and utility voltage-regulating devices such as voltage regulators and capacitors. The proposed methodology could effectively tackle the overvoltage and voltage-regulation device distortion problems brought by high penetrations of PV to improve grid operation reliability. A voltage-load sensitivity matrix and a voltage-regulator sensitivity matrix are used to deploy the resources along the feeder to achieve the control objectives. Mixed-integer nonlinear programming is used to solve the formulated optimization control problem. The methodology has been tested on the IEEE 123-feeder test system, and the results demonstrate that the proposed approach could actively tackle the voltage problem brought about by high penetrations of PV and improve the reliability of distribution system operation.
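The voltage-load sensitivity idea can be illustrated with a linearized toy example (not the paper's mixed-integer formulation): with ΔV = S·ΔP, a least-norm injection adjustment pulls violating nodes back to the limit. The sensitivity values and voltages below are hypothetical.

```python
import numpy as np

# Hypothetical voltage-load sensitivity matrix (p.u. volts per MW of injection).
S = np.array([[0.010, 0.006, 0.004],
              [0.006, 0.012, 0.007],
              [0.004, 0.007, 0.015]])

v = np.array([1.052, 1.061, 1.048])   # present node voltages (p.u.)
v_max = 1.05                          # upper voltage limit

dv = np.minimum(v_max - v, 0.0)       # required change at violating nodes only

# Least-norm injection adjustment solving S @ dp = dv (negative = curtail PV).
dp = np.linalg.pinv(S) @ dv
v_new = v + S @ dp                    # linearized post-control voltages
```

In practice the adjustment would be constrained (curtailment-only signs, device limits, discrete regulator taps), which is what turns this one-step projection into the optimization problem the paper formulates.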

  14. Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness

    NASA Astrophysics Data System (ADS)

    Kaushik, Anshul; Ramani, Anand

    2014-04-01

    Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.

  15. How knowledge influences a MCDM analysis: WOCAT Portuguese experience on prevention of forest fires

    NASA Astrophysics Data System (ADS)

    Carreiras, M.; Ferreira, A. J. D.; Moreira, J.; Esteves, T. C. J.; Valente, S.; Soares, J.; Coelho, C. O. A.; Schwilch, G.; Bachmann, F.

    2012-04-01

    Forest management is a major concern for land managers due to its impact on biomass production, surface water quality and landscape beauty. Pursuing a holistic view of the issue (considering economic, environmental and social aspects), an appreciation of the variety of policies and techniques is essential in the context of sustainability. In this context, MCDM can be an important tool in establishing how the forest is used. It can be used to elicit the preferences of decision-makers, stakeholders, or environmental experts, obtaining economic values for impacts whose monetization remains problematic. WOCAT has developed a framework for Sustainable Land Management knowledge, covering all steps from data collection to database implementation and decision support. The WOCAT methodology draws on knowledge of environmental risks and allows stakeholders' participation and involvement. It leads to discussion of the issues of the territory and, through a participatory, integrative, holistic and impartial process, identifies environmental problems. In the end, guidelines and actions for the territory are settled based on the problems identified. Having an active participatory nature, this process reveals itself as an excellent public participation process. The methodology also brings the territory's decision-makers into contact with the stakeholders. The procedure for identification, assessment and selection of strategies has been developed by the EU project DESIRE in collaboration with WOCAT. The methodology was tested by DESIRE in 16 study sites around the world. As an outcome of the procedure, the methodology may serve as a basis for prioritizing land-use policies, conservation measures and research at regional and national levels, integrating several exercises to that end. 
In Portugal, forest fires are one of the major drivers of land degradation. Affecting large areas every year, they also have serious human, socio-economic and psychological impacts. Under the DESIRE project two Portuguese study sites were selected, Góis and Mação. Both are located in Central Portugal and are frequently affected by forest fires. Today, various solutions applied at the local level address the prevention, combat and mitigation of forest fires. At a higher level of analysis, the main solution is diversification of land uses, mainly by mixing cropland, pastures and forest areas. So far, however, the selection of techniques has not been an open, participative and effective process, and the interests of land users are often not represented. This paper presents the WOCAT approach and its results for forest fire prevention in Portugal, considering stakeholders' perspectives and policy recommendations, and its evolution based on increased knowledge.

  16. Social cognition interventions for people with schizophrenia: a systematic review focussing on methodological quality and intervention modality.

    PubMed

    Grant, Nina; Lawrence, Megan; Preti, Antonio; Wykes, Til; Cella, Matteo

    2017-08-01

    People with a diagnosis of schizophrenia have significant social and functional difficulties. Social cognition has been found to influence these outcomes, and in recent years interventions targeting this domain have been developed. This paper reviews the existing literature on social cognition interventions for people with a diagnosis of schizophrenia, focussing on: i) comparing focussed (i.e. targeting only one social cognitive domain) and global interventions, and ii) study methodological quality. A systematic search was conducted on PubMed and PsycInfo. Studies were included if they were randomised controlled trials, participants had a diagnosis of schizophrenia or schizoaffective disorder, and the intervention targeted at least one of four social cognition domains (i.e. theory of mind, affect recognition, social perception and attribution bias). All papers were assessed for methodological quality. Information on the intervention, control condition, study methodology and the main findings of each study was extracted and critically summarised. Thirty-two studies fulfilled the inclusion criteria, with a total of 1440 participants. Taking part in social cognition interventions produced significant improvements in theory of mind and affect recognition compared to both passive and active control conditions. Results were less clear for social perception and attributional bias. Focussed and global interventions had similar results on outcomes. Overall study methodological quality was modest. There was very limited evidence that social cognition interventions result in functional outcome improvement. The evidence considered suggests that social cognition interventions may be a valuable approach for people with a diagnosis of schizophrenia. However, evidence quality is limited by measure heterogeneity, modest study methodology and short follow-up periods. The findings point to a number of recommendations for future research, including measurement standardisation, appropriately powered studies and investigation of the impact of social cognition improvements on functioning problems. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Methodology for application of field rainfall simulator to revise c-factor database for conditions of the Czech Republic

    NASA Astrophysics Data System (ADS)

    Neumann, Martin; Dostál, Tomáš; Krása, Josef; Kavka, Petr; Davidová, Tereza; Brant, Václav; Kroulík, Milan; Mistr, Martin; Novotný, Ivan

    2016-04-01

The presentation will introduce a methodology for determining the crop and cover management factor (C-factor) of the Universal Soil Loss Equation (USLE) using a field rainfall simulator. The aim of the project is to determine C-factor values for the different phenophases of the main crops of the Central European region, while also taking into account different agrotechnical methods. Using a field rainfall simulator makes it possible to perform measurements in specific phenophases, which is otherwise difficult to achieve given the variability and unpredictability of natural rainfall. Due to the number of measurements needed, two identical simulators will be used, operated by two independent teams following a coordinated methodology. The methodology will mainly specify the length of simulation, the rainfall intensity, and the sampling technique. The presentation includes a more detailed account of the methods selected. Because of the wide range of crops and soils, it is not possible to perform measurements for all possible combinations. We therefore decided to measure previously selected combinations of soils, crops and agrotechnologies that are the most common in the Czech Republic. During the experiments, the volume of surface runoff and the amount of sediment will be measured in their temporal distribution, along with several other important parameters. The key values of the 3D matrix of combinations of crop, agrotechnique and soil will be determined experimentally; the remaining values will be determined by interpolation or by model analogy. Several methods exist for calculating the C-factor from measured experimental data, some of which are not suitable for the type of data gathered here. The presentation will discuss the benefits and drawbacks of these methods and the final design of the method used, together with the problems of selecting a relevant measurement method and the final approach to simulation and C-factor determination. The presentation was supported by research projects QJ1530181 and SGS14/180/OHK1/3T/11.

  18. Review and evaluation of innovative technologies for measuring diet in nutritional epidemiology.

    PubMed

    Illner, A-K; Freisling, H; Boeing, H; Huybrechts, I; Crispim, S P; Slimani, N

    2012-08-01

The use of innovative technologies is deemed to improve dietary assessment in various research settings. However, their relative merits in nutritional epidemiological studies, which require accurate quantitative estimates of usual intake at the individual level, still need to be evaluated. To report on the inventory of available innovative technologies for dietary assessment and to critically evaluate their strengths and weaknesses as compared with the conventional methodologies (i.e. Food Frequency Questionnaires, food records, 24-hour dietary recalls) used in epidemiological studies. A list of currently available technologies was identified from English-language journals, using PubMed and Web of Science. The search criteria were principally based on the date of publication (between 1995 and 2011) and pre-defined search keywords. Six main groups of innovative technologies were identified ('Personal Digital Assistant-', 'Mobile-phone-', 'Interactive computer-', 'Web-', 'Camera- and tape-recorder-' and 'Scan- and sensor-based' technologies). Compared with conventional food records, Personal Digital Assistant and mobile phone devices seem to improve recording by allowing 'real-time' entry at eating events, but their validity for estimating individual dietary intakes was low to moderate. For 24-hour dietary recalls, there is still limited knowledge regarding the accuracy of fully automated approaches, and methodological problems, such as inaccuracy in self-reported portion sizes, might be more critical than in interview-based applications. In contrast, measurement errors in innovative web-based and in conventional paper-based Food Frequency Questionnaires are most likely similar, suggesting that the underlying methodology is unchanged by the technology. Most of the new technologies in dietary assessment were seen to have overlapping methodological features with the conventional methods predominantly used in nutritional epidemiology. Their main potential to enhance dietary assessment lies in more cost- and time-effective, less laborious data collection and higher subject acceptance, though their integration into epidemiological studies would need additional considerations, such as the study objectives, the target population and the financial resources available. However, even with innovative technologies, the inherent individual bias related to self-reported dietary intake will not be resolved. More research is therefore crucial to investigate the validity of innovative dietary assessment technologies.

  19. Scenario-Based Specification and Evaluation of Architectures for Health Monitoring of Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Sundaram, P.

    2001-01-01

HUMS systems have been an area of increased research in recent times for two main reasons: (a) an increase in the occurrence of accidents in aerospace, and (b) stricter FAA regulations on aircraft maintenance [2]. There are several problems associated with the maintenance of aircraft that HUMS systems can solve through the use of several monitoring technologies. This paper documents our methodology of employing scenarios in the specification and evaluation of an architecture for HUMS. Section 2 surveys related work that uses scenarios in software development. Section 3 describes how we use scenarios in our work, followed by a demonstration of our methods in the development of HUMS in Section 4. The conclusion summarizes our results.

  20. Simplifying the complexity of resistance heterogeneity in metastasis

    PubMed Central

    Lavi, Orit; Greene, James M.; Levy, Doron; Gottesman, Michael M.

    2014-01-01

    The main goal of treatment regimens for metastasis is to control growth rates, not eradicate all cancer cells. Mathematical models offer methodologies that incorporate high-throughput data with dynamic effects on net growth. The ideal approach would simplify, but not over-simplify, a complex problem into meaningful and manageable estimators that predict a patient’s response to specific treatments. Here, we explore three fundamental approaches with different assumptions concerning resistance mechanisms, in which the cells are categorized into either discrete compartments or described by a continuous range of resistance levels. We argue in favor of modeling resistance as a continuum and demonstrate how integrating cellular growth rates, density-dependent versus exponential growth, and intratumoral heterogeneity improves predictions concerning the resistance heterogeneity of metastases. PMID:24491979

  1. Adaptive PID formation control of nonholonomic robots without leader's velocity information.

    PubMed

    Shen, Dongbin; Sun, Weijie; Sun, Zhendong

    2014-03-01

This paper proposes an adaptive proportional-integral-derivative (PID) algorithm to solve a formation control problem in the leader-follower framework, where the leader robot's velocities are unknown to the follower robots. The main idea is first to design a proper ideal control law for the formation system to obtain the required performance, and then to use the adaptive PID methodology to approximate the ideal controller. As a result, formation is achieved with considerably enhanced robustness. The stability of the closed-loop system is theoretically proved by the Lyapunov method. Both numerical simulations and physical vehicle experiments are presented to verify the effectiveness of the proposed adaptive PID algorithm. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
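The leader-follower idea can be illustrated with a toy sketch (not the paper's adaptive controller or its stability proof): a 1-D follower holds a fixed offset behind a leader whose velocity it never observes, using only PID feedback on the position error. All gains and values here are illustrative assumptions.

```python
def simulate_formation(steps=8000, dt=0.01):
    """Toy 1-D leader-follower run: the follower must hold a fixed offset
    behind a leader moving at a constant speed it cannot observe, using
    only PID feedback on the position error."""
    kp, ki, kd = 4.0, 0.5, 0.2      # illustrative gains, not from the paper
    offset = -1.0                   # desired follower position relative to leader
    leader, follower = 0.0, -3.0
    integral, prev_err = 0.0, None
    for _ in range(steps):
        leader += 0.5 * dt          # leader velocity (0.5) is hidden from the PID
        err = (leader + offset) - follower
        integral += err * dt
        deriv = 0.0 if prev_err is None else (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv
        follower += u * dt          # follower applies the PID velocity command
    return abs((leader + offset) - follower)

final_error = simulate_formation()
```

The integral term is what absorbs the unknown constant leader velocity: at steady state the proportional and derivative terms vanish and the integral alone supplies the matching feed-forward speed.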

  2. Building an adaptive agent to monitor and repair the electrical power system of an orbital satellite

    NASA Technical Reports Server (NTRS)

    Tecuci, Gheorghe; Hieb, Michael R.; Dybala, Tomasz

    1995-01-01

    Over several years we have developed a multistrategy apprenticeship learning methodology for building knowledge-based systems. Recently we have developed and applied our methodology to building intelligent agents. This methodology allows a subject matter expert to build an agent in the same way in which the expert would teach a human apprentice. The expert will give the agent specific examples of problems and solutions, explanations of these solutions, or supervise the agent as it solves new problems. During such interactions, the agent learns general rules and concepts, continuously extending and improving its knowledge base. In this paper we present initial results on applying this methodology to build an intelligent adaptive agent for monitoring and repair of the electrical power system of an orbital satellite, stressing the interaction with the expert during apprenticeship learning.

  3. Artificial Intelligence and Information Management

    NASA Astrophysics Data System (ADS)

    Fukumura, Teruo

After reviewing the recent popularization of information transmission and processing technologies, which are supported by progress in electronics, the authors describe how the introduction of opto-electronics into information technology has made it possible to apply artificial intelligence (AI) techniques to the mechanization of information management. It is pointed out that although AI deals with problems of the mental world, its basic methodology relies upon verification by evidence, so experiments on computers are indispensable for the study of AI. The authors also note that, since computers operate by program, the basic intelligence with which AI is concerned is that expressed by languages. Consequently, the main tool of AI is logical proof, which involves an intrinsic limitation. To answer the question "Why do you employ AI in your problem solving?", one must have ill-structured problems and intend to conduct deep studies on thinking and inference, and on memory and knowledge representation. Finally, the authors discuss the application of AI techniques to information management, covering the possibility of expert systems, query processing, and the necessity of a document knowledge base.

  4. Optimal Control via Self-Generated Stochasticity

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

The problem of finding global maxima of functionals has been examined. The mathematical roots of local maxima are the same as those for the much simpler problem of finding the global maximum of a multi-dimensional function. The second problem is instability: even if an optimal trajectory is found, there is no guarantee that it is stable. As a result, a fundamentally new approach to optimal control is introduced, based upon two new ideas. The first idea is to represent the functional to be maximized as the limit of a probability density governed by an appropriately selected Liouville equation. The corresponding ordinary differential equations (ODEs) then become stochastic, and the sample of the solution that has the largest value has the highest probability of appearing in an ODE simulation. The main advantages of the stochastic approach are that it is not sensitive to local maxima, the function to be maximized need only be integrable rather than differentiable, and global equality and inequality constraints do not cause significant obstacles. The second idea is to remove possible instability of the optimal solution by equipping the control system with a self-stabilizing device. Applications of the proposed methodology could optimize the performance of NASA spacecraft as well as robot performance.
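The first idea (letting the best-valued stochastic sample dominate) can be caricatured with a simple sketch, not the paper's Liouville formulation: many noisy gradient-ascent walkers explore a multimodal function, and the single best sample seen is kept, so a local maximum cannot trap the search. The objective and all parameters are invented for illustration.

```python
import math
import random

def f(x):
    # invented multimodal objective: global maximum near x = 2 (value ~2),
    # lower local maximum near x = -2 (value ~1)
    return 2.0 * math.exp(-(x - 2.0) ** 2) + math.exp(-(x + 2.0) ** 2)

def stochastic_maximize(f, n_walkers=50, steps=500, dt=0.05, noise=1.0, seed=0):
    """Run many noisy gradient-ascent trajectories and keep the best sample
    seen; the sample with the largest value dominates the answer."""
    rng = random.Random(seed)
    best_x, best_val = 0.0, -float("inf")
    for _ in range(n_walkers):
        x = rng.uniform(-6.0, 6.0)              # random start per walker
        for _ in range(steps):
            grad = (f(x + 1e-4) - f(x - 1e-4)) / 2e-4   # numerical gradient
            x += grad * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            if f(x) > best_val:
                best_x, best_val = x, f(x)
    return best_x, best_val

best_x, best_val = stochastic_maximize(f)
```

Note the sketch only needs function evaluations, echoing the abstract's point that the objective must be integrable but not necessarily differentiable (the finite-difference gradient could be replaced by pure noise).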

  5. RBT-GA: a novel metaheuristic for solving the multiple sequence alignment problem

    PubMed Central

    Taheri, Javid; Zomaya, Albert Y

    2009-01-01

Background Multiple Sequence Alignment (MSA) has always been an active area of research in bioinformatics. MSA is mainly focused on discovering biologically meaningful relationships among different sequences or proteins in order to investigate their underlying characteristics/functions. This information is also used to generate phylogenetic trees. Results This paper presents a novel approach, namely RBT-GA, to solve the MSA problem using a hybrid solution methodology combining the Rubber Band Technique (RBT) and the Genetic Algorithm (GA) metaheuristic. RBT is inspired by the behavior of an elastic rubber band (RB) on a plate with several poles, which is analogous to locations in the input sequences that could potentially be biologically related. A GA attempts to mimic the evolutionary processes of life in order to locate optimal solutions in an often very complex landscape. RBT-GA is a population-based optimization algorithm designed to find the optimal alignment for a set of input protein sequences. In this technique, each candidate alignment is modeled as a chromosome consisting of several poles in the RBT framework. These poles correspond to locations in the input sequences that are most likely to be correlated and/or biologically related. A GA-based optimization process improves these chromosomes gradually, yielding a set of mostly optimal answers for the MSA problem. Conclusion RBT-GA is tested with one of the well-known benchmark suites (BAliBASE 2.0) in this area. The obtained results show the superiority of the proposed technique, even in the case of formidable sequences. PMID:19594869

  6. Tracking problem solving by multivariate pattern analysis and Hidden Markov Model algorithms.

    PubMed

    Anderson, John R

    2012-03-01

    Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy at determining both what problem-solving step the students are taking and whether they are performing that step correctly. The second "model discovery" application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and these substates are associated with different fluency in algebra problem solving. Copyright © 2011 Elsevier Ltd. All rights reserved.
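A minimal sketch of the Hidden Markov Model machinery that this kind of "mind reading" relies on (the states, transition and emission probabilities below are invented for illustration, not estimated from the study's fMRI data): Viterbi decoding recovers the most likely hidden sequence of problem-solving phases from noisy discretized observations.

```python
import numpy as np

# Toy HMM over problem-solving phases; all probabilities are invented.
states = ["encode", "transform", "respond"]
A = np.array([[0.7, 0.3, 0.0],     # left-to-right phase transitions
              [0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0]])
B = np.array([[0.8, 0.1, 0.1],     # P(observed pattern | phase)
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
pi = np.array([1.0, 0.0, 0.0])     # solving always starts in "encode"

def viterbi(obs):
    """Most likely hidden phase sequence given a list of observation indices."""
    T, N = len(obs), len(states)
    delta = np.zeros((T, N))            # best path probability ending in state j
    psi = np.zeros((T, N), dtype=int)   # backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = scores.argmax()
            delta[t, j] = scores.max() * B[j, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):       # backtrack
        path.append(int(psi[t, path[-1]]))
    return [states[i] for i in reversed(path)]

decoded = viterbi([0, 0, 1, 1, 0, 2, 2])   # one noisy scan ("0") at t = 4
```

The left-to-right transition structure is what lets the model smooth over the noisy observation at t = 4 rather than jumping back to an earlier phase.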

  7. Creativity and psychopathology: a systematic review.

    PubMed

    Thys, Erik; Sabbe, Bernard; De Hert, Marc

    2014-01-01

    The possible link between creativity and psychopathology has been a long-time focus of research up to the present day. However, the research results in this field are heterogeneous and contradictory. Links between creativity and specific psychiatric disorders have been confirmed and refuted in different studies. This disparity is partly explained by the methodological challenges peculiar to this field. In this systematic review of the literature from 1950, research articles in the field of creativity and psychopathology are presented, focusing on the methodology and results of the collected studies. This review confirms the methodological problems and the heterogeneity of the study designs and results. The assessment of psychopathology, but more so of creativity, remains a fundamental challenge. On the whole, study results cautiously confirm an association between creativity and both bipolar disorder and schizotypy. The research on creativity and psychopathology is hampered by serious methodological problems. Study results are to be interpreted with caution and future research needs more methodological rigor. © 2014 S. Karger AG, Basel.

  8. [IBEAS design: adverse events prevalence in Latin American hospitals].

    PubMed

    Aranaz-Andrés, J M; Aibar-Remón, C; Limón-Ramírez, R; Amarilla, A; Restrepo, F R; Urroz, O; Sarabia, O; Inga, R; Santivañez, A; Gonseth-García, J; Larizgoitia-Jauregui, I; Agra-Varela, Y; Terol-García, E

    2011-01-01

To describe the methodological characteristics of the IBEAS study of adverse event prevalence in Latin American hospitals, with the aim of analysing the magnitude, significance and impact of adverse events (AE); to identify the main patient safety problems associated with AE; to increase the capacity of professionals involved in patient safety; and to set up patient safety agendas in the participating countries. A patient safety study launched in 35 Latin American hospitals through the analysis of AE in 5 countries: Argentina, Colombia, Costa Rica, Mexico and Peru, using a cross-sectional design with a review of clinical records as the main method. The implications of using a cross-sectional design when studying AE are described in terms of the resources required, internal validity and usefulness for risk management. The cross-sectional design seems an efficient methodology in terms of time and resources spent, as well as being easy to carry out. Although the cross-sectional design does not review all hospital episodes, it is able to provide a reliable estimate of prevalence and to support a surveillance system. Because of a possible survival bias, it is likely that AE which lead to hospital admission will be overestimated, as will healthcare-related infections and adverse events which are difficult to identify if the patient is not examined (e.g. contusions). Communication with the ward staff (if the patient is still hospitalised) helps in establishing causality and prevention. Copyright © 2010 SECA. Published by Elsevier Espana. All rights reserved.

  9. Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map

    ERIC Educational Resources Information Center

    Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng

    2004-01-01

    This paper proposes a methodology to calculate both the difficulty of the basic problems and the difficulty of solving a problem. The method to calculate the difficulty of problem is according to the process of constructing a problem, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…

  10. Simultenious binary hash and features learning for image retrieval

    NASA Astrophysics Data System (ADS)

    Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.

    2016-05-01

Content-based image retrieval systems have plenty of applications in the modern world. The most important one is image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique. This is the main reason why this kind of automatic image processing has attracted so much attention in recent years. Despite substantial progress in the field, semantically meaningful image retrieval remains a challenging task. The main issue is the demand to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for simultaneous learning of global image features and binary hash codes. Our approach maps a pixel-based image representation to a hash-value space while trying to preserve as much of the semantic image content as possible. We use a deep learning methodology to generate image descriptions with the properties of similarity preservation and statistical independence. The main advantage of our approach over existing ones is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results than general techniques. The framework for data-dependent image hashing presented in the paper is based on the use of two different kinds of neural networks: convolutional neural networks for image description and an autoencoder for feature-to-hash-space mapping. Experimental results confirm that our approach shows promising results compared to other state-of-the-art methods.
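As a rough illustration of why binary hash codes make retrieval fast, here is a classical random-projection baseline (locality-sensitive hashing) standing in for the paper's learned CNN/autoencoder mapping; the feature vectors are random stand-ins for learned image descriptors.

```python
import numpy as np

rng = np.random.default_rng(0)

def hash_codes(features, projection):
    """Binarize real-valued features by thresholding random projections,
    a classical LSH stand-in for a learned feature-to-hash mapping."""
    return (features @ projection > 0).astype(np.uint8)

def hamming_search(query_code, db_codes):
    """Database indices sorted by Hamming distance to the query code."""
    return np.argsort((db_codes != query_code).sum(axis=1), kind="stable")

d, bits = 64, 16
projection = rng.standard_normal((d, bits))   # fixed random hash directions
db = rng.standard_normal((100, d))            # stand-in for image descriptors
db_codes = hash_codes(db, projection)

query_code = hash_codes(db[7:8], projection)[0]   # query with item 7's features
ranking = hamming_search(query_code, db_codes)
```

Comparing 16-bit codes by XOR-and-popcount is what makes web-scale search tractable; the paper's contribution is learning the mapping so that Hamming neighbors are also semantic neighbors, which random projections only approximate.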

  11. Feature extraction through parallel Probabilistic Principal Component Analysis for heart disease diagnosis

    NASA Astrophysics Data System (ADS)

    Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan

    2017-09-01

Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems depends mainly on the selection of the most relevant features, which becomes harder when the dataset contains missing values for some features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology which takes the results of medical tests as input, extracts a reduced-dimensional feature subset and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using Probabilistic Principal Component Analysis (PPCA). PPCA extracts the projection vectors which contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with reduced dimension is provided to radial basis function (RBF) kernel based Support Vector Machines (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to existing research, showing its impact. The proposed technique achieved accuracies of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets respectively.
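The Parallel Analysis step (choosing how many projection vectors to keep by comparing real eigenvalues against those of signal-free shuffled data) can be sketched on synthetic data. This is ordinary PCA-style analysis, not the paper's full PPCA pipeline with missing values, and the data are simulated stand-ins for medical test results.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for medical-test data: 200 subjects, 10 features,
# with exactly two informative latent directions plus noise.
n, d = 200, 10
latent = rng.standard_normal((n, 2))
W = np.vstack([np.ones(d), np.concatenate([np.ones(d // 2), -np.ones(d // 2)])])
X = latent @ W + 0.3 * rng.standard_normal((n, d))

def parallel_analysis(X, n_shuffles=20):
    """Keep the principal components whose eigenvalues exceed the average
    eigenvalues of column-shuffled (correlation-free) copies of the data."""
    Xc = X - X.mean(axis=0)
    real = np.sort(np.linalg.eigvalsh(np.cov(Xc.T)))[::-1]
    null = np.zeros_like(real)
    for _ in range(n_shuffles):
        # shuffling each column independently destroys cross-feature structure
        Xs = np.column_stack([rng.permutation(col) for col in Xc.T])
        null += np.sort(np.linalg.eigvalsh(np.cov(Xs.T)))[::-1]
    return int((real > null / n_shuffles).sum())

k = parallel_analysis(X)   # number of components retained for the classifier
```

The retained k-dimensional projection would then be fed to the RBF-kernel SVM; the point of PA is that it keeps only components whose variance cannot be explained by chance correlations.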

  12. An introduction to the Semantic Web for health sciences librarians*

    PubMed Central

    Robu, Ioana; Robu, Valentin; Thirion, Benoit

    2006-01-01

    Objectives: The paper (1) introduces health sciences librarians to the main concepts and principles of the Semantic Web (SW) and (2) briefly reviews a number of projects on the handling of biomedical information that uses SW technology. Methodology: The paper is structured into two main parts. “Semantic Web Technology” provides a high-level description, with examples, of the main standards and concepts: extensible markup language (XML), Resource Description Framework (RDF), RDF Schema (RDFS), ontologies, and their utility in information retrieval, concluding with mention of more advanced SW languages and their characteristics. “Semantic Web Applications and Research Projects in the Biomedical Field” is a brief review of the Unified Medical Language System (UMLS), Generalised Architecture for Languages, Encyclopedias and Nomenclatures in Medicine (GALEN), HealthCyberMap, LinkBase, and the thesaurus of the National Cancer Institute (NCI). The paper also mentions other benefits and by-products of the SW, citing projects related to them. Discussion and Conclusions: Some of the problems facing the SW vision are presented, especially the ways in which the librarians' expertise in organizing knowledge and in structuring information may contribute to SW projects. PMID:16636713

  13. Gender Discrimination among Medical Students in Pakistan: A Cross Sectional Survey

    PubMed Central

    Madeeh Hashmi, Ali; Rehman, Amra; Butt, Zeeshan; Awais Aftab, Muhammad; Shahid, Aimen; Abbas Khan, Sahar

    2013-01-01

Objective: To examine the prevalence and magnitude of gender discrimination experienced by undergraduate medical students, and its repercussions on their academic performance and emotional health. Methodology: A cross sectional study of 500 medical and dental students studying at a private medical college in Lahore, Pakistan. Results: The majority (78%) of students reported being victims of gender discrimination. Females were the main perpetrators (70.8%). The most common forms were denied opportunities (63%), followed by neglecting students' needs (44.3%) and unethical talk (43.6%). The most common places of gender discrimination were teachers' offices (43.7%) and lecture halls (37.2%). Most of the perpetrators were clerical staff (48%) and professors (43%). Gender discrimination did not affect the academic performance of most victims (62.6%). The most common emotional responses were anger (57.6%), frustration (46.7%) and helplessness (40.3%). 52.4% of students said that gender discrimination still continues and the majority (83.3%) did not report the problem to college authorities. Conclusions: Results demonstrate that gender discrimination is widely prevalent in undergraduate medical education. Females are both the main victims as well as the main perpetrators. In most cases gender discrimination does not affect academic performance but does cause emotional distress. PMID:24353554

  14. Innovation design of medical equipment based on TRIZ.

    PubMed

    Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo

    2015-01-01

Medical equipment is closely related to personal health and safety, which can be of concern to the equipment user. Furthermore, there is much competition among medical equipment manufacturers, and innovative design is the key to success for those enterprises. The design of medical equipment usually covers vastly different domains of knowledge, so the application of modern design methodology to medical equipment and technology invention is an urgent requirement. TRIZ (a Russian abbreviation that can be translated as 'theory of inventive problem solving') originated in Russia and contains a set of problem-solving methods developed through worldwide patent analysis, including the Conflict Matrix, Substance-Field Analysis, Standard Solutions, Effects, etc. As an engineering example, an infusion system is analyzed and re-designed using TRIZ; the resulting innovative idea frees the caretaker from having to watch the infusion bag. The research in this paper shows the process of applying TRIZ to medical device invention. It demonstrates that TRIZ is an inventive methodology for problem solving and can be used widely in medical device development.

  15. A risk assessment methodology for critical transportation infrastructure.

    DOT National Transportation Integrated Search

    2002-01-01

    Infrastructure protection typifies a problem of risk assessment and management in a large-scale system. This study offers a methodological framework to identify, prioritize, assess, and manage risks. It includes the following major considerations: (1...

  16. Methodology for nonwork travel analysis in suburban communities.

    DOT National Transportation Integrated Search

    1994-01-01

    The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...

  17. An ontological case base engineering methodology for diabetes management.

    PubMed

    El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema

    2014-08-01

Ontology engineering covers issues related to ontology development and use. In a Case Based Reasoning (CBR) system, ontology plays two main roles: the first as the case base and the second as the domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It mainly focuses on representing cases in the form of an ontology to support semantic case retrieval and enhance all knowledge-intensive CBR processes. A case study on a diabetes diagnosis case base is provided to evaluate the proposed methodology.

  18. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  19. Hydration: certain basic aspects for developing technical and scientific parameters into the nutrition knowledge

    PubMed

    Perales-García, Aránzazu; Estévez-Martínez, Isabel; Urrialde, Rafael

    2016-07-12

Introduction: Hydration is defined as water intake from food and beverages. Its study has become a field in its own right within nutrition: in 2010 the European Food Safety Authority (EFSA) approved water intake recommendations, but studying this topic requires a rigorous methodology, which presents several issues. Objective: To show at a glance the main methodological issues in hydration studies. Material and methods: Bibliographic review of the scientific literature. Results: The main methodological issues presented are: sample selection (investigation field and sample design); selection of the method to evaluate hydration status (dilution techniques, bioelectrical impedance, plasma and urinary indicators, changes in body composition, water losses and clinical symptoms); selection of the method to evaluate water intake (biomarkers, questionnaires, computer programs, smartphone use, 24-h records, dietary history and food frequency questionnaires); and the main sources of hydration. Conclusions: Hydration status should be understood as a routine model, with daily frequency, according to gender, age, physical activity and environmental conditions. Furthermore, correct design of the methodology is especially important in order to take all these aspects into account.

  20. Budgeted Interactive Learning

    DTIC Science & Technology

    2017-06-15

    the methodology of reducing the online-algorithm-selecting problem as a contextual bandit problem, which is yet another interactive learning...KH2016a] Kuan-Hao Huang and Hsuan-Tien Lin. Linear upper confidence bound algorithm for contextual bandit problem with piled rewards. In Proceedings

  1. Decomposition of timed automata for solving scheduling problems

    NASA Astrophysics Data System (ADS)

    Nishi, Tatsushi; Wakatake, Masato

    2014-03-01

    A decomposition algorithm for scheduling problems based on a timed automata (TA) model is proposed. The problem is represented as an optimal state transition problem for TA. The model consists of the parallel composition of submodels such as jobs and resources. The proposed methodology can be divided into two steps. The first step is to decompose the TA model into several submodels by using a decomposability condition. The second step is to combine the individual solutions of the subproblems for the decomposed submodels by the penalty function method. A feasible solution for the entire model is derived through the iterated computation of solving the subproblem for each submodel. The proposed methodology is applied to solve flowshop and jobshop scheduling problems. Computational experiments demonstrate the effectiveness of the proposed algorithm compared with a conventional TA scheduling algorithm without decomposition.
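    The two-step scheme above (decompose, then recombine subproblem solutions under a penalty function) can be illustrated with a deliberately tiny sketch. This is not the paper's timed-automata algorithm: two decomposed subproblems each pick a start slot for a unit-length job on a shared machine, and an increasing penalty on conflicts drives their individually optimal choices toward a feasible combination. All names and cost tables are illustrative.

```python
def solve_subproblem(own_cost, other_start, rho, horizon=6):
    # Each submodel minimizes its own cost plus a penalty for
    # conflicting with the other submodel's current start slot.
    best_t, best_val = None, float("inf")
    for t in range(horizon):
        overlap = 1 if t == other_start else 0
        val = own_cost[t] + rho * overlap
        if val < best_val:
            best_t, best_val = t, val
    return best_t

def coordinate(costs_a, costs_b, iters=20):
    # Alternate between the two subproblems, doubling the penalty
    # weight until their solutions no longer conflict.
    start_a, start_b, rho = 0, 0, 1.0
    for _ in range(iters):
        start_a = solve_subproblem(costs_a, start_b, rho)
        start_b = solve_subproblem(costs_b, start_a, rho)
        if start_a != start_b:        # feasible: no resource conflict
            return start_a, start_b
        rho *= 2.0                    # strengthen the penalty and retry
    return start_a, start_b
```

With both jobs preferring the same earliest slot, the growing penalty pushes one of them to the next-cheapest slot, yielding a conflict-free schedule.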

  2. Cost-benefit analysis of space technology

    NASA Technical Reports Server (NTRS)

    Hein, G. F.; Stevenson, S. M.; Sivo, J. N.

    1976-01-01

    A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in the structure of a decision making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology. The use of the methodology is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumers surplus for measuring benefits is also presented.
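    As a hedged illustration of the discounting arithmetic such an evaluation rests on (not the paper's actual model or data), benefits and costs spread over time can be compared via net present value and a benefit-cost ratio:

```python
def npv(flows, rate):
    # Present value of a cash-flow stream, flows[t] occurring at year t.
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def benefit_cost_ratio(benefits, costs, rate):
    # A ratio above 1 indicates discounted benefits exceed discounted costs.
    return npv(benefits, rate) / npv(costs, rate)
```

For example, a benefit of 110 received one year out, discounted at 10%, has a present value of 100, so against an up-front cost of 100 the project just breaks even.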

  3. Dynamic Scaffolding in a Cloud-Based Problem Representation System: Empowering Pre-Service Teachers' Problem Solving

    ERIC Educational Resources Information Center

    Lee, Chwee Beng; Ling, Keck Voon; Reimann, Peter; Diponegoro, Yudho Ahmad; Koh, Chia Heng; Chew, Derwin

    2014-01-01

    Purpose: The purpose of this paper is to argue for the need to develop pre-service teachers' problem solving ability, in particular, in the context of real-world complex problems. Design/methodology/approach: To argue for the need to develop pre-service teachers' problem solving skills, the authors describe a web-based problem representation…

  4. Methodology of Diagnostics of Interethnic Relations and Ethnosocial Processes

    ERIC Educational Resources Information Center

    Maximova, Svetlana G.; Noyanzina, Oksana Ye.; Omelchenko, Daria A.; Maximov, Maxim B.; Avdeeva, Galina C.

    2016-01-01

    The purpose of this study was to research methodological approaches to the study of interethnic relations and ethno-social processes. The analysis of the literature was conducted in three main areas: 1) the theoretical and methodological issues of organizing research on inter-ethnic relations, allowing researchers to highlight the current…

  5. Using the CPGI to Determine Problem Gambling Prevalence in Australia: Measurement Issues

    ERIC Educational Resources Information Center

    Jackson, Alun C.; Wynne, Harold; Dowling, Nicki A.; Tomnay, Jane E.; Thomas, Shane A.

    2010-01-01

    Most states and territories in Australia have adopted the Problem Gambling Severity Index (PGSI) of the Canadian Problem Gambling Index as the standard measure of problem gambling in their prevalence studies and research programs. However, notwithstanding this attempted standardisation, differences in sampling and recruitment methodologies and in…

  6. Using Problem-Based Learning to Enhance Team and Player Development in Youth Soccer

    ERIC Educational Resources Information Center

    Hubball, Harry; Robertson, Scott

    2004-01-01

    Problem-based learning (PBL) is a coaching and teaching methodology that develops knowledge, abilities, and skills. It also encourages participation, collaborative investigation, and the resolution of authentic, "ill-structured" problems through the use of problem definition, teamwork, communication, data collection, decision-making,…

  7. A methodology to find the elementary landscape decomposition of combinatorial optimization problems.

    PubMed

    Chicano, Francisco; Whitley, L Darrell; Alba, Enrique

    2011-01-01

    A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
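    The defining property, that the average objective value over the neighbors of any point obeys Grover's wave equation with a single constant, can be checked by brute force on small instances. The sketch below is an illustration of that definition, not the paper's algebraic decomposition method: it confirms that OneMax is elementary under the one-bit-flip neighborhood while its square is not.

```python
from itertools import product

def is_elementary(f, n):
    """Check whether avg over neighbors of f(x) equals
    f(x) + c * (fbar - f(x)) for one constant c, under the
    one-bit-flip neighborhood on {0,1}^n (Grover's wave equation)."""
    points = list(product([0, 1], repeat=n))
    fbar = sum(f(x) for x in points) / len(points)
    ratios = set()
    for x in points:
        neighbors = [x[:i] + (1 - x[i],) + x[i + 1:] for i in range(n)]
        avg = sum(f(y) for y in neighbors) / n
        if abs(f(x) - fbar) > 1e-12:          # skip points at the mean
            ratios.add(round((avg - f(x)) / (fbar - f(x)), 9))
    return len(ratios) == 1
```

For OneMax the constant comes out to 2/n, a classical elementary landscape; squaring the objective breaks the property.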

  8. Assessment of beverage intake and hydration status.

    PubMed

    Nissensohn, Mariela; López-Ufano, Marisa; Castro-Quezada, Itandehui; Serra-Majem, Lluis

    2015-02-26

    Water is the main constituent of the human body and is involved in practically all of its functions; it is particularly important for thermoregulation and for physical and cognitive performance. Water balance reflects water intake and loss. Water intake comes mainly from drinking water and beverages (70 to 80%) plus water-containing foods (20 to 30%). Water loss is mainly due to excretion of water in urine, faeces and sweat. Interest in the type and quantity of beverage consumption is not new, and numerous approaches have been used to assess beverage intake, but their validity has not been well established. There is no standardized questionnaire developed as a research tool for the evaluation of water intake in the general population. Sometimes the information comes from different sources or from methods with different characteristics, which raises problems of comparability. In the European Union, current epidemiological studies that focus exclusively on beverage intake are scarce. Biomarkers of intake make it possible to assess dietary intake and status objectively, without the bias of self-reported dietary intake errors, and also overcome the problem of intra-individual diet variability. Furthermore, some methods of measuring dietary intake use biomarkers to validate the data they collect. Biological markers may offer advantages and improve the estimates of dietary intake assessment, which affects the statistical power of a study. There is a surprising paucity of studies that systematically examine the correlation between beverage intake and hydration biomarkers in different populations. A pilot investigation was conducted to evaluate the comparative validity and reliability of newly developed interactive multimedia (IMM) versions of the Hedrick et al. beverage questionnaire against validated paper-administered (PP) versions. 
The study showed that the IMM versions appear to be a valid and reliable measure of habitual beverage intake. A similar study was conducted in China, but in that case smartphone technology was employed for beverage assessment. The methodology for measuring beverage intake in population studies remains controversial. There are few validated and reproducible studies, so an ideal method (i.e., short, easy to administer, inexpensive and accurate) is still lacking. Clearly, this is an area of scientific interest that is still developing and seems very promising for improving health research. Copyright AULA MEDICA EDICIONES 2015. Published by AULA MEDICA. All rights reserved.

  9. On the generalized VIP time integral methodology for transient thermal problems

    NASA Technical Reports Server (NTRS)

    Mei, Youping; Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The paper describes the development and applicability of a generalized VIrtual-Pulse (VIP) time integral method of computation for thermal problems. Unlike past approaches for general heat transfer computations, and with the advent of high speed computing technology and the importance of parallel computations for efficient use of computing environments, a major motivation via the developments described in this paper is the need for developing explicit computational procedures with improved accuracy and stability characteristics. As a consequence, a new and effective VIP methodology is described which inherits these improved characteristics. Numerical illustrative examples are provided to demonstrate the developments and validate the results obtained for thermal problems.

  10. A robust optimization methodology for preliminary aircraft design

    NASA Astrophysics Data System (ADS)

    Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.

    2016-05-01

    This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
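    The robust-counterpart idea can be sketched for the simplest box-uncertainty case: a linear constraint whose coefficients each lie in an interval holds for every realization exactly when the nominal term plus the worst-case deviation stays within the bound. The numbers below are illustrative, not from the aircraft study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Uncertain constraint a.x <= b, with each a_i in [a_nom_i - delta_i,
# a_nom_i + delta_i] (interval/box uncertainty).
a_nom = np.array([1.0, 2.0])
delta = np.array([0.2, 0.5])
b = 10.0
x = np.array([3.0, 1.5])   # a candidate design

# Robust counterpart: the worst-case left-hand side over the box is
# a_nom.x + delta.|x|; if this is <= b, x is feasible for every realization.
worst_case = a_nom @ x + delta @ np.abs(x)

# Sanity check by sampling coefficient realizations inside the box.
samples = a_nom + delta * rng.uniform(-1, 1, size=(10_000, 2))
sampled_max = (samples @ x).max()
```

Here the worst case (7.35) stays below b = 10, so the candidate design is robustly feasible; the sampled maximum never exceeds the analytic bound.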

  11. Why the way we consider the body matters – Reflections on four bioethical perspectives on the human body

    PubMed Central

    Schicktanz, Silke

    2007-01-01

    Background Within the context of applied bioethical reasoning, various conceptions of the human body are focused upon by the author in relation to normative notions of autonomy. Results The author begins by descriptively exploring some main positions in bioethics from which the "body" is conceptualized. Such positions conflict: the body is that which is constitutive of the individual's experience and perception, or it is conceived of materially or mechanistically; or as a constructed locus, always historically and culturally transformed. The author goes on to suggest a methodological approach that dialectically considers embodiment from four different perspectives: as bodily self-determination, as respect for the bodily unavailability of the other, as care for bodily individuality; and lastly, as acknowledgement of bodily-constituted communities. These four perspectives encompass autonomy in two of its main interpretations: as the capability of a person to act independent of external forces, and as the moral ideal of pursuing individual wishes by means of role distance, self-limitation and universalization. Various bioethical cases are utilized to show how the four perspectives on the body can complement one another. Conclusion The way we consider the body matters. The author's dialectical method allows a premise-critical identification and exploration of bioethical problems concerning the body. The method is potentially applicable to other bioethical problems. PMID:18053201

  12. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  13. Using soft systems methodology to develop a simulation of out-patient services.

    PubMed

    Lehaney, B; Paul, R J

    1994-10-01

    Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.
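    As a minimal sketch of the kind of discrete-event model such a study would feed with SSM-derived boundaries (the clinic parameters here are invented, not the hospital's), a single-doctor queue with scheduled arrivals and random consultation times already shows the key behaviour: waiting times grow sharply as utilisation rises.

```python
import random

def simulate_clinic(n_patients, interarrival, mean_service, seed=1):
    """Single-doctor outpatient clinic: patients arrive on an
    appointment grid; consultation times are exponential with the
    given mean; returns the mean patient wait."""
    rng = random.Random(seed)
    doctor_free_at = 0.0
    waits = []
    for i in range(n_patients):
        arrival = i * interarrival
        start = max(arrival, doctor_free_at)     # wait if doctor is busy
        waits.append(start - arrival)
        doctor_free_at = start + rng.expovariate(1.0 / mean_service)
    return sum(waits) / len(waits)
```

Comparing a lightly loaded clinic (mean consultation 5 min, appointments every 10 min) with a nearly saturated one (mean 9.5 min) illustrates why boundary and activity choices matter before any simulation is trusted.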

  14. Brain Dynamics: Methodological Issues and Applications in Psychiatric and Neurologic Diseases

    NASA Astrophysics Data System (ADS)

    Pezard, Laurent

    The human brain is a complex dynamical system generating the EEG signal. Numerical methods developed to study complex physical dynamics have been used to characterize the EEG since the mid-eighties. This endeavor raised several issues related to the specificity of the EEG. Firstly, theoretical and methodological studies should address the major differences between the dynamics of the human brain and those of physical systems. Secondly, this approach to the EEG signal should prove relevant for dealing with physiological or clinical problems. A set of studies performed in our group is presented here in the context of these two issues. After a discussion of methodological drawbacks, we review numerical simulations related to the high dimension and spatial extension of brain dynamics. Experimental studies in neurologic and psychiatric disease are then presented. We conclude that, while it is now clear that brain dynamics change in relation to clinical situations, methodological problems remain largely unsolved.

  15. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    An integrated design methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but worldwide. In the past, catastrophic failures caused losses of mission and vehicle (LOM/LOV) at the operational phase, and also led to severe schedule delays and cost overruns later in development. In this study, a design methodology for a highly reliable liquid rocket engine is preliminarily established and investigated. A sensitivity analysis is performed systematically to demonstrate the effectiveness of the methodology and to clarify, and in particular to focus on, the correlations between the combustion chamber, the turbopump and the main valve as the main components. This study describes the essential issues for understanding these correlations, the need to apply the methodology to the remaining critical failure modes in the whole engine system, and the perspective on future engine development.

  16. Fuzzy inductive reasoning: a consolidated approach to data-driven construction of complex dynamical systems

    NASA Astrophysics Data System (ADS)

    Nebot, Àngela; Mugica, Francisco

    2012-10-01

    Fuzzy inductive reasoning (FIR) is a modelling and simulation methodology derived from the General Systems Problem Solver. It compares favourably with other soft computing methodologies, such as neural networks, genetic or neuro-fuzzy systems, and with hard computing methodologies, such as AR, ARIMA, or NARMAX, when it is used to predict future behaviour of different kinds of systems. This paper contains an overview of the FIR methodology, its historical background, and its evolution.

  17. OPUS: Optimal Projection for Uncertain Systems. Volume 1

    DTIC Science & Technology

    1991-09-01

    unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for…effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that…characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of

  18. Suggested criteria for evaluating systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.

    1989-01-01

    Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.

  19. The Water Footprint as an indicator of environmental sustainability in water use at the river basin level.

    PubMed

    Pellicer-Martínez, Francisco; Martínez-Paz, José Miguel

    2016-11-15

    One of the main challenges in water management is to determine how current water use conditions its availability to future generations, and hence its sustainability. This study proposes the use of the Water Footprint (WF) indicator to assess the environmental sustainability of water resources management at the river basin level. The current study presents the methodology developed and applies it to a case study. The WF is a relatively new indicator that measures the total volume of freshwater that is used as a production factor. Its application is ever-growing in the evaluation of water use in production processes. The calculation of the WF involves water resources (blue), precipitation stored in the soil (green) and pollution (grey). It provides a comprehensive assessment of the environmental sustainability of water use in a river basin. The methodology is based upon the simulation of the anthropised water cycle, which is conducted by combining a hydrological model and a decision support system. The methodology allows the assessment of the environmental sustainability of water management at different levels, and/or ex-ante analysis of how decisions made in the water planning process affect sustainability. The sustainability study was carried out in the Segura River Basin (SRB) in South-eastern Spain. The SRB is among the most complex basins in Europe, given its peculiarities: competition for water use, overexploitation of aquifers, pollution, and alternative sources, among others. The results indicate that blue water use is not sustainable due to the generalised overexploitation of aquifers. They also reveal that surface water pollution, which is not sustainable, is mainly caused by phosphate concentrations. The assessment of future scenarios reveals that these problems will worsen if no additional measures are implemented, and therefore water management in the SRB is environmentally unsustainable in both the short and medium term. 
Copyright © 2016 Elsevier B.V. All rights reserved.
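    The three WF components described above combine in a few lines; the grey component follows the standard dilution convention GWF = L / (c_max − c_nat). The units, numbers and threshold below are illustrative, not the Segura basin's data:

```python
def grey_wf(load_kg, c_max, c_nat):
    # Grey WF: freshwater volume (m3) needed to dilute a pollutant
    # load (kg) down to the ambient quality standard, with the maximum
    # allowed and natural concentrations in kg/m3.
    return load_kg / (c_max - c_nat)

def total_wf(blue, green, load_kg, c_max, c_nat):
    # Total water footprint = blue + green + grey components.
    return blue + green + grey_wf(load_kg, c_max, c_nat)

def blue_is_sustainable(blue_wf, blue_water_available):
    # Blue water use is unsustainable when it exceeds what the basin
    # can supply without overexploiting aquifers.
    return blue_wf <= blue_water_available
```

For example, a 100 kg phosphate load against a 0.05 kg/m3 standard and a 0.01 kg/m3 natural concentration requires 2500 m3 of dilution water.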

  20. Qualitative studies of insomnia: Current state of knowledge in the field.

    PubMed

    Araújo, Taís; Jarrin, Denise C; Leanza, Yvan; Vallières, Annie; Morin, Charles M

    2017-02-01

    Despite its high prevalence and burden, insomnia is often trivialized, under-diagnosed, and under-treated in practice. Little information is available on the subjective experience and perceived consequences of insomnia, help-seeking behaviors, and treatment preferences. The use of qualitative approaches (e.g., ethnography, phenomenology, grounded theory) may help gain a better understanding of this sleep disorder. The present paper summarizes the evidence derived from insomnia studies using a qualitative research methodology (e.g., focus group, semi-structured interviews). A systematic review of the literature was conducted using PsycINFO and Medline databases. The review yielded 22 studies and the quality of the methodology of each of them was evaluated systematically using the critical appraisal skills programme (CASP) appraisal tool. Selected articles possess at least a very good methodological rigor and they were categorized according to their main focus: "Experience of insomnia", "Management of insomnia" and "Medicalization of insomnia". The main findings indicate that: 1) insomnia is often experienced as a 24-h problem and is perceived to affect several domains of life, 2) a sense of frustration and misunderstanding is very common among insomnia patients, which is possibly due to a mismatch between patients' and health care professionals' perspectives on insomnia and its treatment, 3) health care professionals pay more attention to sleep hygiene education and medication therapies and less to the patient's subjective experience of insomnia, and 4) health care professionals are often unaware of non-pharmacological interventions other than sleep hygiene education. An important implication of these findings is the need to develop new clinical measures with a broader scope on insomnia and more targeted treatments that take into account the patient's experience of insomnia. 
Greater use of qualitative approaches in future research may produce novel and more contextualized information leading to a more comprehensive understanding of insomnia. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Qualitative Studies of Insomnia: Current State of Knowledge in the Field

    PubMed Central

    Araújo, Taís; Jarrin, Denise C.; Leanza, Yvan; Vallières, Annie; Morin, Charles M.

    2016-01-01

    Summary Despite its high prevalence and burden, insomnia is often trivialized, under-diagnosed, and under-treated in practice. Little information is available on the subjective experience and perceived consequences of insomnia, help-seeking behaviors, and treatment preferences. The use of qualitative approaches (e.g., ethnography, phenomenology, grounded theory) may help gain a better understanding of this sleep disorder. The present paper summarizes the evidence derived from insomnia studies using a qualitative research methodology (e.g., focus group, semi-structured interviews). A systematic review of the literature was conducted using PsycINFO and Medline databases. The review yielded 22 studies and the quality of the methodology of each of them was evaluated systematically using the CASP appraisal tool. Selected articles possess at least a very good methodological rigor and they were categorized according to their main focus: “Experience of insomnia”, “Management of insomnia” and “Medicalization of insomnia”. The main findings indicate that: 1) insomnia is often experienced as a 24-hour problem and is perceived to affect several domains of life, 2) a sense of frustration and misunderstanding is very common among insomnia patients, which is possibly due to a mismatch between patients’ and health care professionals’ perspectives on insomnia and its treatment, 3) health care professionals pay more attention to sleep hygiene education and medication therapies and less to the patient’s subjective experience of insomnia, and 4) health care professionals are often unaware of non-pharmacological interventions other than sleep hygiene education. An important implication of these findings is the need to develop new clinical measures with a broader scope on insomnia and more targeted treatments that take into account the patient’s experience of insomnia. 
Greater use of qualitative approaches in future research may produce novel and more contextualized information leading to a more comprehensive understanding of insomnia. PMID:27090821

  2. Fuzzy Linear Programming and its Application in Home Textile Firm

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Ganesan, T.; Elamvazuthi, I.

    2011-06-01

    In this paper, a new fuzzy linear programming (FLP) methodology based on a specific membership function, the modified logistic membership function, is proposed. The modified logistic membership function is first formulated, and its flexibility in handling vagueness in parameters is established by an analytical approach. The developed FLP methodology provides confidence in its application to a real-life industrial production planning problem. This approach to solving industrial production planning problems allows feedback among the decision maker, the implementer and the analyst.
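    A hedged sketch of a logistic-type membership function of the kind the paper proposes (the normalisation and the default steepness `gamma` are illustrative assumptions, not the paper's exact calibration): it maps the best objective value to a membership near 1 and the worst to near 0, with an S-shaped transition whose steepness models the vagueness in the parameters.

```python
import math

def logistic_membership(z, z_best, z_worst, gamma=13.8):
    """Logistic-type fuzzy membership: ~1 at the best objective value,
    ~0 at the worst, smooth S-shaped decay in between. gamma controls
    the steepness (degree of vagueness)."""
    t = (z - z_best) / (z_worst - z_best)        # normalize to [0, 1]
    return 1.0 / (1.0 + math.exp(gamma * (t - 0.5)))
```

In an FLP model this membership replaces a crisp objective bound: the solver maximizes the membership degree instead of enforcing a hard threshold.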

  3. Robust Feedback Control of Flow Induced Structural Radiation of Sound

    NASA Technical Reports Server (NTRS)

    Heatwole, Craig M.; Bernhard, Robert J.; Franchek, Matthew A.

    1997-01-01

    A significant component of the interior noise of aircraft and automobiles results from turbulent boundary layer excitation of the vehicle structure. In this work, active robust feedback control of the noise due to this non-predictable excitation is investigated. Both an analytical model and experimental investigations are used to determine the characteristics of the flow-induced structural sound radiation problem. The problem is shown to be broadband in nature, with large system uncertainties associated with the various operating conditions. Furthermore, the delay associated with sound propagation is shown to restrict the use of microphone feedback. The state-of-the-art control methodologies, IL synthesis and adaptive feedback control, are evaluated and shown to have limited success in solving this problem. A robust frequency-domain controller design methodology is developed for the problem of sound radiated from turbulent-flow-driven plates. The control design methodology uses frequency-domain sequential loop shaping techniques. System uncertainty, sound pressure level reduction performance, and actuator constraints are included in the design process. Using this design method, phase lag was added using non-minimum phase zeros so that the beneficial plant dynamics could be used. This general control approach has application to lightly damped vibration and sound radiation problems where high-bandwidth control objectives require a low controller DC gain and low controller order.

  4. Applying Lakatos' Theory to the Theory of Mathematical Problem Solving.

    ERIC Educational Resources Information Center

    Nunokawa, Kazuhiko

    1996-01-01

    The relation between Lakatos' theory and issues in mathematics education, especially mathematical problem solving, is investigated by examining Lakatos' methodology of a scientific research program. (AIM)

  5. Methodologies in Cultural-Historical Activity Theory: The Example of School-Based Development

    ERIC Educational Resources Information Center

    Postholm, May Britt

    2015-01-01

    Background and purpose: Relatively little research has been conducted on methodology within Cultural-Historical Activity Theory (CHAT). CHAT is mainly used as a framework for developmental processes. The purpose of this article is to discuss both focuses for research and research questions within CHAT and to outline methodologies that can be used…

  6. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    ERIC Educational Resources Information Center

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  7. Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies

    PubMed Central

    López, Julio

    2018-01-01

    We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data was collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections. PMID:29670667

  8. Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies.

    PubMed

    Bosch, Paul; Herrera, Mauricio; López, Julio; Maldonado, Sebastián

    2018-01-01

    We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data was collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections.
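    A numpy-only sketch of the feature construction described above: pairwise channel correlations form the feature vector, with class labels from task performance. A nearest-centroid rule stands in for the paper's SVM, and the synthetic "trials" are invented for illustration; none of the names below come from the study.

```python
import numpy as np

def correlation_features(eeg):
    """Upper triangle of the channel-correlation matrix as a feature
    vector; eeg has shape (n_channels, n_samples)."""
    corr = np.corrcoef(eeg)
    return corr[np.triu_indices_from(corr, k=1)]

def nearest_centroid_fit(X, y):
    # One centroid per class label in feature space.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Synthetic stand-in for two solving strategies: "coupled" trials have
# strongly synchronized channels, "uncoupled" trials do not.
rng = np.random.default_rng(0)

def make_trial(coupled):
    base = rng.standard_normal(500)
    noise = rng.standard_normal((4, 500))
    return base + 0.1 * noise if coupled else noise

X = np.array([correlation_features(make_trial(i < 10)) for i in range(20)])
y = np.array([0] * 10 + [1] * 10)
centroids = nearest_centroid_fit(X, y)
pred = nearest_centroid_predict(centroids, correlation_features(make_trial(True)))
```

With 4 channels the feature vector has 6 correlation entries; feature selection over such vectors is what determines the functional network size the abstract mentions.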

  9. On the Analysis of Two-Person Problem Solving Protocols.

    ERIC Educational Resources Information Center

    Schoenfeld, Alan H.

    Methodological issues in the use of protocol analysis for research into human problem solving processes are examined through a case study in which two students were videotaped as they worked together to solve mathematical problems "out loud." The students' chosen strategic or executive behavior in examining and solving a problem was…

  10. Problem? "No Problem!" Solving Technical Contradictions

    ERIC Educational Resources Information Center

    Kutz, K. Scott; Stefan, Victor

    2007-01-01

    TRIZ (pronounced TREES), the Russian acronym for the theory of inventive problem solving, enables a person to focus his attention on finding genuine, potential solutions in contrast to searching for ideas that "may" work through a happenstance way. It is a patent database-backed methodology that helps to reduce time spent on the problem,…

  11. Sequenced Integration and the Identification of a Problem-Solving Approach through a Learning Process

    ERIC Educational Resources Information Center

    Cormas, Peter C.

    2016-01-01

    Preservice teachers (N = 27) in two sections of a sequenced, methodological and process integrated mathematics/science course solved a levers problem with three similar learning processes and a problem-solving approach, and identified a problem-solving approach through one different learning process. Similar learning processes used included:…

  12. A TAPS Interactive Multimedia Package to Solve Engineering Dynamics Problem

    ERIC Educational Resources Information Center

    Sidhu, S. Manjit; Selvanathan, N.

    2005-01-01

    Purpose: To expose engineering students to using modern technologies, such as multimedia packages, to learn, visualize and solve engineering problems, such as in mechanics dynamics. Design/methodology/approach: A multimedia problem-solving prototype package is developed to help students solve an engineering problem in a step-by-step approach. A…

  13. Hardware proofs using EHDM and the RSRE verification methodology

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Sjogren, Jon A.

    1988-01-01

    Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and suggestions are made for improving both the RSRE methodology and the EHDM system.

  14. How Root Cause Analysis Can Improve the Value Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wixson, James Robert

    2002-05-01

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple, but rigorous, RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

  15. Helicopter-V/STOL dynamic wind and turbulence design methodology

    NASA Technical Reports Server (NTRS)

    Bailey, J. Earl

    1987-01-01

    Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940s and 1950s. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst type phenomena) during takeoff and landing remain major unsolved design problems, owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.

  16. Ambient Vibration Testing for Story Stiffness Estimation of a Heritage Timber Building

    PubMed Central

    Min, Kyung-Won; Kim, Junhee; Park, Sung-Ah; Park, Chan-Soo

    2013-01-01

    This paper investigates dynamic characteristics of a historic wooden structure by ambient vibration testing, presenting a novel estimation methodology of story stiffness for the purpose of vibration-based structural health monitoring. As for the ambient vibration testing, measured structural responses are analyzed by two output-only system identification methods (i.e., frequency domain decomposition and stochastic subspace identification) to estimate modal parameters. The proposed methodology estimates story stiffness based on an eigenvalue problem derived from a vibratory rigid body model. Using the identified natural frequencies, the eigenvalue problem is efficiently solved and uniquely yields story stiffness. It is noteworthy that application of the proposed methodology is not necessarily confined to the wooden structure exampled in the paper. PMID:24227999
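
    The eigenvalue-based stiffness recovery can be illustrated on a hypothetical two-story shear-building model. All masses and stiffnesses below are assumed values, not the heritage building's data, and the simple lumped model stands in for the paper's rigid body model:

```python
import numpy as np

# Toy two-story shear-building model with assumed masses (tonnes) and story
# stiffnesses (kN/m); these values are illustrative only.
m1, m2 = 10.0, 8.0
k1_true, k2_true = 5000.0, 4000.0

M = np.diag([m1, m2])
K = np.array([[k1_true + k2_true, -k2_true],
              [-k2_true,          k2_true]])

# Forward problem: squared natural frequencies are the eigenvalues of M^-1 K.
lam = np.sort(np.linalg.eigvals(np.linalg.inv(M) @ K).real)

# Inverse problem: recover k1, k2 from the identified eigenvalues via the
# trace/determinant identities of the 2-DOF system:
#   lam1 + lam2 = (k1 + k2)/m1 + k2/m2
#   lam1 * lam2 = k1 * k2 / (m1 * m2)
S = lam.sum()
P = lam.prod() * m1 * m2            # equals k1 * k2
roots = np.roots([1.0 / m1 + 1.0 / m2, -S, P / m1])  # quadratic in k2
k2 = max(roots)    # the larger root matches this configuration; in general,
k1 = P / k2        # mode-shape information is needed to pick the right root
print(k1, k2)
```

The recovered pair matches the assumed stiffnesses, mirroring how the paper turns identified natural frequencies into story stiffness.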

  17. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
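
    As a baseline for what these formulations improve on, the traditional nested (double-loop) RBDO can be sketched with a made-up limit state and a crude Monte Carlo reliability analysis (the dissertation's unilevel and decoupled methods exist precisely to avoid this expensive nesting):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy nested RBDO: choose the cheapest thickness d whose failure probability
# stays below 5%. Load distribution, capacity model, and cost (= d itself)
# are illustrative assumptions.
load = rng.normal(loc=50.0, scale=5.0, size=100_000)  # random load samples

def failure_prob(d):
    # Inner reliability loop: Monte Carlo estimate; capacity is 10 * d.
    return np.mean(10.0 * d < load)

# Outer design loop: cheapest feasible design on a coarse grid.
grid = np.arange(4.0, 8.0, 0.01)
d_star = min(d for d in grid if failure_prob(d) <= 0.05)
print(round(float(d_star), 2))
```

Every outer design candidate triggers a full inner reliability analysis, which is the computational cost the unilevel reformulation cuts by folding the inner optimality conditions into a single optimization problem.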

  18. Methodological reviews of economic evaluations in health care: what do they target?

    PubMed

    Hutter, Maria-Florencia; Rodríguez-Ibeas, Roberto; Antonanzas, Fernando

    2014-11-01

    An increasing number of published studies of economic evaluations of health technologies have been reviewed and summarized with different purposes, among them to facilitate decision-making processes. These reviews have covered different aspects of economic evaluations, using a variety of methodological approaches. The aim of this study is to analyze the methodological characteristics of the reviews of economic evaluations in health care, published during the period 1990-2010, to identify their main features and the potential missing elements. This may help to develop a common procedure for elaborating these kinds of reviews. We performed systematic searches in electronic databases (Scopus, Medline and PubMed) of methodological reviews published in English, period 1990-2010. We selected the articles whose main purpose was to review and assess the methodology applied in the economic evaluation studies. We classified the data according to the study objectives, period of the review, number of reviewed studies, methodological and non-methodological items assessed, medical specialty, type of disease and technology, databases used for the review and their main conclusions. We performed a descriptive statistical analysis and checked how generalizability issues were considered in the reviews. We identified 76 methodological reviews, 42 published in the period 1990-2001 and 34 during 2002-2010. The items assessed most frequently (by 70% of the reviews) were perspective, type of economic study, uncertainty and discounting. The reviews also described the type of intervention and disease, funding sources, country in which the evaluation took place, type of journal and author's characteristics. Regarding the intertemporal comparison, higher frequencies were found in the second period for two key methodological items: the source of effectiveness data and the models used in the studies. 
However, the generalizability issues that apparently are creating a growing interest in the economic evaluation literature did not receive as much attention in the reviews of the second period. The remaining items showed similar frequencies in both periods. Increasingly more reviews of economic evaluation studies aim to analyze the application of methodological principles, and offer summaries of papers classified by either diseases or health technologies. These reviews are useful for finding literature trends, aims of studies and possible deficiencies in the implementation of methods of specific health interventions. As no significant methodological improvement was clearly detected in the two periods analyzed, it would be convenient to pay more attention to the methodological aspects of the reviews.

  19. Investigating mode errors on automated flight decks: illustrating the problem-driven, cumulative, and interdisciplinary nature of human factors research.

    PubMed

    Sarter, Nadine

    2008-06-01

    The goal of this article is to illustrate the problem-driven, cumulative, and highly interdisciplinary nature of human factors research by providing a brief overview of the work on mode errors on modern flight decks over the past two decades. Mode errors on modern flight decks were first reported in the late 1980s. Poor feedback, inadequate mental models of the automation, and the high degree of coupling and complexity of flight deck systems were identified as main contributors to these breakdowns in human-automation interaction. Various improvements of design, training, and procedures were proposed to address these issues. The author describes when and why the problem of mode errors surfaced, summarizes complementary research activities that helped identify and understand the contributing factors to mode errors, and describes some countermeasures that have been developed in recent years. This brief review illustrates how one particular human factors problem in the aviation domain enabled various disciplines and methodological approaches to contribute to a better understanding of, as well as provide better support for, effective human-automation coordination. Converging operations and interdisciplinary collaboration over an extended period of time are hallmarks of successful human factors research. The reported body of research can serve as a model for future research and as a teaching tool for students in this field of work.

  20. Essentials of psychoanalytic process and change: how can we investigate the neural effects of psychodynamic psychotherapy in individualized neuro-imaging?

    PubMed Central

    Boeker, Heinz; Richter, André; Himmighoffen, Holger; Ernst, Jutta; Bohleber, Laura; Hofmann, Elena; Vetter, Johannes; Northoff, Georg

    2013-01-01

    The paper focuses on the essentials of psychoanalytic process and change and the question of how the neural correlates and mechanisms of psychodynamic psychotherapy can be investigated. The psychoanalytic approach aims at enabling the patient to “remember, repeat, and work through” concerning explicit memory. Moreover, the relationship between analyst and patient establishes a new affective configuration which enables a reconstruction of the implicit memory. If psychic change can be achieved it corresponds to neuronal transformation. Individualized neuro-imaging requires controlling and measuring of variables that must be defined. Two main methodological problems can be distinguished: the design problem addresses the issue of how to account for functionally related variables in an experimentally independent way. The translation problem raises the question of how to bridge the gaps between different levels of the concepts presupposed in individualized neuro-imaging (e.g., the personal level of the therapist and the client, the neural level of the brain). An overview of individualized paradigms, which have been used until now is given, including Operationalized Psychodynamic Diagnosis (OPD-2) and the Maladaptive Interpersonal Patterns Q-Start (MIPQS). The development of a new paradigm that will be used in fMRI experiments, the “Interpersonal Relationship Picture Set” (IRPS), is described. Further perspectives and limitations of this new approach concerning the design and the translation problem are discussed. PMID:23935571

  1. Malnutrition in healthcare institutions: a review of the prevalence of under-nutrition in hospitals and care homes since 1994 in England.

    PubMed

    Ray, Sumantra; Laur, Celia; Golubic, Rajna

    2014-10-01

    One in four hospital patients in the UK is estimated to be affected by 'hospital malnutrition' (under-nutrition). There is a need for robust epidemiological data relating to the frequency, distribution and determinants of this clinical problem of public health importance. This review aims to undertake a narrative synthesis of data on the descriptive epidemiology of under-nutrition, and to address some of the methodological limitations. A methodical review of literature was undertaken, tracking the reported prevalence and incidence of under-nutrition in hospital, in the UK, since 1994. The 16 articles retrieved and reviewed demonstrate that under-nutrition is a long-standing problem in UK hospitals and care homes. The existing literature is comprised mainly of cross-sectional surveys describing the prevalence of under-nutrition in hospital, which ranges from 11 to 45%. There is considerable heterogeneity in the published literature on hospital malnutrition (under-nutrition) and very few studies either measure or have estimated incidence. Under-nutrition in hospital continues to be under-addressed, yet a major public health problem in the UK. Defining the descriptive epidemiology of this problem is one of the first steps towards understanding its aetiology or planning and evaluating appropriate prevention or treatment strategies. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  2. Consumers’ Loyalty Related to Labor Inclusion of People with Disabilities

    PubMed Central

    González, Marta; Luis Fernández, José

    2016-01-01

    Purpose: The purpose of this paper is to show that reporting the corporate commitment to the labor inclusion of people with disabilities correlates with an increase in consumer loyalty. Methodology: It is a theoretical revision that relates consumer loyalty to three main topics: disability and labor exclusion, responsible consumerism toward disability, and corporate communication to increase the loyalty of those consumers who are concerned about this problem. Findings:
    • Disability is an invisible phenomenon that concerns the whole of human society, so the exclusion of this collective appears as a great social problem that companies might address in order to be perceived as responsible.
    • Responsible companies are rewarded with the loyalty of consumers.
    • Clear corporate information about the commitment to this problem will reinforce loyalty toward the brand.
    • This information can be given in an informal way or by following a certification process. The impact of those methods will depend on how disability is understood by each consumer.
    Originality/value: This paper focuses on a topic usually neglected by companies and even by the literature. However, the fact that more and more companies are paying attention to this problem allows us to think that we are facing a social change that will challenge companies. PMID:27445880

  3. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays

    PubMed Central

    Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in graphs by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
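
    The distance-to-graph conversion at the heart of this methodology can be sketched as follows. The word-frequency profiles are random toy data, and the modularity-maximizing memetic step (iMA-Net) itself is not reproduced:

```python
import numpy as np

def js_distance(p, q, eps=1e-12):
    # Jensen-Shannon distance (square root of the JS divergence, natural
    # log): a bounded metric on word-frequency profiles.
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return np.sqrt(max(0.5 * kl(p, m) + 0.5 * kl(q, m), 0.0))

# Toy corpus: random word-count profiles for four hypothetical plays
# (the real input is ~55,114 word frequencies across 168 plays).
rng = np.random.default_rng(2)
plays = rng.random((4, 100))

n = len(plays)
D = np.array([[js_distance(plays[i], plays[j]) for j in range(n)]
              for i in range(n)])
# Proximity graph: keep edges below a distance threshold; a community
# detection step would then maximize modularity on this graph.
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if D[i, j] < 0.25]
print(D.round(3), edges)
```

The resulting distance matrix is symmetric, zero on the diagonal, and bounded by sqrt(ln 2), which is what makes thresholding into a proximity graph well behaved.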

  4. [Problem-based learning, a strategy to employ it].

    PubMed

    Guillamet Lloveras, Ana; Celma Vicente, Matilde; González Carrión, Pilar; Cano-Caballero Gálvez, Ma Dolores; Pérez Ramírez, Francisca

    2009-02-01

    The Virgen de las Nieves University School of Nursing has adopted the methodology of Problem-Based Learning (ABP in Spanish acronym) as a supplementary method to gain specific transversal competencies. In so doing, all basic required/obligatory subjects necessary for a degree have been partially affected. With the objective of identifying and administering all the structural and cultural barriers which could impede the success or effectiveness of its adoption, a strategic analysis at the School was carried out. This technique was based on a) knowing the strong and weak points the School has for adopting the Problem-Based Learning methodology; b) describing the structural problems and necessities to carry out this teaching innovation; c) to discover the needs professors have regarding knowledge and skills related to Problem-Based Learning; d) to prepare students by informing them about the characteristics of Problem-Based Learning; e) to evaluate the results obtained by means of professor and student opinions, f) to adopt the improvements identified. The stages followed were: strategic analysis, preparation, pilot program, adoption and evaluation.

  5. A Nursing Process Methodology.

    ERIC Educational Resources Information Center

    Ryan-Wenger, Nancy M.

    1990-01-01

    A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)

  6. Software production methodology tested project

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.

  7. STUDYING FOREST ROOT SYSTEMS - AN OVERVIEW OF METHODOLOGICAL PROBLEMS

    EPA Science Inventory

    The study of tree root systems is central to understanding forest ecosystem carbon and nutrient cycles, nutrient and water uptake, C allocation patterns by trees, soil microbial populations, adaptation of trees to stress, soil organic matter production, etc. Methodological probl...

  8. Combining morphometric features and convolutional networks fusion for glaucoma diagnosis

    NASA Astrophysics Data System (ADS)

    Perdomo, Oscar; Arevalo, John; González, Fabio A.

    2017-11-01

    Glaucoma is an eye condition that leads to loss of vision and blindness. The ophthalmoscopy exam evaluates the shape, color and proportion between the optic disc and physiologic cup, but the lack of agreement among experts is still the main diagnosis problem. The application of deep convolutional neural networks combined with automatic extraction of features such as the cup-to-disc distance in the four quadrants, the perimeter, area, eccentricity, the major radius and the minor radius of the optic disc and cup, in addition to all the ratios among the previous parameters, may help with a better automatic grading of glaucoma. This paper presents a strategy to merge morphological features and deep convolutional neural networks as a novel methodology to support glaucoma diagnosis in eye fundus images.

  9. Supplier Selection based on the Performance by using PROMETHEE Method

    NASA Astrophysics Data System (ADS)

    Sinaga, T. S.; Siregar, K.

    2017-03-01

    Generally, companies face the problem of identifying vendors that can provide excellent service in the availability of raw materials and on-time delivery. The performance of a company's suppliers has to be monitored to ensure their ability to fulfill the company's needs. This research is intended to explain how to assess suppliers to improve manufacturing performance. The criteria considered in evaluating suppliers are those of Dickson. There are four main criteria, further split into seven sub-criteria: compliance with accuracy, consistency, on-time delivery, right quantity of order, flexibility and negotiation, timeliness of order confirmation, and responsiveness. This research uses the PROMETHEE methodology to assess supplier performance and to select the best supplier, as shown by the degree of preference in the pairwise comparison between suppliers.
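
    A minimal PROMETHEE II computation, with illustrative supplier scores, assumed criterion weights, and the usual (strict-dominance) preference function rather than the study's actual data:

```python
import numpy as np

# Rows: suppliers A..C; columns: criteria scores (higher is better).
# Scores and weights are made-up values for illustration.
scores = np.array([[8.0, 7.0, 9.0],
                   [6.0, 9.0, 7.0],
                   [9.0, 6.0, 6.0]])
weights = np.array([0.5, 0.3, 0.2])

n = len(scores)
# Usual preference function: P(a,b) = 1 if a beats b on the criterion.
pref = (scores[:, None, :] > scores[None, :, :]).astype(float)
# Aggregated preference index pi(a,b), then leaving/entering flows.
pi = (pref * weights).sum(axis=2)
phi_plus = pi.sum(axis=1) / (n - 1)
phi_minus = pi.sum(axis=0) / (n - 1)
phi_net = phi_plus - phi_minus          # PROMETHEE II net flow
ranking = np.argsort(phi_net)[::-1]     # best supplier first
print(phi_net.round(3), ranking)
```

Here supplier A wins on net flow, supplier C is second; the net flows always sum to zero, a standard sanity check on a PROMETHEE II implementation.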

  10. Estimating economic thresholds for pest control: an alternative procedure.

    PubMed

    Ramirez, O A; Saunders, J L

    1999-04-01

    An alternative methodology to determine profit maximizing economic thresholds is developed and illustrated. An optimization problem based on the main biological and economic relations involved in determining a profit maximizing economic threshold is first advanced. From it, a more manageable model of two nonsimultaneous reduced-form equations is derived, which represents a simpler but conceptually and statistically sound alternative. The model recognizes that yields and pest control costs are a function of the economic threshold used. Higher (less strict) economic thresholds can result in lower yields and, therefore, a lower gross income from the sale of the product, but could also be less costly to maintain. The highest possible profits will be obtained by using the economic threshold that results in a maximum difference between gross income and pest control cost functions.
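
    The reduced-form idea can be illustrated with a toy profit function. Both functional forms and coefficients below are assumptions for illustration, not the paper's estimated relations:

```python
import numpy as np

# Profit(T) = price * yield(T) - cost(T), where a higher (less strict)
# threshold T lowers yield but also lowers control cost.
price = 2.0
T = np.linspace(0.1, 10.0, 1000)        # candidate thresholds (pests/plant)
yield_ = 100.0 * np.exp(-0.05 * T)      # gross yield declines with T
cost = 40.0 / T                         # stricter thresholds cost more
profit = price * yield_ - cost
t_star = T[np.argmax(profit)]           # profit-maximizing threshold
print(round(float(t_star), 2))
```

The maximizer sits where marginal yield loss equals marginal cost savings, which is exactly the trade-off the abstract describes between gross income and pest control cost.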

  11. A linguistic geometry for 3D strategic planning

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1995-01-01

    This paper is a new step in the development and application of the Linguistic Geometry. This formal theory is intended to discover the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and apply them to different systems. In this paper we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing Linguistic Geometry tools the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are shown in this paper on the new pilot example of the solution of the extremely complex 3D optimization problem of strategic planning for the space combat of autonomous vehicles. This example demonstrates deep and highly selective search in comparison with conventional search algorithms.

  12. [Femicide Across Europe COST Action, a transnational cooperation network for the study of and approach to femicide in Europe].

    PubMed

    Sanz-Barbero, Belén; Otero-García, Laura; Boira, Santiago; Marcuello, Chaime; Vives Cases, Carmen

    2016-01-01

    Femicide or the murder of women because of their gender is a recognised public health problem as well as a serious violation of human rights. Its magnitude worldwide is still unknown, given the methodological difficulties to differentiate these murders from other female homicides. The European Union programme entitled «European Cooperation in Science and Technology» (COST) launched the «Femicide across Europe» COST Action in 2013, establishing an optimal European framework for transnational cooperation among experts addressing great social and public health challenges such as femicide. This field note describes the main objectives, the participating groups of experts and the mid-term results of this experience. Copyright © 2016 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.

  13. A Scientific Approach to the Investigation on Anomalous Atmospheric Light Phenomena

    NASA Astrophysics Data System (ADS)

    Teodorani, M.

    2011-12-01

    Anomalous atmospheric light phenomena tend to occur recurrently in several places of our planet. Statistical studies show that a phenomenon's real recurrence area can be identified only after weighting reported cases by population size and by the diffusion of communication media. The main scientific results that have been obtained so far from explorative instrumented missions are presented, including the empirical models that have been set up in order to describe the observed reality. Subsequently, a focused theorization is discussed in order to attack the physical problem concerning the structure and the dynamics of "light balls" and the enigma related to the central force that maintains them in spherical shape. Finally, several important issues are discussed regarding methodology, strategy, tactics and interdisciplinary approaches.

  14. [Risk Management: concepts and chances for public health].

    PubMed

    Palm, Stefan; Cardeneo, Margareta; Halber, Marco; Schrappe, Matthias

    2002-01-15

    Errors are a common problem in medicine and occur as a result of a complex process involving many contributing factors. Medical errors significantly reduce the safety margin for the patient and contribute additional costs in health care delivery. In most cases adverse events cannot be attributed to a single underlying cause. Therefore an effective risk management strategy must follow a system approach, which is based on counting and analysis of near misses. The development of defenses against the undesired effects of errors should be the main focus rather than asking the question "Who blundered?". Analysis of near misses (which in this context can be compared to indicators) offers several methodological advantages as compared to the analysis of errors and adverse events. Risk management is an integral element of quality management.

  15. A new algorithm for epilepsy seizure onset detection and spread estimation from EEG signals

    NASA Astrophysics Data System (ADS)

    Quintero-Rincón, Antonio; Pereyra, Marcelo; D'Giano, Carlos; Batatia, Hadj; Risk, Marcelo

    2016-04-01

    Appropriate diagnosis and treatment of epilepsy is a main public health issue. Patients suffering from this disease often exhibit different physical characterizations, which result from the synchronous and excessive discharge of a group of neurons in the cerebral cortex. Extracting this information using EEG signals is an important problem in biomedical signal processing. In this work we propose a new algorithm for seizure onset detection and spread estimation in epilepsy patients. The algorithm is based on a multilevel 1-D wavelet decomposition that captures the physiological brain frequency signals coupled with a generalized Gaussian model. Preliminary experiments with signals from 30 epileptic seizures in 11 subjects suggest that the proposed methodology is a powerful tool for detecting the onset of epileptic seizures and their spread across the brain.
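
    A minimal version of the decomposition step, using a Haar wavelet as a stand-in for the paper's wavelet family and a simple variance ratio in place of the generalized Gaussian model; the signal is simulated, not patient data:

```python
import numpy as np

def haar_multilevel(x, levels):
    # Multilevel 1-D Haar decomposition: returns the detail coefficients
    # per level plus the final approximation.
    details, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        pairs = a[: len(a) // 2 * 2].reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))
        a = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
    return details, a

# Toy "EEG": noise plus a burst of fast oscillation in the second half,
# a crude stand-in for a seizure onset.
rng = np.random.default_rng(3)
sig = rng.standard_normal(1024)
sig[512:] += 4.0 * np.sin(np.linspace(0.0, 200.0 * np.pi, 512))

details, _ = haar_multilevel(sig, levels=3)
d1 = details[0]                      # finest scale captures fast activity
first, second = d1[:256], d1[256:]   # pre- vs. post-onset halves
print(first.var(), second.var())
```

The jump in finest-scale detail variance marks the onset; the paper instead models the coefficient distributions with a generalized Gaussian, which is more robust than a raw variance ratio.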

  16. Review of interdisciplinary devices for detecting the quality of ship ballast water.

    PubMed

    Bakalar, Goran

    2014-01-01

    The results of ship ballast water treatment system neutralization need to be verified in a transparent and trustworthy way before the ship enters a port. Some research results, explained in this article, confirm the need for good verification. Without an agreed verification methodology, it cannot be accepted that the BWMC (Ballast Water Management Convention, 2004) protects the sea environment in the full sense. The main problem of ballast neutralization is the remaining microorganisms (algae blooms, bacteria) ≥10 and <50. The autonomy of a future ballast water detection device is explained and the newest detection methods are analyzed. The ranking analysis was done through PROMETHEE II (Preference Ranking Organization Method for Enrichment Evaluations) and the results were shown as D-Sight software projections.

  17. Project management practices in engineering university

    NASA Astrophysics Data System (ADS)

    Sirazitdinova, Y.; Dulzon, A.; Mueller, B.

    2015-10-01

    The article presents the analysis of usage of project management methodology in Tomsk Polytechnic University, in particular the experience with the course Project management which started 15 years ago. The article presents the discussion around advantages of project management methodology for engineering education and administration of the university in general and the problems impeding extensive implementation of this methodology in teaching, research and management in the university.

  18. Theoretical investigation of the forced and dynamically coupled torsional-axial-lateral dynamic response of geared rotors

    NASA Technical Reports Server (NTRS)

    David, J. W.; Mitchell, L. D.

    1982-01-01

    Difficulties arise in the solution methodology used to deal with the potentially highly nonlinear rotor equations when dynamic coupling is included. A solution methodology is selected to solve the nonlinear differential equations. The selected method was verified to give good results even at large nonlinearity levels. The transfer matrix methodology is extended to the solution of nonlinear problems.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krummel, J.R.; Markin, J.B.; O'Neill, R.V.

    Regional analyses of the interaction between human populations and natural resources must integrate landscape scale environmental problems. An approach that considers human culture, environmental processes, and resource needs offers an appropriate methodology. With this methodology, we analyze problems of food availability in African cattle-keeping societies. The analysis interrelates cattle biomass, forage availability, milk and blood production, crop yields, gathering, food subsidies, population, and variable precipitation. While an excess of cattle leads to overgrazing, cattle also serve as valuable food storage mechanisms during low rainfall periods. Food subsidies support higher population levels but do not alter drought-induced population fluctuations. Variable precipitation patterns require solutions that stabilize year-to-year food production and also address problems of overpopulation.

  20. Automating the packing heuristic design process with genetic programming.

    PubMed

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
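The core idea of treating a packing heuristic as an evolvable scoring function can be sketched as follows. The two scoring functions here are hand-written stand-ins, not heuristics evolved by the paper's genetic programming system; a GP system would search the space of such functions automatically:

```python
# A heuristic scores each (free_space, item) pair; the item goes into the
# feasible bin with the highest score, or a new bin if none fits.
def pack(items, capacity, score):
    bins = []                              # each entry = remaining free space
    for item in items:
        feasible = [i for i, free in enumerate(bins) if free >= item]
        if feasible:
            best = max(feasible, key=lambda i: score(bins[i], item))
            bins[best] -= item
        else:
            bins.append(capacity - item)   # no open bin fits: open a new one
    return len(bins)

best_fit = lambda free, item: -(free - item)    # prefer the tightest fit
worst_fit = lambda free, item: free - item      # prefer the loosest fit

items = [6, 5, 4, 3, 2]
print(pack(items, capacity=10, score=best_fit))    # fewer bins
print(pack(items, capacity=10, score=worst_fit))   # more bins
```

Because only the scoring function changes between problem variants, the surrounding packing loop stays fixed, which is what gives the approach its generality.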

  1. Education in Environmental Chemistry: Setting the Agenda and Recommending Action. A Workshop Report Summary

    NASA Astrophysics Data System (ADS)

    Zoller, Uri

    2005-08-01

    Worldwide, the essence of the current reform in science education is a paradigm shift from algorithmic, lower-order cognitive skills (LOCS) teaching to higher-order cognitive skills (HOCS) learning. In the context of education in environmental chemistry (EEC), the ultimate goal is to educate students to be science technology environment society (STES)-literate, capable of evaluative thinking, decision making, problem solving and taking responsible action accordingly. Educators need to translate this goal into effective courses that can be implemented: this includes developing teaching strategies and assessment methodologies that are consonant with the goal of HOCS learning. An international workshop—"Environmental Chemistry Education in Europe: Setting the Agenda"—yielded two main recommendations for those undertaking educational reform in science education, particularly to promote meaningful EEC. The first recommendation concerns integration of environmental sciences into core chemistry courses as well as the development and implementation of HOCS-promoting teaching strategies and assessment methodologies in chemical education. The second emphasizes the development of students' HOCS for transfer, followed by performance assessment of HOCS. This requires changing the way environmental chemistry is typically taught, moving from a narrowly focused approach (applied analytical, ecotoxicological, or environmental engineering chemistry) to an interdisciplinary and multidisciplinary approach.

  2. Feature binding and attention in working memory: a resolution of previous contradictory findings.

    PubMed

    Allen, Richard J; Hitch, Graham J; Mate, Judit; Baddeley, Alan D

    2012-01-01

    We aimed to resolve an apparent contradiction between previous experiments from different laboratories, using dual-task methodology to compare effects of a concurrent executive load on immediate recognition memory for colours or shapes of items or their colour-shape combinations. Results of two experiments confirmed previous evidence that an irrelevant attentional load interferes equally with memory for features and memory for feature bindings. Detailed analyses suggested that previous contradictory evidence arose from limitations in the way recognition memory was measured. The present findings are inconsistent with an earlier suggestion that feature binding takes place within a multimodal episodic buffer (Baddeley, 2000) and support a subsequent account in which binding takes place automatically prior to information entering the episodic buffer (Baddeley, Allen, & Hitch, 2011). Methodologically, the results suggest that different measures of recognition memory performance (A', d', corrected recognition) give a converging picture of main effects, but are less consistent in detecting interactions. We suggest that this limitation on the reliability of measuring recognition should be taken into account in future research so as to avoid problems of replication that turn out to be more apparent than real.

  3. A Methodological Conundrum: Comparing Schools in Scotland and England

    ERIC Educational Resources Information Center

    Marshall, Bethan; Gibbons, Simon

    2015-01-01

    This article considers a conundrum in research methodology: the fact that, in the main, you have to use a social science-based research methodology if you want to look at what goes on in a classroom. This article proposes an alternative arts-based research method instead, based on the work of Eisner and, before him, Dewey, where one can use the more…

  4. The Dogma of "The" Scientific Method.

    ERIC Educational Resources Information Center

    Wivagg, Dan; Allchin, Douglas

    2002-01-01

    Points out major problems with the scientific method as a model for learning about methodology in science and suggests teaching about the scientists' toolbox to remedy problems with the conventional scientific method. (KHR)

  5. Fuzzy multi objective transportation problem – evolutionary algorithm approach

    NASA Astrophysics Data System (ADS)

    Karthy, T.; Ganesan, K.

    2018-04-01

    This paper deals with the fuzzy multi-objective transportation problem. A fuzzy optimal compromise solution is obtained by using a fuzzy genetic algorithm. A numerical example is provided to illustrate the methodology.

  6. Performance analysis of complex repairable industrial systems using PSO and fuzzy confidence interval based methodology.

    PubMed

    Garg, Harish

    2013-03-01

    The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find the most optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs by utilizing available resources and uncertain data. For this, an availability-cost optimization model has been constructed for determining the optimal design parameters for improving the system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters, which affect the system performance, are obtained in the form of fuzzy membership functions by the proposed confidence interval based fuzzy Lambda-Tau (CIBFLT) methodology. The computed results by CIBFLT are compared with the existing fuzzy Lambda-Tau methodology. Sensitivity analysis on the system MTBF has also been addressed. The methodology has been illustrated through a case study of the washing unit, a main part of the paper industry. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
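A hedged sketch of the kind of fuzzy arithmetic involved: alpha-cuts of triangular fuzzy MTBF and MTTR propagated into a fuzzy steady-state availability interval. The numbers are invented (not the washing-unit data), and this is only the interval-propagation step, not the full Lambda-Tau formulation:

```python
# Alpha-cut propagation of triangular fuzzy numbers (TFNs) into
# availability A = MTBF / (MTBF + MTTR).
def alpha_cut(tfn, alpha):
    a, m, b = tfn                       # TFN as (low, mode, high)
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzy_availability(mtbf, mttr, alpha):
    lo_f, hi_f = alpha_cut(mtbf, alpha)
    lo_r, hi_r = alpha_cut(mttr, alpha)
    # A increases with MTBF and decreases with MTTR, so the interval ends are:
    return (lo_f / (lo_f + hi_r), hi_f / (hi_f + lo_r))

mtbf = (90.0, 100.0, 110.0)    # hours, illustrative +/-10% spread
mttr = (4.0, 5.0, 6.0)

for alpha in (0.0, 0.5, 1.0):
    lo, hi = fuzzy_availability(mtbf, mttr, alpha)
    print(f"alpha={alpha}: availability in [{lo:.4f}, {hi:.4f}]")
```

At alpha = 1 the interval collapses to the crisp value computed from the modes; lower alpha levels widen the interval, which is how uncertainty in component data shows up as a fuzzy membership function for the system parameter.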

  7. A network-based analysis of CMIP5 "historical" experiments

    NASA Astrophysics Data System (ADS)

    Bracco, A.; Foudalis, I.; Dovrolis, C.

    2012-12-01

    In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing investigation of local and non-local statistical interactions, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation model (GCM) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools so far exploited only tentatively in climate research. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how well each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
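A toy illustration of the correlation-network construction underlying such climate network analyses. The series are synthetic, not CMIP5 output, and real studies use lag-aware similarity measures and significance testing rather than a fixed threshold:

```python
# Build a network whose nodes are "grid points" and whose edges link
# time series with Pearson correlation above a threshold.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def build_network(series, threshold=0.8):
    nodes = list(series)
    edges = [(a, b) for i, a in enumerate(nodes) for b in nodes[i + 1:]
             if abs(pearson(series[a], series[b])) > threshold]
    degree = {n: sum(n in e for e in edges) for n in nodes}
    return edges, degree

t = list(range(12))
series = {
    "p1": [math.sin(i / 2) for i in t],
    "p2": [math.sin(i / 2) + 0.05 * (-1) ** i for i in t],   # nearly p1
    "p3": [math.cos(i / 2) for i in t],                      # out of phase
}
edges, degree = build_network(series)
print(edges, degree)
```

Network metrics such as node degree then summarize how strongly each location is tied to the rest of the system, which is the basis for comparing teleconnection patterns across models and reanalyses.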

  8. Global Change adaptation in water resources management: the Water Change project.

    PubMed

    Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine

    2012-12-01

    In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess Global Change impacts on water resources, thus helping river basin agencies and water companies in their long-term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define strategies of adaptation. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models and performing a cost-benefit analysis to define optimal strategies of adaptation. The methodology is supported by a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among themselves automatically. This enables simulation of the interactions among multiple components of the water cycle and rapid execution of a large number of Global Change scenarios. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Geodiametris: an integrated geoinformatic approach for monitoring land pollution from the disposal of olive oil mill wastes

    NASA Astrophysics Data System (ADS)

    Alexakis, Dimitrios D.; Sarris, Apostolos; Papadopoulos, Nikos; Soupios, Pantelis; Doula, Maria; Cavvadias, Victor

    2014-08-01

    The olive-oil industry is one of the most important sectors of agricultural production in Greece, the third-largest olive-oil-producing country worldwide. Olive oil mill wastes (OOMW) constitute a major pollution factor in olive-growing regions and an important problem to be solved for the agricultural industry. Olive-oil mill wastes are normally deposited in tanks, directly in the soil, or even in adjacent torrents, rivers and lakes, posing a high risk of environmental pollution and harm to community health. The GEODIAMETRIS project aspires to develop integrated geoinformatic methodologies for monitoring land pollution from the disposal of OOMW on the island of Crete, Greece. These methodologies integrate GPS surveys, satellite remote sensing and risk assessment analysis in a GIS environment, application of in situ and laboratory geophysical methodologies, as well as soil and water physico-chemical analysis. Concerning the project's preliminary results, all the operating OOMW areas located in Crete have already been registered through extensive GPS field campaigns. Their spatial and attribute information has been stored in an integrated GIS database, and an overall OOMW spectral signature database has been constructed through the analysis of multi-temporal Landsat-8 OLI satellite images. In addition, a specific OOMW area located in Alikianos village (Chania, Crete) has been selected as one of the main case study areas. Various geophysical methodologies, such as Electrical Resistivity Tomography, Induced Polarization, multifrequency electromagnetics, Self Potential measurements and Ground Penetrating Radar, have already been implemented. Soil as well as liquid samples have been collected for physico-chemical analysis. The preliminary results have already contributed to the gradual development of an integrated environmental monitoring tool for studying and understanding environmental degradation from the disposal of OOMW.

  10. [Survey on avoidable blindness and visual impairment in Panama].

    PubMed

    López, Maritza; Brea, Ileana; Yee, Rita; Yi, Rodolfo; Carles, Víctor; Broce, Alberto; Limburg, Hans; Silva, Juan Carlos

    2014-12-01

    Determine prevalence of blindness and visual impairment in adults aged ≥ 50 years in Panama, identify their main causes, and characterize eye health services. Cross-sectional population study using standard Rapid Assessment of Avoidable Blindness methodology. Fifty people aged ≥ 50 years were selected from each of 84 clusters chosen through representative random sampling of the entire country. Visual acuity was assessed using a Snellen chart; lens and posterior pole status were assessed by direct ophthalmoscopy. Cataract surgery coverage was calculated and its quality assessed, along with causes of visual acuity < 20/60 and barriers to access to surgical treatment. A total of 4 125 people were examined (98.2% of the calculated sample). Age- and sex-adjusted prevalence of blindness was 3.0% (95% CI: 2.3-3.6). The main cause of blindness was cataract (66.4%), followed by glaucoma (10.2%). Cataract (69.2%) was the main cause of severe visual impairment and uncorrected refractive errors were the main cause of moderate visual impairment (60.7%). Surgical cataract coverage in individuals was 76.3%. Of all eyes operated for cataract, 58.0% achieved visual acuity ≤ 20/60 with available correction. Prevalence of blindness in Panama is in line with average prevalence found in other countries of the Region. This problem can be reduced, since 76.2% of cases of blindness and 85.0% of cases of severe visual impairment result from avoidable causes.
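For illustration, a simple prevalence estimate with a normal-approximation 95% CI. The case count is invented (chosen only to land near the reported point estimate), and the survey's actual interval is wider because it adjusts for age, sex and the cluster sampling design:

```python
# Crude prevalence with a Wald (normal-approximation) confidence interval.
import math

def prevalence_ci(cases, n, z=1.96):
    p = cases / n
    se = math.sqrt(p * (1 - p) / n)    # ignores the cluster design effect
    return p, (p - z * se, p + z * se)

p, (lo, hi) = prevalence_ci(cases=124, n=4125)
print(f"prevalence = {100 * p:.1f}% (95% CI: {100 * lo:.1f}-{100 * hi:.1f})")
```

RAAB analyses inflate this standard error by a design effect to account for within-cluster similarity, which is why the published interval (2.3-3.6) is wider than this naive one.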

  11. Pedagogy and/or technology: Making difference in improving students' problem solving skills

    NASA Astrophysics Data System (ADS)

    Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.

    2013-01-01

    Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.

  12. GIS and Multi-criteria evaluation (MCE) for landform geodiversity assessment

    NASA Astrophysics Data System (ADS)

    Najwer, Alicja; Reynard, Emmanuel; Zwoliński, Zbigniew

    2014-05-01

    In geomorphology, at the contemporary stage of methodological development, it is very important to take up new research problems from both theoretical and applied points of view. One example of applying geoconservation results in landscape studies and environmental conservation is the problem of landform geodiversity. The concept of geodiversity was created relatively recently and, therefore, little progress has been made in its objective assessment and mapping. In order to ensure clarity and coherency, it is recommended that the evaluation process be rigorous. Multi-criteria evaluation meets these criteria well. The main objective of this presentation is to demonstrate a new methodology for the assessment of selected natural environment components in response to the definition of geodiversity, as well as visualization of landform geodiversity, using the opportunities offered by the geoinformation environment. The study area consists of two peculiar alpine valleys, Illgraben and Derborence, located in the Swiss Alps. Apart from glacial and fluvial landforms, the morphology of these two sites is largely due to extreme phenomena (rockslides, torrential processes). Both valleys are recognized as geosites of national importance. The basis of the assessment is the selection of geographical environment features. Firstly, six factor maps were prepared for each area: landform energy, landform fragmentation, contemporary landform preservation, geological settings and hydrographic elements (lakes and streams). Input maps were then standardized, and map algebra operations were carried out by multi-criteria evaluation (MCE) with the GIS-based Weighted Sum technique. Weights for particular classes were calculated using the pairwise-comparison matrix method. The final stage of deriving landform geodiversity maps was a reclassification procedure using the natural breaks method.
    The final maps of landform geodiversity were generated with the same methodological algorithm, multiplying each factor map by its given weight, with a consistency ratio of 0.07. However, the results obtained were radically different. The map of geodiversity for Derborence is characterized by much greater fragmentation, and areas of low geodiversity constitute a greater share. In the Illgraben site, there is a significant contribution of high and very high geodiversity classes. The obtained maps were reviewed during field exploration with positive results, which gives a basis to conclude that the methodology used is correct and can be applied to other similar areas. It is therefore very important to develop an objective methodology that can be implemented for areas at the local and regional scale, while also giving satisfactory results for areas with a landscape different from the alpine one. The maps of landform geodiversity may be used for environmental conservation management, preservation of specific features within a geosite perimeter, spatial planning or tourism management.
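The weighted-sum MCE step can be sketched on toy factor "rasters". The 3x3 grids and the weights below are invented for illustration (standardized to 0-1, as the abstract describes), not the study's data:

```python
# GIS-style Weighted Sum over standardized factor maps, followed by a
# simple fixed-break reclassification into low/medium/high classes.
def weighted_sum(factors, weights):
    assert abs(sum(weights) - 1.0) < 1e-9        # weights must sum to 1
    rows, cols = len(factors[0]), len(factors[0][0])
    return [[sum(w * f[r][c] for w, f in zip(weights, factors))
             for c in range(cols)] for r in range(rows)]

def reclassify(grid, breaks=(0.33, 0.66)):       # classes 1 (low) .. 3 (high)
    return [[sum(v > b for b in breaks) + 1 for v in row] for row in grid]

relief   = [[0.2, 0.5, 0.9], [0.1, 0.4, 0.8], [0.0, 0.3, 0.7]]
fragment = [[0.3, 0.6, 0.9], [0.2, 0.5, 0.8], [0.1, 0.4, 0.7]]
hydro    = [[0.0, 0.0, 1.0], [0.0, 0.5, 1.0], [0.0, 0.5, 1.0]]

score = weighted_sum([relief, fragment, hydro], weights=(0.5, 0.3, 0.2))
print(reclassify(score))
```

In the study the weights come from pairwise-comparison matrices and the class breaks from the natural breaks method; fixed breaks are used here only to keep the sketch short.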

  13. Evaluating Writing Programs: Paradigms, Problems, Possibilities.

    ERIC Educational Resources Information Center

    McLeod, Susan H.

    1992-01-01

    Describes two methodological approaches (qualitative and quantitative) that grow out of two different research examples. Suggests the problems these methods present. Discusses the ways in which an awareness of these problems can help teachers to understand how to work with researchers in designing useful evaluations of writing programs. (PRA)

  14. Event- and interval-based measurement of stuttering: a review.

    PubMed

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement; to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement; and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge agreement values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results.
    An interval duration of 5 s appears to give acceptable agreement. Explanations for the high reproducibility values, as well as the parameters chosen to report those data, are discussed. Both interval- and event-based methodologies used trained or experienced judges for inter- and intra-judge determination, and the data were beyond the references for good reproducibility values. Inter- and intra-judge values were reported on different metric scales across event- and interval-based studies, making it unfeasible to quantify the agreement between the two methods. © 2014 Royal College of Speech and Language Therapists.
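The agreement statistics discussed can be illustrated with two hypothetical judges rating the same ten intervals (1 = interval contains stuttering, 0 = fluent); percent agreement is shown alongside Cohen's kappa as one common chance-corrected alternative:

```python
# Inter-judge agreement on interval-based judgements (invented ratings).
def percent_agreement(j1, j2):
    return sum(a == b for a, b in zip(j1, j2)) / len(j1)

def cohens_kappa(j1, j2):
    n = len(j1)
    po = percent_agreement(j1, j2)            # observed agreement
    p1 = sum(j1) / n                          # judge 1's "stuttered" rate
    p2 = sum(j2) / n
    pe = p1 * p2 + (1 - p1) * (1 - p2)        # agreement expected by chance
    return (po - pe) / (1 - pe)

judge1 = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
judge2 = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
print(percent_agreement(judge1, judge2))
print(round(cohens_kappa(judge1, judge2), 3))
```

The gap between raw agreement and kappa is one reason the review notes that reporting agreement on different metric scales makes cross-study comparison unfeasible.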

  15. [Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].

    PubMed

    2012-01-01

    The article covers topical problems of workers' health preservation. Complex research results enabled to evaluate and analyze occupational risks in leading industries of Kazakhstan, for improving scientific and methodologic approaches to medical management for workers subjected to hazardous conditions.

  16. Global/local methods research using a common structural analysis framework

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  17. Terminology and Methodology Related to the Use of Heart Rate Responsivity in Infancy Research

    ERIC Educational Resources Information Center

    Woodcock, James M.

    1971-01-01

    Methodological problems in measuring and interpreting infantile heart rate reactivity in research are discussed. Various ways of describing cardiac activity are listed. Attention is given to the relationship between resting state and heart rate responsivity. (Author/WY)

  18. Employee Turnover: An Empirical and Methodological Assessment.

    ERIC Educational Resources Information Center

    Muchinsky, Paul M.; Tuttle, Mark L.

    1979-01-01

    Reviews research on the prediction of employee turnover. Groups predictor variables into five general categories: attitudinal (job satisfaction), biodata, work-related, personal, and test-score predictors. Consistent relationships between common predictor variables and turnover were found for four categories. Eight methodological problems/issues…

  19. Underestimating the Educability of Down's Syndrome Children: Examination of Methodological Problems in Recent Literature

    ERIC Educational Resources Information Center

    And Others; Rynders, John E.

    1978-01-01

    For many years, the educational capabilities of Down's syndrome persons have been underestimated because a large number of studies purporting to give an accurate picture of Down's syndrome persons' developmental capabilities have had serious methodological flaws. (Author)

  20. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 2: numerical application

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
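For contrast with the FPE approach, a minimal Monte Carlo ensemble for channel flow under an uncertain Manning roughness coefficient. This uses steady uniform flow and invented parameter values, far simpler than the unsteady Saint-Venant setting of the paper, but it shows the many-simulation workload that the single FPE solve replaces:

```python
# Monte Carlo ensemble statistics of Manning velocity under uncertain n.
import random
import statistics

def manning_velocity(n, R=2.0, S=0.001):
    # Manning's equation for mean velocity: V = (1/n) * R^(2/3) * S^(1/2)
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

random.seed(42)                                   # reproducible ensemble
samples = [manning_velocity(random.uniform(0.025, 0.035))
           for _ in range(10_000)]                # n ~ Uniform(0.025, 0.035)

mean = statistics.mean(samples)
var = statistics.variance(samples)
print(f"ensemble mean velocity = {mean:.3f} m/s, variance = {var:.5f}")
```

Each ensemble member is one full model run; with many uncertain parameters the number of runs needed grows quickly, which is the cost argument made for the FPE methodology.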

  1. New Advanced Technologies to Provide Decentralised and Secure Access to Medical Records: Case Studies in Oncology

    PubMed Central

    Quantin, Catherine; Coatrieux, Gouenou; Allaert, François André; Fassa, Maniane; Bourquard, Karima; Boire, Jean-Yves; de Vlieger, Paul; Maigne, Lydia; Breton, Vincent

    2009-01-01

    The main problem for health professionals and patients in accessing information is that this information is very often distributed over many medical records and locations. This problem is particularly acute in cancerology because patients may be treated for many years and undergo a variety of examinations. Recent advances in technology make it feasible to gain access to medical records anywhere and anytime, allowing the physician or the patient to gather information from an “ephemeral electronic patient record”. However, this easy access to data is accompanied by the requirement for improved security (confidentiality, traceability, integrity, ...) and this issue needs to be addressed. In this paper we propose and discuss a decentralised approach based on recent advances in information sharing and protection: Grid technologies and watermarking methodologies. The potential impact of these technologies for oncology is illustrated by the examples of two experimental cases: a cancer surveillance network and a radiotherapy treatment plan. It is expected that the proposed approach will constitute the basis of a future secure “google-like” access to medical records. PMID:19718446

  2. Multiobjective genetic algorithm conjunctive use optimization for production, cost, and energy with dynamic return flow

    NASA Astrophysics Data System (ADS)

    Peralta, Richard C.; Forghani, Ali; Fayad, Hala

    2014-04-01

    Many real water resources optimization problems involve conflicting objectives, for which the main goal is to find a set of optimal solutions on, or near, the Pareto front. ε-constraint and weighting multiobjective optimization techniques have shortcomings, especially as the number of objectives increases. Multiobjective Genetic Algorithms (MGAs) have been previously proposed to overcome these difficulties. Here, an MGA derives a set of optimal solutions for multiobjective multiuser conjunctive use of reservoir, stream, and (un)confined groundwater resources. The proposed methodology is applied to a hydraulically and economically nonlinear system in which all significant flows, including stream-aquifer-reservoir-diversion-return flow interactions, are simulated and optimized simultaneously for multiple periods. Neural networks represent constrained state variables. The objectives that can be optimized simultaneously in the coupled simulation-optimization model are: (1) maximizing water provided from sources, (2) maximizing hydropower production, and (3) minimizing the cost of transporting water from sources to destinations. Results show the efficiency of multiobjective genetic algorithms in generating Pareto optimal sets for complex nonlinear multiobjective optimization problems.
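The Pareto-optimality test at the heart of any MGA can be sketched directly. The objective vectors below are toy values (all treated as maximized, with cost negated), not the study's reservoir data:

```python
# Extract the non-dominated (Pareto) set from a list of objective vectors.
def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in one
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# (water delivered, hydropower, -operating cost): maximize all three
candidates = [
    (10.0, 5.0, -3.0),
    ( 8.0, 7.0, -2.0),
    ( 9.0, 4.0, -4.0),   # dominated by the first vector
    (10.0, 5.0, -3.5),   # dominated by the first vector
]
print(pareto_front(candidates))
```

An MGA applies this dominance test inside its selection step so that the population converges toward, and spreads along, the Pareto front instead of collapsing to a single weighted-sum optimum.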

  3. High-resolution imaging-guided electroencephalography source localization: temporal effect regularization incorporation in LORETA inverse solution

    NASA Astrophysics Data System (ADS)

    Boughariou, Jihene; Zouch, Wassim; Slima, Mohamed Ben; Kammoun, Ines; Hamida, Ahmed Ben

    2015-11-01

    Electroencephalography (EEG) and magnetic resonance imaging (MRI) are noninvasive, widely used, and largely complementary neuroimaging modalities. Fusing them may enhance emerging research fields that target a better exploration of brain activity, and has attracted many investigators seeking a convenient, advanced clinical-aid tool for neurological exploration. Our research addresses the resolution of the EEG inverse problem and investigates an advanced estimation methodology for localizing cerebral activity. We focus on integrating temporal priors into the low-resolution brain electromagnetic tomography (LORETA) formalism in order to solve the EEG inverse problem. The main idea behind the proposed method is the integration of a temporal projection matrix within the LORETA weighting matrix. A hyperparameter governs this temporal integration, and its importance becomes obvious when a regularized, smooth solution is obtained. Our experimental results clearly confirm the benefit of the optimization procedure adopted for the temporal regularization parameter in comparison with the standard LORETA method.
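    The role of the regularization hyperparameter can be seen already in the classical weighted minimum-norm setting that LORETA extends. The following toy sketch (not the authors' implementation; the 2-sensor/3-source lead field and the identity weighting are assumptions chosen for illustration) computes j = argmin ||v − Lj||² + α||j||², i.e. j = (LᵀL + αI)⁻¹Lᵀv, using a small stdlib-only linear solver.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def regularized_inverse(L, v, alpha):
    """Regularized minimum-norm source estimate j = (L^T L + alpha I)^-1 L^T v.
    LORETA replaces the identity with a spatial smoothness weighting."""
    m, n = len(L), len(L[0])
    LtL = [[sum(L[k][i] * L[k][j] for k in range(m)) + (alpha if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Ltv = [sum(L[k][i] * v[k] for k in range(m)) for i in range(n)]
    return solve(LtL, Ltv)

# Underdetermined toy lead field: 2 sensors, 3 sources.
L = [[1.0, 0.0, 1.0],
     [0.0, 1.0, 1.0]]
v = [1.0, 1.0]
j = regularized_inverse(L, v, alpha=1e-6)
```

    With a small α the estimate approaches the minimum-norm solution of the underdetermined system; raising α trades data fit for smoothness, which is exactly the trade-off the paper's hyperparameter controls in the temporal dimension.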

  4. Proceedings of the First Hanford Separation Science Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-05-01

    The First Hanford Separation Science Workshop, sponsored by PNL, had two main objectives: (1) assess the applicability of available separation methods for environmental restoration and for minimization, recovery, and recycle of mixed and radioactive wastes; and (2) identify research needs that must be addressed to create new or improved technologies. The information gathered at this workshop not only applies to Hanford but could be adapted to DOE facilities throughout the nation as well. These proceedings have been divided into three components: Background and Introduction to the Problem gives an overview of the history of the Site and the cleanup mission, including waste management operations, past disposal practices, current operations, and plans for the future. Also included in this section is a discussion of specific problems concerning the chemistry of the Hanford wastes. Separation Methodologies contains the papers given at the workshop by national experts in the field of separation science regarding the state-of-the-art of various methods and their applicability/adaptability to Hanford. Research Needs identifies further research areas developed in working group sessions. Individual papers are indexed separately.

  5. On the Correct Analysis of the Foundations of Theoretical Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2007-04-01

    The problem of truth in science -- the most urgent problem of our time -- is discussed. A correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is the methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e., Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain a set of logical errors. These errors have a global cause: they are a collateral and inevitable result of the inductive way of cognizing Nature, i.e., of moving from the formation of separate concepts to the formation of a system of concepts. Consequently, theoretical physics has entered its greatest crisis, meaning that physics as a science of phenomena is giving way to a science of essence (information). Acknowledgment: The books ``Surprises in Theoretical Physics'' (1979) and ``More Surprises in Theoretical Physics'' (1991) by Sir Rudolf Peierls stimulated my 25-year work.

  6. Robust automatic line scratch detection in films.

    PubMed

    Newson, Alasdair; Almansa, Andrés; Gousseau, Yann; Pérez, Patrick

    2014-03-01

    Line scratch detection in old films is a particularly challenging problem due to the variable spatiotemporal characteristics of this defect. Some of the main problems include sensitivity to noise and texture, and false detections due to thin vertical structures belonging to the scene. We propose a robust and automatic algorithm for frame-by-frame line scratch detection in old films, as well as a temporal algorithm for the filtering of false detections. In the frame-by-frame algorithm, we relax some of the hypotheses used in previous algorithms in order to detect a wider variety of scratches. This step's robustness and lack of external parameters are ensured by the combined use of an a contrario methodology and local statistical estimation. In this manner, over-detection in textured or cluttered areas is greatly reduced. The temporal filtering algorithm eliminates false detections due to thin vertical structures by exploiting the coherence of their motion with that of the underlying scene. Experiments demonstrate the ability of the resulting detection procedure to deal with difficult situations, in particular in the presence of noise, texture, and slanted or partial scratches. Comparisons show significant advantages over previous work.
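    The a contrario principle mentioned in this abstract can be stated compactly: a column of pixel "votes" is declared a scratch candidate only when its vote count is too unlikely under a background noise model. A common formalization (a generic sketch, not the authors' exact detector; the binomial noise model and the numbers below are assumptions) bounds the expected Number of False Alarms (NFA) and detects when NFA < 1.

```python
from math import comb

def nfa(n_tests, k, n, p):
    """Expected number of chance events at least this extreme over n_tests
    tests: n_tests times the binomial tail P(X >= k), X ~ Binomial(n, p)."""
    tail = sum(comb(n, j) * p ** j * (1 - p) ** (n - j) for j in range(k, n + 1))
    return n_tests * tail

# 1000 candidate columns, 20 rows each, noise vote probability 0.5:
strong = nfa(1000, 18, 20, 0.5)  # 18/20 aligned votes: rare under noise
weak = nfa(1000, 12, 20, 0.5)    # 12/20 votes: expected by chance
```

    The appeal of this test, and plausibly the reason the paper needs no external sensitivity parameters, is that the single threshold NFA < 1 automatically adapts to the number of tests performed.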

  7. Ant system: optimization by a colony of cooperating agents.

    PubMed

    Dorigo, M; Maniezzo, V; Colorni, A

    1996-01-01

    An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call ant system (AS). We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed methodology to the classical traveling salesman problem (TSP), and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing using TSP. To demonstrate the robustness of the approach, we show how the ant system (AS) can be applied to other optimization problems like the asymmetric traveling salesman, the quadratic assignment and the job-shop scheduling. Finally, we discuss the salient characteristics of the AS: global data-structure revision, distributed communication, and probabilistic transitions.
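    Since the abstract spells out the ingredients of AS (pheromone trails, a greedy "visibility" heuristic, positive feedback through trail reinforcement and evaporation), a compact sketch for the symmetric TSP can make them concrete. The parameter values below are illustrative defaults, not the ones reported in the paper, and the tiny four-city instance is an assumption for demonstration.

```python
import math
import random

def ant_system(coords, n_ants=8, n_iters=50, alpha=1.0, beta=2.0, rho=0.5,
               Q=1.0, seed=0):
    """Basic Ant System for the symmetric TSP: ants build tours with a
    pheromone^alpha * visibility^beta transition rule; trails evaporate by
    rho and are reinforced in proportion to tour quality (Q / length)."""
    rng = random.Random(seed)
    n = len(coords)
    d = [[math.dist(coords[i], coords[j]) for j in range(n)] for i in range(n)]
    tau = [[1.0] * n for _ in range(n)]  # pheromone trails
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ant in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                w = [tau[i][j] ** alpha * (1.0 / d[i][j]) ** beta for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            length = sum(d[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        for i in range(n):          # evaporation (forgetting)
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for tour, length in tours:  # positive feedback (reinforcement)
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += Q / length
                tau[j][i] += Q / length
    return best_tour, best_len

# Unit square: the optimal tour is the perimeter, length 4.
coords = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour, length = ant_system(coords)
```

    Evaporation keeps early random fluctuations from locking in a bad tour (the premature-convergence concern the abstract raises), while the deposit step implements the positive feedback that concentrates the search on short edges.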

  8. Impact of 5 years of lean six sigma in a University Medical Center.

    PubMed

    Niemeijer, Gerard C; Trip, Albert; de Jong, Laura J; Wendt, Klaus W; Does, Ronald J M M

    2012-01-01

    Lean Six Sigma (LSS) is an originally industry-based methodology for cost reduction and quality improvement. In more recent years, LSS was introduced in health care as well. This article describes the experiences of the University Medical Center Groningen, the second largest hospital in the Netherlands, with LSS. It was introduced in 2007 to create the financial means to develop innovations. In this article, we describe how LSS was introduced and how it developed in the following years. We zoom in on the traumatology department, where all main processes have been analyzed and improved. An evaluation after 5 years shows that LSS indeed helped to reduce costs and improve quality. Moreover, it aided the transition of the organization from purely problem oriented to more process oriented, which in turn is helpful in eliminating waste and finding solutions for difficult problems. A major benefit of the program is that the organization's own employees are trained to become project leaders for improvement. Several people from the primary process were thus stimulated and equipped to become role models for continuous improvement.

  9. Investigation of the reasons for not using helmet among motorcyclists in Kerman, Iran.

    PubMed

    Maghsoudi, Aliasghar; Boostani, Dariush; Rafeiee, Manoochehr

    2018-03-01

    This study was carried out to investigate motorcyclists' reasoning and interpretations for not using a helmet, utilizing the qualitative methodology of 'grounded theory'. The field of the study was Kerman, a cultural-historical city in the south-east of Iran. Participants were 21 young male motorcyclists. Two sampling strategies were used: maximum variation and snowball sampling. To collect data, in-depth, open-ended interviews were conducted. Data analysis yielded seven categories: fatalism; a barrier to social relationships; peer group pressure and negative labelling; messing up the appearance; disturbance in hearing and vision; barrier to normal breathing; and heaviness and superfluity of the helmet. Based on the findings of the current study, it can be concluded that socio-cultural contexts, motorcyclists' worldview and, in part, helmet-related problems are among the main factors affecting helmet use while motorcycling. Therefore, studies, policy-making, and intervention programmes to control injury and to promote safety among motorcyclists should focus on socio-cultural barriers to helmet use in general and on changing motorcyclists' fatalistic standpoints in particular. Helmet-related problems should be considered, too.

  10. Enhancing Knowledge Sharing Management Using BIM Technology in Construction

    PubMed Central

    Ho, Shih-Ping; Tserng, Hui-Ping

    2013-01-01

    Construction knowledge can be communicated and reused among project managers and jobsite engineers to alleviate problems on a construction jobsite and reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for the sharing of construction knowledge by using Building Information Modeling (BIM) technology. The main characteristics of BIM include illustrating 3D CAD-based presentations and keeping information in a digital format and facilitation of easy updating and transfer of information in the BIM environment. Using the BIM technology, project managers and engineers can gain knowledge related to BIM and obtain feedback provided by jobsite engineers for future reference. This study addresses the application of knowledge sharing management using BIM technology and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to demonstrate the effectiveness of sharing knowledge in the BIM environment. The results demonstrate that the BIMKSM system can be used as a visual BIM-based knowledge sharing management platform by utilizing the BIM technology. PMID:24723790

  11. Enhancing knowledge sharing management using BIM technology in construction.

    PubMed

    Ho, Shih-Ping; Tserng, Hui-Ping; Jan, Shu-Hui

    2013-01-01

    Construction knowledge can be communicated and reused among project managers and jobsite engineers to alleviate problems on a construction jobsite and reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for the sharing of construction knowledge by using Building Information Modeling (BIM) technology. The main characteristics of BIM include illustrating 3D CAD-based presentations and keeping information in a digital format and facilitation of easy updating and transfer of information in the BIM environment. Using the BIM technology, project managers and engineers can gain knowledge related to BIM and obtain feedback provided by jobsite engineers for future reference. This study addresses the application of knowledge sharing management using BIM technology and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to demonstrate the effectiveness of sharing knowledge in the BIM environment. The results demonstrate that the BIMKSM system can be used as a visual BIM-based knowledge sharing management platform by utilizing the BIM technology.

  12. Quantitative and qualitative approaches in educational research — problems and examples of controlled understanding through interpretive methods

    NASA Astrophysics Data System (ADS)

    Neumann, Karl

    1987-06-01

    In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, research generally has to start out from complex reciprocal social interactions rather than from unambiguous causal connections. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages in and the necessity for using qualitative research tools.

  13. Variational Bayesian Learning for Wavelet Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Roussos, E.; Roberts, S.; Daubechies, I.

    2005-11-01

    In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuro-scientific goal of extracting relevant "maps" from the data. This can be stated as a `blind' source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.
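    The sparsity assumption that motivates working in the wavelet domain can be illustrated independently of the full variational Bayesian machinery: under Gaussian noise, a Laplacian (sparse) prior on transform coefficients yields soft thresholding as the MAP estimate. The one-level Haar transform, the signal, and the threshold below are assumptions chosen for illustration; the paper's actual model is a hierarchical ICA, not this point estimator.

```python
def haar_step(x):
    """One level of the orthonormal Haar transform (len(x) must be even)."""
    s = 2 ** -0.5
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def soft_threshold(coeffs, lam):
    """MAP estimate under a Laplacian prior: shrink coefficients toward
    zero by lam, setting small ones exactly to zero (sparsity)."""
    return [(abs(c) - lam) * (1 if c > 0 else -1) if abs(c) > lam else 0.0
            for c in coeffs]

# A piecewise-constant signal with a small noise blip: the Haar detail
# coefficients are sparse, and thresholding suppresses the blip.
signal = [1.0, 1.0, 1.0, 1.1, 5.0, 5.0, 5.0, 5.0]
approx, detail = haar_step(signal)
denoised_detail = soft_threshold(detail, lam=0.1)
```

    The hierarchical priors and variational inference in the paper can be read as a principled, automatic way of choosing this shrinkage per coefficient instead of fixing a single global threshold.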

  14. IESIP - AN IMPROVED EXPLORATORY SEARCH TECHNIQUE FOR PURE INTEGER LINEAR PROGRAMMING PROBLEMS

    NASA Technical Reports Server (NTRS)

    Fogle, F. R.

    1994-01-01

    IESIP, an Improved Exploratory Search Technique for Pure Integer Linear Programming Problems, addresses the problem of optimizing an objective function of one or more variables subject to a set of confining functions or constraints by a method called discrete optimization or integer programming. Integer programming is based on a specific form of the general linear programming problem in which all variables in the objective function and all variables in the constraints are integers. While more difficult, integer programming is required for accuracy when modeling systems with small numbers of components such as the distribution of goods, machine scheduling, and production scheduling. IESIP establishes a new methodology for solving pure integer programming problems by utilizing a modified version of the univariate exploratory move developed by Robert Hooke and T.A. Jeeves. IESIP also takes some of its technique from the greedy procedure and the idea of unit neighborhoods. A rounding scheme uses the continuous solution found by traditional methods (simplex or other suitable technique) and creates a feasible integer starting point. The Hooke and Jeeves exploratory search is modified to accommodate integers and constraints and is then employed to determine an optimal integer solution from the feasible starting solution. The user-friendly IESIP allows for rapid solution of problems up to 10 variables in size (limited by DOS allocation). Sample problems compare IESIP solutions with the traditional branch-and-bound approach. IESIP is written in Borland's TURBO Pascal for IBM PC series computers and compatibles running DOS. Source code and an executable are provided. The main memory requirement for execution is 25K. This program is available on a 5.25 inch 360K MS DOS format diskette. IESIP was developed in 1990. IBM is a trademark of International Business Machines. TURBO Pascal is registered by Borland International.
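    The exploratory move at the heart of this approach can be sketched in a few lines. This is a minimal re-expression of the general Hooke-Jeeves idea on the integer lattice, not the TURBO Pascal program described above; the toy objective, constraints, and starting point are assumptions for illustration.

```python
def integer_pattern_search(f, x0, feasible, max_iter=1000):
    """Exploratory search on the integer lattice: from the current point,
    probe +/-1 unit moves along each coordinate, accept improving feasible
    neighbours, and stop when no unit move improves the objective."""
    x = list(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for step in (1, -1):
                y = x[:]
                y[i] += step
                if feasible(y) and f(y) < f(x):
                    x, improved = y, True
                    break
        if not improved:
            return x
    return x

# Toy pure-integer program: maximize 3a + 2b subject to
# 0 <= a <= 3, 0 <= b <= 2, a + b <= 5 (minimized as the negative).
f = lambda x: -(3 * x[0] + 2 * x[1])
feasible = lambda x: 0 <= x[0] <= 3 and 0 <= x[1] <= 2 and x[0] + x[1] <= 5
best = integer_pattern_search(f, [0, 0], feasible)
```

    Note that such a unit-neighbourhood search can stall at a lattice-local optimum on less friendly constraint sets, which is presumably why IESIP pairs it with the rounding scheme that supplies a good feasible starting point.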

  15. Does the patient‐held record improve continuity and related outcomes in cancer care: a systematic review

    PubMed Central

    Gysels, Marjolein; Richardson, Alison; Higginson, Irene J.

    2006-01-01

    Abstract Objectives  To assess the effectiveness of the patient‐held record (PHR) in cancer care. Background  Patients with cancer may receive care from different services resulting in gaps. A PHR could provide continuity and patient involvement in care. Search strategy  Relevant literature was identified through five electronic databases (Medline, Embase, Cinahl, CCTR and CDSR) and hand searches. Inclusion criteria  Patient‐held records in cancer care with the purpose of improving communication and information exchange between and within different levels of care and to promote continuity of care and patients’ involvement in their own care. Data extraction and synthesis  Data extraction recorded characteristics of intervention, type of study and factors that contributed to methodological quality of individual studies. Data were then contrasted by setting, objectives, population, study design, outcome measures and changes in outcome, including knowledge, satisfaction, anxiety and depression. Methodological quality of randomized control trials and non‐experimental studies was assessed with separate standard grading scales. Main results and conclusions  Seven randomized control trials and six non‐experimental studies were identified. Evaluations of the PHR have reached equivocal findings: randomized trials found an absence of effect, while non‐experimental evaluations shed light on the conditions for its successful use. Most patients welcomed introduction of a PHR. Main problems related to its suitability for different patient groups and the lack of agreement between patients and health professionals regarding its function. Further research is required to determine the conditions under which the PHR can realize its potential as a tool to promote continuity of care and patient participation. PMID:17324196

  16. The link between descriptors 8 and 9 of the Marine Strategy Framework Directive: lessons learnt in Spain.

    PubMed

    Gago, J; Viñas, L; Besada, V; Bellas, J

    2014-12-01

    The aim of this note is to discuss the relevance of the interaction/integration of monitoring of contaminants for the protection of the marine environment and for human health safety (descriptors 8 and 9, respectively) within the Marine Strategy Framework Directive (MSFD). The identification of possible relations between contaminant levels in sediments and tissues of fish and other seafood, as well as the association of those levels to pollution sources, are major challenges for marine researchers. The Spanish initial assessment in the North-East Atlantic marine region was used as an example to show some gaps and loopholes when dealing with the relationship between descriptors 8 and 9. The main problem to deal with is that monitoring programmes intended for the assessment of marine environmental quality and for human health safety usually apply different approaches and methodologies, and even different tissues are analysed in some species (mainly fish). It is therefore recommended to make a profound revision of current sampling strategies, procedures and methodologies, including the selection of target species and tissues and to improve the traceability of samples of fish and other seafood for human consumption. On the other hand, despite the scope of descriptor 9 which is limited to commercially relevant species, this fact should not be an obstacle in the application of the 'ecosystem approach' within the MSFD. In order to appropriately solve these shortcomings, an information exchange system between authorities dealing with descriptors 8 and 9 should be strongly encouraged for the next steps of the MSFD's implementation.

  17. Definition of realistic disturbances as a crucial step during the assessment of resilience of natural wastewater treatment systems.

    PubMed

    Cuppens, A; Smets, I; Wyseure, G

    2012-01-01

    Natural wastewater treatment systems (WWTSs) for urban areas in developing countries are subjected to large fluctuations in their inflow. This situation can result in a decreased treatment performance. The main aims of this paper are to introduce resilience as a performance indicator for natural WWTSs and to propose a methodology for the identification and generation of realistic disturbances of WWTSs. Firstly, a definition of resilience is formulated for natural WWTSs together with a short discussion of its most relevant properties. An important aspect during the evaluation process of resilience is the selection of appropriate disturbances. Disturbances of the WWTS are caused by fluctuations in water quantity and quality characteristics of the inflow. An approach to defining appropriate disturbances is presented by means of water quantity and quality data collected for the urban wastewater system of Coronel Oviedo (Paraguay). The main problem under consideration is the potential negative impact of stormwater inflow and infiltration in the sanitary sewer system on the treatment performance of anaerobic waste stabilisation ponds.

  18. Urban water infrastructure asset management - a structured approach in four water utilities.

    PubMed

    Cardoso, M A; Silva, M Santos; Coelho, S T; Almeida, M C; Covas, D I C

    2012-01-01

    Water services are a strategic sector of large social and economic relevance. It is therefore essential that they are managed rationally and efficiently. Advanced water supply and wastewater infrastructure asset management (IAM) is key in achieving adequate levels of service in the future, particularly with regard to reliable and high quality drinking water supply, prevention of urban flooding, efficient use of natural resources and prevention of pollution. This paper presents a methodology for supporting the development of urban water IAM, developed during the AWARE-P project as well as an appraisal of its implementation in four water utilities. Both water supply and wastewater systems were considered. Due to the different contexts and features of the utilities, the main concerns vary from case to case; some problems essentially are related to performance, others to risk. Cost is a common deciding factor. The paper describes the procedure applied, focusing on the diversity of drivers, constraints, benefits and outcomes. It also points out the main challenges and the results obtained through the implementation of a structured procedure for supporting urban water IAM.

  19. Defining the Optimal Region of Interest for Hyperemia Grading in the Bulbar Conjunctiva

    PubMed Central

    Sánchez Brea, María Luisa; Mosquera González, Antonio; Evans, Katharine; Pena-Verdeal, Hugo

    2016-01-01

    Conjunctival hyperemia or conjunctival redness is a symptom that can be associated with a broad group of ocular diseases. Its levels of severity are represented by standard photographic charts that are visually compared with the patient's eye. This way, the hyperemia diagnosis becomes a nonrepeatable task that depends on the experience of the grader. To solve this problem, we have proposed a computer-aided methodology that comprises three main stages: the segmentation of the conjunctiva, the extraction of features in this region based on colour and the presence of blood vessels, and, finally, the transformation of these features into grading scale values by means of regression techniques. However, the conjunctival segmentation can be slightly inaccurate mainly due to illumination issues. In this work, we analyse the relevance of different features with respect to their location within the conjunctiva in order to delimit a reliable region of interest for the grading. The results show that the automatic procedure behaves like an expert using only a limited region of interest within the conjunctiva. PMID:28096890

  20. [Determination of vanadium concentration in foods produced on the Eastern Coast of Lake Maracaibo].

    PubMed

    Tudares, C M; Villalobos, H D

    1998-04-01

    On the northeastern coast of Lake Maracaibo, a high incidence of congenital malformations of the central nervous system (neural tube defects) was reported some years ago. This epidemiological problem is also present in other countries (Ireland and New Zealand) and has been associated with oil activities. Indeed, some experimental work reports cellular toxic effects of vanadium compounds, mainly in the central nervous system of mammals. The main goal of this work was to measure the vanadium content of foods produced on the northeastern coast of Lake Maracaibo. Lagunillas, Valmore Rodriguez, and Baralt were the districts selected for the study. Digestion of the samples was achieved by the methodology reported by Myron et al., with graphite furnace atomic absorption. The amounts of vanadium in the different foods analyzed were higher than the control values reported in the literature. At this moment there is no definitive proof that vanadium compounds are the etiological agents of the neural tube defects, but these compounds are present in foods produced on the northeastern coast of Lake Maracaibo.

  1. Finite Element Method-Based Kinematics and Closed-Loop Control of Soft, Continuum Manipulators.

    PubMed

    Bieze, Thor Morales; Largilliere, Frederick; Kruszewski, Alexandre; Zhang, Zhongkai; Merzouki, Rochdi; Duriez, Christian

    2018-06-01

    This article presents a modeling methodology and experimental validation for soft manipulators to obtain forward kinematic model (FKM) and inverse kinematic model (IKM) under quasi-static conditions (in the literature, these manipulators are usually classified as continuum robots. However, their main characteristic of interest in this article is that they create motion by deformation, as opposed to the classical use of articulations). It offers a way to obtain the kinematic characteristics of this type of soft robots that is suitable for offline path planning and position control. The modeling methodology presented relies on continuum mechanics, which does not provide analytic solutions in the general case. Our approach proposes a real-time numerical integration strategy based on finite element method with a numerical optimization based on Lagrange multipliers to obtain FKM and IKM. To reduce the dimension of the problem, at each step, a projection of the model to the constraint space (gathering actuators, sensors, and end-effector) is performed to obtain the smallest number possible of mathematical equations to be solved. This methodology is applied to obtain the kinematics of two different manipulators with complex structural geometry. An experimental comparison is also performed on one of the robots, between two other geometric approaches and the approach that is showcased in this article. A closed-loop controller based on a state estimator is proposed. The controller is experimentally validated and its robustness is evaluated using the Lyapunov stability method.

  2. The effects of node exclusion on the centrality measures in graph models of interacting economic agents

    NASA Astrophysics Data System (ADS)

    Caetano, Marco Antonio Leonel; Yoneyama, Takashi

    2015-07-01

    This work concerns the study of the effects felt by a network as a whole when a specific node is perturbed. Many real world systems can be described by network models in which the interactions of the various agents can be represented as an edge of a graph. With a graph model in hand, it is possible to evaluate the effect of deleting some of its edges on the architecture and values of nodes of the network. Eventually a node may end up isolated from the rest of the network, and an interesting problem is to have a quantitative measure of the impact of such an event. For instance, in the field of finance, network models are very popular and the proposed methodology allows one to carry out "what if" tests in terms of weakening the links between the economic agents, represented as nodes. The two main concepts employed in the proposed methodology are (i) the vibrational IC-Information Centrality, which can provide a measure of the relative importance of a particular node in a network, and (ii) autocatalytic networks, which can indicate the evolutionary trends of the network. Although these concepts were originally proposed in the context of other fields of knowledge, they were also found to be useful in analyzing financial networks. In order to illustrate the applicability of the proposed methodology, a case study using actual data comprising the stock market indices of 12 countries is presented.
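    The "what if" tests described here, deleting a node and measuring the impact on the rest of the network, can be emulated with any distance-based centrality. The sketch below uses harmonic centrality as a stand-in for the vibrational Information Centrality used by the authors (an assumption for illustration, since IC requires the full vibrational model), on a toy hub-and-spoke "market" graph.

```python
from collections import deque

def harmonic(adj, s):
    """Harmonic centrality of s: sum of 1/d(s, v) over all other nodes v;
    unreachable nodes contribute 0, so disconnection is handled gracefully."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return sum(1.0 / d for n, d in dist.items() if n != s)

def remove_node(adj, x):
    """Delete node x and all its incident edges."""
    return {u: [v for v in nbrs if v != x] for u, nbrs in adj.items() if u != x}

# Hub-and-spoke graph: node 0 is the hub; nodes 1 and 2 also share an edge.
adj = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0], 4: [0]}
before = {n: harmonic(adj, n) for n in adj}
after = {n: harmonic(remove_node(adj, 0), n) for n in adj if n != 0}
impact = {n: before[n] - after[n] for n in after}  # loss from deleting the hub
```

    As expected, the spoke-only nodes (3 and 4) lose more centrality than node 1, which keeps a direct link to node 2; in the financial setting this differential loss is precisely the quantitative impact measure the paper is after.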

  3. Drug-targeting methodologies with applications: A review

    PubMed Central

    Kleinstreuer, Clement; Feng, Yu; Childress, Emily

    2014-01-01

    Targeted drug delivery to solid tumors is a very active research area, focusing mainly on improved drug formulation and associated best delivery methods/devices. Drug-targeting has the potential to greatly improve drug-delivery efficacy, reduce side effects, and lower the treatment costs. However, the vast majority of drug-targeting studies assume that the drug-particles are already at the target site or at least in its direct vicinity. In this review, drug-delivery methodologies, drug types and drug-delivery devices are discussed with examples in two major application areas: (1) inhaled drug-aerosol delivery into human lung-airways; and (2) intravascular drug-delivery for solid tumor targeting. The major problem addressed is how to deliver efficiently the drug-particles from the entry/infusion point to the target site. So far, most experimental results are based on animal studies. Concerning pulmonary drug delivery, the focus is on the pros and cons of three inhaler types, i.e., pressurized metered dose inhaler, dry powder inhaler and nebulizer, in addition to drug-aerosol formulations. Computational fluid-particle dynamics techniques and the underlying methodology for a smart inhaler system are discussed as well. Concerning intravascular drug-delivery for solid tumor targeting, passive and active targeting are reviewed as well as direct drug-targeting, using optimal delivery of radioactive microspheres to liver tumors as an example. The review concludes with suggestions for future work, considering both pulmonary drug targeting and direct drug delivery to solid tumors in the vascular system. PMID:25516850

  4. Managing search complexity in linguistic geometry.

    PubMed

    Stilman, B

    1997-01-01

    This paper is a new step in the development of linguistic geometry. This formal theory is intended to discover and generalize the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and apply them to different systems. In this paper, we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing linguistic geometry tools, the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are shown in the paper on two pilot examples of the solution of complex optimization problems. The first example is a problem of strategic planning for air combat, in which concurrent actions of four vehicles are simulated as serial interleaving moves. The second example is a problem of strategic planning for the space combat of eight autonomous vehicles (with interleaving moves) that requires generation of a search tree of depth 25 with a branching factor of 30. This is beyond the capabilities of modern and conceivable future computers (employing conventional approaches). In both examples the linguistic geometry tools produced deep and highly selective searches in comparison with conventional search algorithms. For the first example, a sketch of the proof of optimality of the solution is presented.

  5. Seeing the NICE side of cost-effectiveness analysis: a qualitative investigation of the use of CEA in NICE technology appraisals.

    PubMed

    Bryan, Stirling; Williams, Iestyn; McIver, Shirley

    2007-02-01

    Resource scarcity is the raison d'être for the discipline of economics. Thus, the primary purpose of economic analysis is to help decision-makers when addressing problems arising due to the scarcity problem. The research reported here was concerned with how cost-effectiveness information is used by the National Institute for Health & Clinical Excellence (NICE) in national technology coverage decisions in the UK, and how its impact might be increased. The research followed a qualitative case study methodology with semi-structured interviews, supported by observation and analysis of secondary sources. Our research highlights that the technology appraisal function of NICE represents an important progression for the UK health economics community: new cost-effectiveness work is commissioned for each technology and that work directly informs national health policy. However, accountability in policy decisions necessitates that the information upon which decisions are based (including cost-effectiveness analysis, CEA) is accessible. This was found to be a serious problem and represents one of the main ongoing challenges. Other issues highlighted include perceived weaknesses in analysis methods and the poor alignment between the health maximisation objectives assumed in economic analyses and the range of other objectives facing decision-makers in reality. Copyright (c) 2006 John Wiley & Sons, Ltd.

  6. Bayesian design of decision rules for failure detection

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1984-01-01

    The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
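
The flavor of such a suboptimal rule can be sketched as follows (an illustrative threshold rule, not the paper's design; the residual model and all parameter values are assumptions). The posterior probability of failure is updated recursively after each residual sample and compared against a fixed threshold, whereas the exact Bayes rule would require an intractable dynamic-programming solution:

```python
import math

def gaussian_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def detect_failure(residuals, prior=0.1, mu_fail=2.0, threshold=0.95):
    """Return the first index at which P(failure | data) exceeds the
    threshold, or None if it never does.  Healthy residuals are modeled
    as zero-mean Gaussian; a failure shifts the mean to mu_fail."""
    p = prior
    for k, r in enumerate(residuals):
        like_fail = gaussian_pdf(r, mu_fail)
        like_ok = gaussian_pdf(r, 0.0)
        p = p * like_fail / (p * like_fail + (1 - p) * like_ok)  # Bayes update
        if p > threshold:
            return k
    return None

print(detect_failure([0.2, 2.1, 1.9, 2.2, 2.0, 2.3]))  # → 4
print(detect_failure([0.0, 0.1, -0.1]))                # → None
```

The fixed threshold is what makes the rule suboptimal: the optimal Bayes sequential rule would let the alarm boundary vary with time and cost structure.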

  7. Development of Contemporary Problem-Based Learning Projects in Particle Technology

    ERIC Educational Resources Information Center

    Harris, Andrew T.

    2009-01-01

    The University of Sydney has offered an undergraduate course in particle technology using a contemporary problem based learning (PBL) methodology since 2005. Student learning is developed through the solution of complex, open-ended problems drawn from modern chemical engineering practice. Two examples are presented; i) zero emission electricity…

  8. The Study of Socio-Biospheric Problems.

    ERIC Educational Resources Information Center

    Scott, Andrew M.

    Concepts, tools, and a methodology are needed which will permit the analysis of emergent socio-biospheric problems and facilitate their effective management. Many contemporary problems may be characterized as socio-biospheric; for example, pollution of the seas, acid rain, the growth of cities, and an atmosphere loaded with carcinogens. However,…

  9. Atwood's Machine as a Tool to Introduce Variable Mass Systems

    ERIC Educational Resources Information Center

    de Sousa, Celia A.

    2012-01-01

    This article discusses an instructional strategy which explores eventual similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the…

  10. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a large number of accidents, most of them due to roof falls. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of that method is its neglect of subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this shortcoming and improve the method, in this paper a novel methodology is presented to assess RFS using a fuzzy approach. The fuzzy approach provides an effective tool for handling subjective uncertainties. Furthermore, the fuzzy analytical hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during development of the method. The methodology is applied to identify the susceptibility to roof fall occurrence in the main panel of the Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.
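
A minimal sketch of how a fuzzy approach softens the sharp class boundaries criticized above (the linguistic classes, factor names, and weights below are invented for illustration and are not the paper's fuzzy-AHP model):

```python
def triangular(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic classes for normalized factor scores in [0, 1].
CLASSES = {
    "low":    (0.0, 0.25, 0.5),
    "medium": (0.25, 0.5, 0.75),
    "high":   (0.5, 0.75, 1.0),
}

def fuzzy_risk(scores, weights):
    """Weight-averaged membership of the overall condition in each class."""
    total = sum(weights.values())
    agg = {cls: 0.0 for cls in CLASSES}
    for factor, x in scores.items():
        w = weights[factor] / total
        for cls, (a, b, c) in CLASSES.items():
            agg[cls] += w * triangular(x, a, b, c)
    return agg

# Invented factor scores and AHP-style weights.
scores = {"roof_quality": 0.6, "support_density": 0.3}
weights = {"roof_quality": 2.0, "support_density": 1.0}
print(fuzzy_risk(scores, weights))
```

A score near a class boundary now belongs partly to both neighboring classes instead of being forced into one, which is exactly the sharp-boundary problem the paper's fuzzy reformulation removes.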

  11. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  12. Towards a Methodology for Identifying Program Constraints During Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

    1997-01-01

    Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

  13. Collection Evaluation in Research Libraries: The Search for Quality, Consistency, and System in Collection Development.

    ERIC Educational Resources Information Center

    Mosher, Paul H.

    1979-01-01

    Reviews the history, literature, and methodology of collection evaluation or assessment in American research libraries; discusses current problems, tools, and methodology of evaluation; and describes an ongoing collection evaluation program at the Stanford University Libraries. (Author/MBR)

  14. Quantitative local analysis of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Topcu, Ufuk

    This thesis investigates quantitative methods for local robustness and performance analysis of nonlinear dynamical systems with polynomial vector fields. We propose measures to quantify systems' robustness against uncertainties in initial conditions (regions-of-attraction) and external disturbances (local reachability/gain analysis). S-procedure and sum-of-squares relaxations are used to translate Lyapunov-type characterizations to sum-of-squares optimization problems. These problems are typically bilinear/nonconvex (due to local analysis rather than global) and their size grows rapidly with state/uncertainty space dimension. Our approach is based on exploiting system-theoretic interpretations of these optimization problems to reduce their complexity. We propose a methodology incorporating simulation data in formal proof construction, enabling a more reliable and efficient search for robustness and performance certificates compared to the direct use of general-purpose solvers. This technique is adapted both to region-of-attraction and reachability analysis. We extend the analysis to uncertain systems by taking an intentionally simplistic and potentially conservative route, namely employing parameter-independent rather than parameter-dependent certificates. The conservatism is simply reduced by a branch-and-bound type refinement procedure. The main thrust of these methods is their suitability for parallel computing, achieved by decomposing otherwise challenging problems into relatively tractable smaller ones. We demonstrate the proposed methods on several small/medium-size examples in each chapter and apply each method to a benchmark example with an uncertain short-period pitch-axis model of an aircraft. Additional practical issues leading to a more rigorous basis for the proposed methodology as well as promising further research topics are also addressed.
We show that stability of linearized dynamics is not only necessary but also sufficient for the feasibility of the formulations in region-of-attraction analysis. Furthermore, we generalize an upper bound refinement procedure in local reachability/gain analysis which effectively generates non-polynomial certificates from polynomial ones. Finally, broader applicability of optimization-based tools stringently depends on the availability of scalable/hierarchical algorithms. As an initial step in this direction, we propose a local small-gain theorem and apply it to stability region analysis in the presence of unmodeled dynamics.
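
The region-of-attraction certificates referred to above are commonly posed as a sum-of-squares program of the following standard form (a textbook formulation; the thesis's exact notation may differ). For polynomial dynamics $\dot{x} = f(x)$ with $f(0) = 0$, the sublevel set $\{x : V(x) \le \gamma\}$ is certified to lie in the region of attraction if

```latex
\begin{aligned}
& V(0) = 0, \qquad V(x) - \epsilon \lVert x \rVert^2 \in \Sigma[x], \qquad s(x) \in \Sigma[x], \\
& -\left( \frac{\partial V}{\partial x} f(x) + \epsilon \lVert x \rVert^2 \right)
  + s(x)\left( V(x) - \gamma \right) \in \Sigma[x],
\end{aligned}
```

where $\Sigma[x]$ is the cone of sum-of-squares polynomials and $\epsilon > 0$. On $\{V \le \gamma\}$ the multiplier term $s(x)(V(x) - \gamma)$ is nonpositive, so the last constraint forces $\dot{V} \le -\epsilon \lVert x \rVert^2$ there; maximizing $\gamma$ over both $V$ and $s$ is the bilinear, nonconvex problem the abstract mentions.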

  15. Soft System Methodology as a Tool to Understand Issues of Governmental Affordable Housing Programme of India: A Case Study Approach

    NASA Astrophysics Data System (ADS)

    Ghosh, Sukanya; Roy, Souvanic; Sanyal, Manas Kumar

    2016-09-01

    With the help of a case study, the article explores current practices in the implementation of a governmental affordable housing programme for the urban poor in a slum of India. This work shows that the issues associated with the problems of governmental affordable housing programmes have to be addressed with a suitable methodology, as the complexities involve not only quantitative data but qualitative data as well. Hard System Methodologies (HSM), which are conventionally applied to such issues, deal with real and known problems which can be directly solved. Since most of the issues of the affordable housing programme found in the case study are subjective and complex in nature, Soft System Methodology (SSM) has been tried for better representation of subjective points of view. The article explores the drawing of a Rich Picture as an SSM approach for better understanding and analysing the complex issues and constraints of the affordable housing programme, so that further exploration of the issues is possible.

  16. Expert System Development Methodology (ESDM)

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.

    1990-01-01

    The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.

  17. Tunnel and Station Cost Methodology : Mined Tunnels

    DOT National Transportation Integrated Search

    1983-01-01

    The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...

  18. Tunnel and Station Cost Methodology Volume II: Stations

    DOT National Transportation Integrated Search

    1981-01-01

    The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...

  19. Empirical and pragmatic adequacy of grounded theory: Advancing nurse empowerment theory for nurses' practice.

    PubMed

    Udod, Sonia A; Racine, Louise

    2017-12-01

    Drawing on the findings of a grounded theory study aimed at exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine and detailed analysis of the process underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of the grounded theory of Corbin and Strauss as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful for addressing real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice.
The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that are meaningful to nurses. © 2017 John Wiley & Sons Ltd.

  20. The experience of Greek-Cypriot individuals living with mental illness: preliminary results of a phenomenological study.

    PubMed

    Kaite, Charis P; Karanikola, Maria N; Vouzavali, Foteini J D; Koutroubas, Anna; Merkouris, Anastasios; Papathanassoglou, Elizabeth D E

    2016-10-06

    Research evidence shows that healthcare professionals do not fully comprehend the difficulty involved in the problems faced by people living with severe mental illness (SMI). As a result, mental health service consumers do not show confidence in the healthcare system and healthcare professionals, a problem related to the phenomenon of adherence to therapy. Moreover, the issue of unmet needs in treating individuals living with SMI is negatively related to their quality of life. A qualitative approach based on van Manen's phenomenological methodology was employed through a purposive sampling of ten people living with SMI. The aim was to explore their perceptions and interpretations regarding: a) their illness, b) their self-image throughout the illness, c) the social implications following their illness, and d) the quality of the therapeutic relationship with mental health nurses. Participants were recruited from a community mental health service in a Greek-Cypriot urban city. Data were collected through personal, semi-structured interviews. Several main themes were identified through the narratives of all ten participants: a) The meaning of mental illness, b) The different phases of the illness in time, c) The perception of the self during the illness, d) Perceptions about the effectiveness of pharmacotherapy, e) Social and personal consequences for participants following the diagnosis of mental illness, f) Participants' perceptions regarding mental health professionals and services, and g) The therapeutic effect of the research interview on the participants. The present study provides data for the enhancement of the empathic understanding of healthcare professionals regarding the concerns and particular needs of individuals living with SMI, as well as the formation of targeted psychosocial interventions based on these needs.
Overall, the present data illuminate the necessity of reconstructing the mental healthcare provided in Cyprus into a more recovery-oriented approach in order to address personal identity and self-determination issues and the way these are related to the management of pharmacotherapy. Qualitative studies aiming to further explore issues of self-identity during ill health and its association with adherence to therapy, resilience and self-determination are also proposed.

  1. Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics

    DTIC Science & Technology

    1988-12-01

    V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES. 1. COMPONENT BUILD-UP IN DRAG. ...dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS. Prediction methods can be classified into two main approaches: 1) correlation methodologies ...data are available. The new correlation can be used for an engineering...

  2. A Low-Cost and Secure Solution for e-Commerce

    NASA Astrophysics Data System (ADS)

    Pasquet, Marc; Vacquez, Delphine; Rosenberger, Christophe

    We present in this paper a new architecture for remote banking and e-commerce applications. The proposed solution is designed to be low cost and provides good guarantees of security for a client and his issuing bank. Indeed, the main problem for an issuer is to identify and authenticate a client (a cardholder) using his personal computer through the web when this client wants to access remote banking services or when he wants to pay on an e-commerce site equipped with the 3D-Secure payment solution. The solution described in this paper is MasterCard Chip Authentication Program compliant and was tested in the project called SOPAS. The main contribution of this system consists in the use of a smartcard with an I2C bus that pilots a terminal equipped only with a screen and a keyboard. During the use of services, the user types his PIN code on the keyboard and all the security-critical part of the transaction is performed by the chip of the smartcard. No security information remains on the personal computer, and a dynamic token created by the card is sent to the bank and verified by the front end. We first present the defined methodology and then analyze the main security aspects of the proposed solution.
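
A generic challenge-response sketch of the dynamic-token idea described above (the actual MasterCard CAP token derivation is card-specific and not reproduced here; the key sizes and truncation rule are assumptions in the style of one-time-password schemes). The card and the bank's front end share a secret key, and a token bound to a fresh challenge is verified server-side, so nothing secret resides on the PC:

```python
import hmac, hashlib, secrets

def card_token(card_key: bytes, challenge: bytes) -> str:
    """What the smartcard would compute after a successful PIN entry."""
    mac = hmac.new(card_key, challenge, hashlib.sha256).digest()
    # Truncate the MAC to 8 decimal digits for keypad-friendly entry.
    return str(int.from_bytes(mac[:4], "big") % 10**8).zfill(8)

def bank_verify(card_key: bytes, challenge: bytes, token: str) -> bool:
    """Front-end check: recompute the token and compare in constant time."""
    return hmac.compare_digest(card_token(card_key, challenge), token)

key = secrets.token_bytes(16)        # shared secret set at card personalization
challenge = secrets.token_bytes(8)   # fresh per transaction, issued by the bank
token = card_token(key, challenge)
print(bank_verify(key, challenge, token))  # → True
```

Because the challenge is fresh per transaction, a captured token is useless for replay, which is the property the paper attributes to its dynamic token.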

  3. [Methodological aspects of the reconstitution and evaluation of the behavioral theories that underlie population policy].

    PubMed

    Leeuw, F

    1991-09-01

    This work discusses methodological aspects of the articulation and evaluation of behavioral theories underlying demographic policies. Such theories, called "policy theories" among other terms, may be defined as a group of hypotheses explicitly translated into predictions about behavior that underlie policy measures and that concern the relations between the measure and the objective to be attained. Interest in policy theories has been reflected in the writings of such demographers as D. Bogue, J. Blake, and T. Burch, and of researchers from other social science disciplines. 2 examples of policy theories from the Netherlands are presented to illustrate the discussion, 1 describing family planning communication programs that were intended to reduce the number of unwanted and unplanned pregnancies, and the other describing measures to increase availability of child care services in order to facilitate labor force participation of women and ultimately to increase the birth rate. Both theories are found to be comprised of 2 main parallel theories and several related hypotheses. Because political authorities do not usually make explicit the hypotheses that support political measures, their hypotheses must be articulated and reconstituted through attention to debates, written communications, interviews, and other means. The reconstitution must be done as objectively as possible, which implies the need to follow some methodologic rules. Examples are cited of principles advanced by researchers in management science, market research, and political science. 
7 methodological rules or steps are then suggested for articulating policy theories: 1) identify statements relative to the political problem, such as excessive or inadequate fertility rates; 2) use the sources to identify reasons for undertaking concrete policy measures; 3) describe the role of the official in the political process; 4) inventory all declarations concerning the relationship between the objective and the means of attaining it; 5) make explicit the links and sequences left implicit in these declarations; 6) identify the normative declarations relative to the policy problem under study, and 7) try to classify all the inventoried declarations into "if-then" or "more-more" statements in a system of hypotheses where each hypothesis can be deduced from another hypothesis. Evaluation of policy theories is necessary and can be conducted according to epistemological criteria as well as criteria relating to implementation and strategy.

  4. Current Methodological Problems and Future Directions for Theory Development in the Psychology of Sport and Motor Behavior.

    ERIC Educational Resources Information Center

    Bird, Anne Marie; Ross, Diane

    1984-01-01

    A brief history of research in sport psychology based on Lander's (1982) analysis is presented. A systematic approach to theory building is offered. Previous methodological inadequacies are identified using examples of observational learning and anxiety. (Author/DF)

  5. Structural Equation Modeling of School Violence Data: Methodological Considerations

    ERIC Educational Resources Information Center

    Mayer, Matthew J.

    2004-01-01

    Methodological challenges associated with structural equation modeling (SEM) and structured means modeling (SMM) in research on school violence and related topics in the social and behavioral sciences are examined. Problems associated with multiyear implementations of large-scale surveys are discussed. Complex sample designs, part of any…

  6. Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems

    NASA Technical Reports Server (NTRS)

    Song, Lixia; Kuchar, James K.

    2003-01-01

    Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. 
In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.
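
The logic-difference cause of dissonance identified above can be sketched abstractly as two alerting rules evaluated over the same state space (the rules, variables, and thresholds below are invented for illustration and are not TCAS/ACM logic). Scanning for states where the rules disagree identifies candidate dissonance regions:

```python
def alert_a(rng, rate):
    """System A (hypothetical): alert whenever range is below 5 units."""
    return rng < 5.0

def alert_b(rng, rate):
    """System B (hypothetical): alert when time-to-go = range/rate < 2."""
    return rate > 0 and rng / rate < 2.0

# Scan a coarse grid of (range, closure-rate) states for disagreement.
dissonant = [(r, v) for r in range(1, 10) for v in (1, 2, 3)
             if alert_a(r, v) != alert_b(r, v)]
print(dissonant)   # states where exactly one of the two systems alerts
```

In the thesis's terms, each listed state is a point where the two alerting systems suggest different threat levels purely because of their logic difference, before any sensor error is considered.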

  7. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie

    2008-06-01

    Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. 
(iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
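
The plug-in estimator and its word-length trade-off described in (iv) can be shown in a few lines (a minimal sketch; the data below are synthetic and the word length w = 4 is an arbitrary choice):

```python
import random
from collections import Counter
from math import log2

def plug_in_entropy_rate(bits, w):
    """Plug-in estimate: empirical entropy of length-w words, per symbol."""
    words = [tuple(bits[i:i + w]) for i in range(len(bits) - w + 1)]
    n = len(words)
    counts = Counter(words)
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    return h / w

random.seed(0)
coin = [random.randint(0, 1) for _ in range(10000)]
print(plug_in_entropy_rate(coin, 4))   # close to 1 bit/symbol for a fair coin

# An alternating sequence has true entropy rate 0, but with w = 4 only two
# distinct words ever occur, so the estimate is stuck near log2(2)/4 = 0.25:
# the bias that the experiments above identify as the main source of error.
alt = [i % 2 for i in range(1000)]
print(plug_in_entropy_rate(alt, 4))
```

Increasing w would eventually capture the alternating structure, but at the cost of undersampling the empirical word distribution, which is exactly the trade-off noted in the abstract.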

  8. Changes in the reflectance of ex situ leaves: A methodological approach

    NASA Astrophysics Data System (ADS)

    Ponzoni, Flavio Jorge; Inoe, Mario Takao

    1992-04-01

    The main aspects of the interaction between electromagnetic radiation and detached leaves are presented. An experiment with detached Eucalyptus and Araucaria leaves is described, including a description of the methodologies used to collect and store the reflectance data.

  9. Railroad classification yard design methodology study : East Deerfield Yard, a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to Boston and Maine's East Deerfield Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodol...

  10. Towards lexicographic multi-objective linear programming using grossone methodology

    NASA Astrophysics Data System (ADS)

    Cococcioni, Marco; Pappalardo, Massimo; Sergeyev, Yaroslav D.

    2016-10-01

    Lexicographic Multi-Objective Linear Programming (LMOLP) problems can be solved in two ways: preemptive and nonpreemptive. The preemptive approach requires the solution of a series of LP problems, with changing constraints (each time the next objective is added, a new constraint appears). The nonpreemptive approach is based on a scalarization of the multiple objectives into a single-objective linear function by a weighted combination of the given objectives. It requires the specification of a set of weights, which is not straightforward and can be time-consuming. In this work we present both mathematical and software ingredients necessary to solve LMOLP problems using a recently introduced computational methodology (allowing one to work numerically with infinities and infinitesimals) based on the concept of grossone. The ultimate goal of such an attempt is an implementation of a simplex-like algorithm, able to solve the original LMOLP problem by solving only one single-objective problem and without the need to specify finite weights. The expected advantages are therefore obvious.
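
    The difference between the preemptive (lexicographic) and nonpreemptive (weighted) approaches can be illustrated on a toy finite candidate set. This is only a sketch of the finite-weight pitfall that grossone-valued weights are meant to remove, not an implementation of the grossone-based simplex; the points and weights are hypothetical.

```python
# Toy candidate set (e.g. vertices of an LP feasible region), each with
# two objective values to be maximized, objective 1 lexicographically
# dominating objective 2.
points = {"A": (3, 1), "B": (3, 5), "C": (2, 9)}

# Preemptive (lexicographic) choice: compare objective tuples directly.
lex_best = max(points, key=lambda p: points[p])  # "B"

# Nonpreemptive choice: scalarize with finite weights w1*f1 + w2*f2.
def weighted_best(w1, w2):
    return max(points, key=lambda p: w1 * points[p][0] + w2 * points[p][1])

# A large enough weight ratio reproduces the lexicographic answer...
assert weighted_best(10, 1) == lex_best == "B"
# ...but a poorly chosen ratio lets the secondary objective overturn
# the primary one.
assert weighted_best(1, 1) == "C"
```

Choosing a ratio that is "large enough" depends on the problem data, which is exactly why specifying finite weights can be time-consuming.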

  11. Genetic therapy in gliomas: historical analysis and future perspectives.

    PubMed

    Mattei, Tobias Alécio; Ramina, Ricardo; Miura, Flavio Key; Aguiar, Paulo Henrique; Valiengo, Leandro da Costa

    2005-03-01

    High-grade gliomas are relatively frequent in adults and are the most malignant kind of primary brain tumor. Resistant to standard treatment modalities such as surgery, radiation, and chemotherapy, they are fatal within 1 to 2 years of the onset of symptoms. Although several gene therapy systems have proved efficient in controlling or eradicating these tumors in animal models, the clinical studies performed so far have not been equally successful. Most clinical studies suggest that methodologies that increase tumor infection/transduction, and consequently confer more lasting activity against the tumor, will lead to better therapeutic results. Given the practical clinical benefits that can be expected in the near future, an overview of the basic issues in the genetic therapy of gliomas seems useful for the practicing neurosurgeon. Among the main topics, we discuss the anti-tumoral mechanisms of the various genes that can be transfected, the advantages and drawbacks of the different vectors utilized, the possibilities for tumor targeting through modification of the native tropism of viral vectors, and the different physical methods for delivering vectors to tumors. We also review the history of genetic therapy for gliomas, with special focus on the main problems encountered as the field has advanced. Finally, we analyze the present state of this promising therapeutic modality, the problems that still must be solved, and the new paradigms for future research in this area.

  12. A pilot exercise on comparative risk assessment in Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cebrian, M.E.; Albores, A.; Sierra, A.

    1996-12-31

    Concern in the Mexican government and academic institutions about human health problems derived from exposure to environmental contaminants has been increasing. This interest prompted us to perform a pilot study to identify and rank potentially problematic environmental situations. We were given access to files from the Instituto Nacional de Ecologia. We screened about 2,500 documents and selected about 200 reports for further analysis. We adapted methodologies developed by the U.S. Environmental Protection Agency (EPA 1993) and ATSDR (1992) to analyze environmental data. San Luis Potosi City and Region Lagunera were the areas posing the greatest risks. We chose San Luis Potosi City for a more detailed study, since there a smelting complex is located within an urban zone. The high levels of As, Pb, and Cd in environmental media resulted in a higher body burden in exposed children than in children living 7 km away. Multiple regression analysis suggested that alterations in sensorial nerve transmission were mainly related to As in urine (AsU), whereas those in motor nerves were mainly related to Pb in blood (PbB). No apparent relationships with CdU were found. Slower auditory nerve conduction was associated with both AsU and PbB. These findings suggest that exposed children are also at high risk of suffering other adverse health effects. This exercise illustrates the need for studies aimed at identifying and ranking environmental contamination problems in industrializing countries. 5 refs., 1 tab.

  13. Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews.

    PubMed

    Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G

    2009-04-03

    To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Survey of published systematic reviews. Inclusion criteria were systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, method of indirect comparison, and whether the assumptions of similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, the indirect comparison was informal. Results from different trials were naively compared without a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head-to-head comparison trials was not systematically searched for, or not included, in nine cases. Identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparison is crucial to resolving these methodological problems.
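
    The classic frequentist adjusted indirect comparison mentioned in the abstract (often attributed to Bucher) differences the two trial effects against the common comparator and adds their variances. A minimal sketch, with hypothetical effect estimates:

```python
from math import sqrt

def adjusted_indirect(d_ac, se_ac, d_bc, se_bc):
    """Adjusted indirect comparison of A vs B via common comparator C:
    difference the effect estimates, add the variances."""
    d_ab = d_ac - d_bc
    se_ab = sqrt(se_ac ** 2 + se_bc ** 2)
    ci_95 = (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)
    return d_ab, se_ab, ci_95

# Hypothetical log odds ratios of A vs C and B vs C from separate trials.
d_ab, se_ab, ci_95 = adjusted_indirect(-0.50, 0.15, -0.20, 0.20)
# d_ab = -0.30, se_ab = 0.25, 95% CI (-0.79, 0.19): the indirect
# estimate is less precise than either direct comparison.
```

The added-variance step is why indirect evidence is weaker than head-to-head trials, and the calculation is only valid under the similarity assumption the survey found so often unexamined.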

  14. Global-local methodologies and their application to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  15. Teaching mathematical word problem solving: the quality of evidence for strategy instruction priming the problem structure.

    PubMed

    Jitendra, Asha K; Petersen-Brown, Shawna; Lein, Amy E; Zaslofsky, Anne F; Kunkel, Amy K; Jung, Pyung-Gang; Egan, Andrea M

    2015-01-01

    This study examined the quality of the research base related to strategy instruction priming the underlying mathematical problem structure for students with learning disabilities and those at risk for mathematics difficulties. We evaluated the quality of methodological rigor of 18 group research studies using the criteria proposed by Gersten et al. and 10 single case design (SCD) research studies using criteria suggested by Horner et al. and the What Works Clearinghouse. Results indicated that 14 group design studies met the criteria for high-quality or acceptable research, whereas SCD studies did not meet the standards for an evidence-based practice. Based on these findings, strategy instruction priming the mathematics problem structure is considered an evidence-based practice using only group design methodological criteria. Implications for future research and for practice are discussed.

  16. Proposing integrated Shannon's entropy-inverse data envelopment analysis methods for resource allocation problem under a fuzzy environment

    NASA Astrophysics Data System (ADS)

    Çakır, Süleyman

    2017-10-01

    In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. In an effort to exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed in Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure to handle input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
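
    A crisp (non-fuzzy) sketch of Shannon's entropy weighting for variable selection may help: columns whose values are more dispersed across firms carry more information. The data and variable layout are hypothetical, and the paper's imprecise/interval variant is not reproduced here.

```python
from math import log

def entropy_weights(matrix):
    """Crisp Shannon entropy weighting of candidate variables: for each
    column, normalize to proportions, compute entropy scaled by log(n),
    and weight by the degree of diversification 1 - entropy."""
    n, m = len(matrix), len(matrix[0])
    diversification = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        e = -sum(pi * log(pi) for pi in p if pi > 0) / log(n)
        diversification.append(1 - e)  # 0 for a constant column
    s = sum(diversification)
    return [d / s for d in diversification]

# Hypothetical data: three firms, two candidate variables; the first is
# constant across firms (uninformative), the second varies widely.
weights = entropy_weights([[10, 1], [10, 5], [10, 9]])
```

A constant column has maximal entropy and zero diversification, so it receives (essentially) zero weight and would be dropped from the DEA variable set.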

  17. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
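
    Regularized logistic regression of the kind named in the abstract can be sketched in a few lines. This is a generic L2-regularized fit by batch gradient descent, not the author's model; the student features (attempts, hints requested) and all data are hypothetical.

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def fit_logistic(X, y, lam=0.1, lr=0.1, epochs=500):
    """L2-regularized logistic regression by batch gradient descent.
    Returns d feature weights plus an unpenalized intercept (last entry)."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            p = sigmoid(sum(w[j] * xi[j] for j in range(d)) + w[d])
            for j in range(d):
                grad[j] += (p - yi) * xi[j]
            grad[d] += p - yi
        for j in range(d):
            w[j] -= lr * (grad[j] / n + lam * w[j])  # penalize features only
        w[d] -= lr * grad[d] / n
    return w

def predict(w, x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + w[-1])

# Hypothetical log data: (attempts, hints requested) -> solved correctly?
X = [[1, 0], [2, 0], [3, 1], [5, 3], [6, 4], [7, 5]]
y = [1, 1, 1, 0, 0, 0]
w = fit_logistic(X, y)
```

The regularization term lam shrinks the feature weights toward zero, which is what lets such a model generalize from sparse per-student interaction logs.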

  18. Integrative Problem-Centered Therapy: Toward the Synthesis of Family and Individual Psychotherapies.

    ERIC Educational Resources Information Center

    Pinsof, William M.

    1983-01-01

    Presents an overview of the Integrative Problem-Centered Therapy (IPCT) Model and describes its core principles, premises, and basic methodological steps. The IPCT provides a technique for applying individual and family therapy and behavioral, communicational, and psychodynamic orientations to client problems. Its goal is to create efficient…

  19. Use of Problem-Based Learning in the Teaching and Learning of Horticultural Production

    ERIC Educational Resources Information Center

    Abbey, Lord; Dowsett, Eric; Sullivan, Jan

    2017-01-01

    Purpose: Problem-based learning (PBL), a relatively novel teaching and learning process in horticulture, was investigated. Proper application of PBL can potentially create a learning context that enhances student learning. Design/Methodology/Approach: Students worked on two complex ill-structured problems: (1) to produce fresh baby greens for a…

  20. The Problem-Solving Approach of Environmental Education.

    ERIC Educational Resources Information Center

    Connect, 1983

    1983-01-01

    The problem-solving approach in environmental education (EE), reports on EE programs and activities in selected foreign countries, and a report on the Asian Subregional Workshop on Teacher Training in EE are provided in this newsletter. The nature of the problem-solving approach and brief discussions of such methodologies as group discussion,…
